Data Engineer
Insight Global (New York, NY)
Job Description
Insight Global is seeking a hands-on Data Engineer for our client in New York City. This person will work fully remote on East Coast hours and will be responsible for building and owning the data foundation behind revenue, pricing, promotions, and inventory decisions for a ticketing program. You’ll design the data warehouse and pipelines that convert messy, scheduled reports and HTML/CSV files into clean, reliable datasets that power pricing strategy, sales analytics, and inventory visibility across shows and venues. This person will use analytics to turn raw sales signals into automated, trustworthy tables used by Tableau dashboards and decision models.
Responsibilities:
• Architect & implement the data platform: Stand up cloud data warehousing (preferably Google BigQuery) and define storage, partitioning, and modeling standards for sales, promotions, and inventory tables (e.g., star/snowflake schemas).
• Build ingestion & transformation pipelines: Create robust, scheduled ETL/ELT jobs that ingest data from Google Drive/CSV/HTML and other sources; normalize and enrich datasets; and publish curated marts for analytics and pricing.
• Automate manual processes: Replace ad‑hoc, manual pulls with reliable, monitored pipelines; implement job orchestration, alerting, and data‑quality checks (e.g., freshness, completeness, referential integrity); a minimal sketch of this ingest-and-check pattern follows this list.
• Enable pricing & promo strategy: Provide fast, accurate tables that support dynamic pricing, discounting, and campaign‑outcome analysis; surface inventory positions by show/date/section to guide strategy.
• Partner with analytics & business users: Collaborate with revenue leaders and analysts using Tableau/Excel to define SLAs, data contracts, and semantic layers; deliver well‑documented datasets that are easy to consume.
• Productionize & operate: Own deployment, monitoring, and incident response for pipelines; optimize SQL and storage costs in BigQuery; continuously improve performance and reliability.
• Security & governance: Implement access controls, data lineage, and audit trails; establish naming conventions and versioning for transformations.
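
For illustration only, the sketch below shows the general pattern this role owns: parse a scheduled HTML/CSV sales report with pandas, load it into a date‑partitioned BigQuery table, and run a simple freshness check. The project, dataset, table, and column names (my-project.ticketing.sales_daily, sale_date) are hypothetical placeholders, and a real pipeline would run under an orchestrator such as Airflow/Cloud Composer rather than as a standalone script.

# Hypothetical sketch: ingest a scheduled HTML sales report into a partitioned BigQuery table.
import pandas as pd
from google.cloud import bigquery

TABLE_ID = "my-project.ticketing.sales_daily"  # placeholder project/dataset/table

def load_report(html_path: str) -> None:
    # pd.read_html returns every table found in the file; assume the first is the sales report.
    df = pd.read_html(html_path)[0]
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]  # normalize headers
    df["sale_date"] = pd.to_datetime(df["sale_date"]).dt.date

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        write_disposition="WRITE_APPEND",
        time_partitioning=bigquery.TimePartitioning(field="sale_date"),  # partition by sale date
    )
    client.load_table_from_dataframe(df, TABLE_ID, job_config=job_config).result()

def check_freshness(max_lag_days: int = 1) -> bool:
    # Simple data-quality check: is the newest partition recent enough?
    client = bigquery.Client()
    query = f"SELECT DATE_DIFF(CURRENT_DATE(), MAX(sale_date), DAY) AS lag_days FROM `{TABLE_ID}`"
    lag = next(iter(client.query(query).result())).lag_days
    return lag is not None and lag <= max_lag_days

In production, the freshness check would feed the alerting and orchestration layer described above rather than returning a bare boolean.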
Compensation:
$60/hr to $70/hr.
Exact compensation may vary based on several factors, including skills, experience, and education.
Employees in this role will enjoy a comprehensive benefits package starting on day one of employment, including options for medical, dental, and vision insurance. Eligibility to enroll in the 401(k) retirement plan begins after 90 days of employment. Additionally, employees in this role will have access to paid sick leave and other paid time off benefits as required under the applicable law of the worksite location.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to [email protected]. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
• 5+ years in data engineering building production pipelines and warehouses at scale.
• Advanced SQL (window functions, CTEs, performance tuning) and practical Python (ETL/ELT, parsing HTML/CSV, API/file handling, testing); an illustrative query sketch follows this list.
• Proven experience with Google BigQuery (or equivalent columnar cloud warehouse) including partitioning, clustering, and cost/performance optimization.
• Experience ingesting non‑API data sources (scheduled reports, HTML/CSV files) and turning them into clean, reliable tables.
• Strong understanding of data modeling (star/snowflake), data quality (validation, reconciliation), and orchestration (e.g., Airflow/Cloud Composer or similar).
• Ability to translate pricing/inventory business needs into scalable dataset designs; excellent documentation and stakeholder communication.
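
As a hedged example of the SQL depth listed above, the snippet below runs a CTE plus a window function in BigQuery from Python to compute a rolling 7‑day sales total per show and section; the ticketing.sales_daily table and its columns are assumed placeholders carried over from the earlier sketch, not the client's actual schema.

# Hypothetical example of the CTE / window-function SQL this role relies on, run via the BigQuery client.
from google.cloud import bigquery

ROLLING_SALES_SQL = """
WITH daily AS (
  SELECT show_id, section, sale_date, SUM(tickets_sold) AS tickets_sold
  FROM `my-project.ticketing.sales_daily`      -- placeholder table from the earlier sketch
  GROUP BY show_id, section, sale_date
)
SELECT
  show_id,
  section,
  sale_date,
  SUM(tickets_sold) OVER (
    PARTITION BY show_id, section
    ORDER BY sale_date
    ROWS BETWEEN 6 PRECEDING AND CURRENT ROW   -- rolling 7-day window
  ) AS tickets_sold_7d
FROM daily
"""

def rolling_sales():
    client = bigquery.Client()
    # Returns an iterator of Row objects; in practice this would feed a curated mart or a Tableau extract.
    return list(client.query(ROLLING_SALES_SQL).result())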