Lead Cloud Data Engineer - Remote Opportunity
GuideOne Insurance (Chicago, IL)
We are building a next-generation Cloud Data Platform to unify data from Policy, Claims, Billing, and Administration systems into a single source of truth. We are seeking a Lead Cloud Data Engineer who will be 75% hands-on and play a critical role in designing, building, and optimizing our modern data ecosystem, leveraging Medallion architecture, Delta Lake, and modern data warehouse technologies such as Snowflake, Synapse, or Redshift.
As a technical leader, the Lead Data Engineer will define and execute the end-to-end data engineering strategy, from data ingestion and modeling to governance and performance optimization, enabling scalable, high-quality, analytics-ready data assets. This role requires deep hands-on technical expertise in cloud-native data engineering, automation, and architecture design, coupled with strong leadership to mentor teams and align solutions with business goals.
Responsibilities:
Data Platform Design & Architecture
+ Define the strategic roadmap for the enterprise data platform, ensuring scalability, performance, and interoperability across business domains.
+ Architect and implement cloud-native, Medallion-based data architectures (Bronze–Silver–Gold layers) for unified and governed data delivery (an illustrative sketch follows this list).
+ Drive standardization of data models, pipelines, and quality frameworks across Policy, Claims, Billing, and Administrative data assets.
+ Evaluate and implement emerging data technologies to strengthen the platform’s performance, cost efficiency, and resilience.
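To give candidates a feel for the hands-on side of this work, here is a minimal, illustrative sketch of a Bronze–Silver–Gold flow in PySpark with Delta Lake. This is not our production code: the paths, table names, and columns are hypothetical, and a Delta-enabled Spark session (for example, on Databricks) is assumed.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

    # Bronze: land raw claims data as-is, stamped with ingestion metadata.
    raw = spark.read.json("s3://example-bucket/raw/claims/")  # hypothetical source
    (raw.withColumn("_ingested_at", F.current_timestamp())
        .write.format("delta").mode("append").save("/delta/bronze/claims"))

    # Silver: deduplicate, enforce the claim key, and conform types.
    bronze = spark.read.format("delta").load("/delta/bronze/claims")
    silver = (bronze.dropDuplicates(["claim_id"])
                    .filter(F.col("claim_id").isNotNull())
                    .withColumn("loss_date", F.to_date("loss_date")))
    silver.write.format("delta").mode("overwrite").save("/delta/silver/claims")

    # Gold: aggregate into an analytics-ready consumption table.
    gold = (silver.groupBy("policy_id", F.year("loss_date").alias("loss_year"))
                  .agg(F.sum("paid_amount").alias("total_paid"),
                       F.count("claim_id").alias("claim_count")))
    gold.write.format("delta").mode("overwrite").save("/delta/gold/claims_by_policy")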
Data Integration & Ingestion
+ Design, build, and optimize high-performance ingestion pipelines using AWS Glue, Databricks, or custom Spark applications.
+ Automate ingestion of structured, semi-structured, and unstructured data from APIs, databases, and external data feeds.
+ Tune and monitor ingestion pipelines for throughput, cost control, and reliability across dev/test/prod environments.
Data Transformation & Modeling
+ Develop ETL/ELT pipelines hands-on using Databricks or similar frameworks to transform raw data into curated, consumption-ready datasets.
+ Design and develop relational, Data Vault, and dimensional data models to support analytics, BI, and AI/ML workloads.
+ Define and enforce data quality standards, validation frameworks, and cleansing and enrichment rules to ensure accurate, complete, and trusted business-critical data (see the example after this list).
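As a flavor of the validation work above, a simple quality gate might look like the following hypothetical PySpark check; the table path, column name, and 1% threshold are illustrative only.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-sketch").getOrCreate()
    silver = spark.read.format("delta").load("/delta/silver/claims")  # hypothetical table

    total = silver.count()
    missing = silver.filter(F.col("policy_id").isNull()).count()

    # Fail the pipeline run if more than 1% of rows lack a policy key.
    if total and missing / total > 0.01:
        raise ValueError(f"DQ check failed: {missing}/{total} rows missing policy_id")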
Cloud Infrastructure, Automation & Performance
+ Collaborate with DevOps and Cloud Engineering teams to design automated, infrastructure-as-code environments using Terraform, CloudFormation, or equivalent tools.
+ Implement CI/CD pipelines for data pipeline deployment, versioning, and testing.
+ Lead performance tuning and scalability optimization to ensure a highly available, cost-efficient data platform.
Governance, Security & Compliance
+ Implement and enforce data governance, cataloging, and lineage practices using tools such as Purview, Alation, or Collibra.
+ Partner with InfoSec to implement data privacy, access control, and compliance frameworks aligned with regulatory standards.
+ Drive consistency and accountability in data stewardship across business and IT teams.
Leadership, Collaboration & Mentorship
+ Lead a team of data engineers, providing technical guidance, coaching, and performance mentorship.
+ Collaborate with Data Architects, Analysts, and Business Leaders to align data solutions with enterprise strategy.
+ Promote a culture of engineering excellence, reusability, and knowledge sharing across the data organization.
+ Influence enterprise-wide standards for data engineering, automation, and governance.
Qualifications:
+ Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
+ 12+ years of experience in data engineering, including at least 8 years on cloud platforms (AWS, Azure, or GCP) and 3+ years in a lead or architect-level role.
+ Deep hands-on experience with Python, SQL, and data modeling (relational and dimensional), as well as Databricks, Spark, AWS Glue, Delta Lake, and Snowflake, Synapse, or Redshift.
+ Proven experience with Medallion architecture, modern data warehousing principles, data governance, lineage, and CI/CD for data pipelines.
+ Excellent leadership, communication, and cross-functional collaboration skills.
+ Experience with Property & Casualty (P&C) insurance data, such as Policy, Claims, or Billing, preferred.
+ Familiarity with event-driven architectures (Kafka, Kinesis) and real-time data streaming.
+ Knowledge of machine learning pipeline integration and feature engineering.
+ Proven ability to lead large-scale data modernization or cloud migration initiatives.
Compensation:
+ $140,000 - $165,000, commensurate with experience, plus bonus eligibility
Benefits:
We are proud to offer a robust benefits suite that includes:
+ Competitive base salary plus incentive plans for eligible team members
+ 401(k) retirement plan that includes a company match of up to 6% of your eligible salary
+ Free basic life and AD&D, long-term disability and short-term disability insurance
+ Medical, dental and vision plans to meet your unique healthcare needs
+ Wellness incentives
+ Generous time off program that includes personal, holiday and volunteer paid time off
+ Flexible work schedules and hybrid/remote options for eligible positions
+ Educational assistance