"Alerted.org


Data Engineer
Insight Global (Alpharetta, GA)




Job Description

We’re looking for a seasoned Data Engineer III who is passionate about building scalable and resilient cloud-native data infrastructure, with a focus on governance, CI/CD, automation, and platform maturity. You will be a key contributor in evolving our modern data stack, ensuring operational excellence and code quality across ETL pipelines, metadata frameworks, and real-time/batch data services.

You’ll work at the intersection of data engineering, DevOps, and governance, setting standards across code repositories, orchestrators (Airflow), compute layers (Glue/EMR), and ingestion tools (DMS, Kafka, etc.).
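The posting itself contains no code, but as a rough illustration of how this stack typically fits together, the sketch below wires an Airflow DAG to an AWS Glue job via boto3 and forwards any task failure to an SNS topic. The DAG id, Glue job name, and SNS topic ARN are placeholders invented for the example, not Insight Global's actual configuration.

```python
# Minimal sketch with hypothetical names: an Airflow DAG submits an AWS Glue
# job via boto3, and any task failure is forwarded to an SNS topic.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:data-platform-alerts"  # placeholder


def alert_on_failure(context):
    # Airflow passes the task context; forward the error to SNS subscribers.
    boto3.client("sns").publish(
        TopicArn=SNS_TOPIC_ARN,
        Subject=f"Airflow failure: {context['task_instance'].task_id}",
        Message=str(context.get("exception")),
    )


def run_glue_job(job_name: str) -> str:
    # Fire-and-forget submission; a production DAG would poll get_job_run()
    # and fail the task if the Glue run does not finish successfully.
    response = boto3.client("glue").start_job_run(JobName=job_name)
    return response["JobRunId"]


with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+; earlier versions use schedule_interval
    catchup=False,
    default_args={"on_failure_callback": alert_on_failure},
) as dag:
    PythonOperator(
        task_id="run_orders_glue_job",
        python_callable=run_glue_job,
        op_kwargs={"job_name": "orders_incremental_load"},  # hypothetical Glue job name
    )
```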

Primary Responsibilities

• Maintain and evolve OLTP (Postgres) and OLAP (Redshift) data models/data lakes by evaluating new feature requirements, ensuring alignment with dimensional modeling best practices, and executing schema changes via Liquibase pipelines.
• Develop and maintain metadata-driven data pipeline frameworks that support validation, logging, auditing, and job orchestration.
• Standardize and govern Bitbucket/Git repositories, manage branching strategies, and enforce code review and CI pipelines for ETL/data jobs.
• Design and implement CI/CD workflows for data services using tools like Jenkins, Liquibase, and Shell/Python scripting.
• Support automated deployment of ETL, Airflow DAGs, Glue jobs, and DB schema changes across environments (QA, Stage, Prod).
• Collaborate with DataOps and DevOps teams to maintain infrastructure-as-code (IaC) standards and shared configuration patterns.
• Build and scale data quality frameworks, including pre/post validations, job restartability, and alerting (CloudWatch, SNS); a minimal validation sketch follows this list.
• Implement data masking and access control standards (RBAC, column-level masking, role-based access) across Redshift and Iceberg.
• Optimize DMS/Kafka-based CDC pipelines and help reduce dependency through automation or zero-ETL patterns.
• Define standards for data retention, archival, and operational efficiency across OLTP/OLAP environments.
• Partner with data engineers and analysts to align platform standards with business needs and analytical readiness.
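The data quality item above is the most code-shaped of these responsibilities, so here is a loose sketch of what a post-load validation step with simple restartability could look like. The watermark table, the load_batch_id column, and the connection string are hypothetical, and psycopg2 stands in for whatever client the real framework uses.

```python
# Hypothetical sketch of a post-load validation with simple restartability:
# after a batch loads, row counts are compared between staging and target,
# and the batch id is recorded in a watermark table so reruns can skip it.
# Table names, the load_batch_id column, and the DSN are placeholders.
import psycopg2

DSN = "host=example-cluster dbname=analytics user=etl_user password=..."  # placeholder


def validate_and_mark(batch_id: str, staging: str, target: str) -> None:
    with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
        # Restartability: if this batch was already validated, do nothing.
        cur.execute("SELECT 1 FROM etl.load_watermark WHERE batch_id = %s", (batch_id,))
        if cur.fetchone() is not None:
            print(f"batch {batch_id} already validated; skipping")
            return

        # Pre/post check: staging holds only the current batch, and the target
        # tags loaded rows with a load_batch_id column (both assumptions).
        cur.execute(f"SELECT COUNT(*) FROM {staging}")
        staged = cur.fetchone()[0]
        cur.execute(f"SELECT COUNT(*) FROM {target} WHERE load_batch_id = %s", (batch_id,))
        loaded = cur.fetchone()[0]

        if staged != loaded:
            # A framework like the one described above would also emit a
            # CloudWatch metric or SNS notification at this point.
            raise RuntimeError(
                f"batch {batch_id}: staging={staged} rows, target={loaded} rows"
            )

        cur.execute("INSERT INTO etl.load_watermark (batch_id) VALUES (%s)", (batch_id,))


if __name__ == "__main__":
    validate_and_mark("2025-01-01", "staging.orders", "analytics.orders")
```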

     

We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to [email protected]. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.

     

Skills and Requirements

• 8+ years of experience in data engineering or platform engineering with exposure to production-grade data pipelines and systems.
• Deep expertise in Python and SQL, with a strong understanding of pipeline design patterns and modular codebases.
• Solid understanding of AWS cloud services: S3, Glue, Redshift, DMS, Lambda, EMR, IAM, CloudWatch.
• Experience with workflow orchestration tools like Airflow (DAG scheduling, dependency mapping, alerts).
• Hands-on experience maintaining data lakehouse platforms (e.g., Apache Iceberg, Delta Lake) and managing batch vs. streaming ingestion.
• Experience managing schema changes, migrations, and rollback strategies across databases (Postgres, Redshift).
• Strong understanding of data security practices, including PII masking, row/column-level controls, and audit logging; a minimal access-control sketch follows this list.
• Familiarity with dimensional modeling and the differences between OLTP and OLAP patterns.
• Strong documentation and process-driven mindset to define standards and maintain operational transparency.
• Experience with CI/CD tooling (e.g., Jenkins, Liquibase, Bitbucket Pipelines) and managing deployment pipelines for data workloads.
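For the data-security requirement above, one common Redshift approach is role-based access control combined with column-level grants; the sketch below issues those statements from Python. The role, user, schema, and column names are invented for illustration, the exact GRANT syntax should be verified against your Redshift version, and in the workflow this posting describes such changes would more likely ship through a Liquibase changelog than an ad hoc script.

```python
# Hypothetical sketch: Redshift RBAC plus a column-level grant that keeps PII
# (e.g., email) out of an analyst role's reach. All names are invented.
import psycopg2

DSN = "host=example-cluster dbname=analytics user=admin password=..."  # placeholder

STATEMENTS = [
    "CREATE ROLE analyst_ro",
    # Column-level access control: grant only the non-PII columns.
    "GRANT SELECT (customer_id, signup_date, plan_tier) ON sales.customers TO ROLE analyst_ro",
    # Role-based access: attach the role to an individual user.
    "GRANT ROLE analyst_ro TO jdoe",
]

with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
    for statement in STATEMENTS:
        cur.execute(statement)
```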

     







© 2025 Alerted.org