"Alerted.org

Job Title, Industry, Employer
City & State or Zip Code
20 mi
  • 0 mi
  • 5 mi
  • 10 mi
  • 20 mi
  • 50 mi
  • 100 mi
Advanced Search

Advanced Search

Cancel
Remove
+ Add search criteria
City & State or Zip Code
20 mi
  • 0 mi
  • 5 mi
  • 10 mi
  • 20 mi
  • 50 mi
  • 100 mi
Related to

Data Engineer
HTC Global Services Inc (Dearborn, MI)




    HTC Global Services wants you. Come build new things with us and advance your career. At HTC Global you'll collaborate with experts and join successful teams contributing to our clients' success. You'll work side by side with our clients, with long-term opportunities to grow your career using the latest emerging technologies.

     

    At HTC Global Services our consultants have access to a comprehensive benefits package. Benefits can include Paid Time Off, Paid Holidays, 401(k) matching, Life and Accidental Death Insurance, Short- and Long-Term Disability Insurance, and a variety of other perks.

     

    Job Description: Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.

    Key Responsibilities:

    + Collaborate with business and technology stakeholders to understand current and future data requirements

    + Design, build and maintain reliable, efficient and scalable data infrastructure for data collection, storage, transformation, and analysis

    + Plan, design, build and maintain scalable data solutions including data pipelines, data models, and applications for efficient and reliable data workflow

    + Design, implement, and maintain existing and future data platforms such as data warehouses, data lakes, and data lakehouses for structured and unstructured data

    + Design and develop analytical tools, algorithms, and programs to support data engineering activities like writing scripts and automating tasks

    + Ensure optimum performance and identify improvement opportunities

    Skills Required:

    + Google Cloud Platform, ETL, Apache Spark, Data Architecture, Python, SQL, Kafka (see the sketch below)
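
    A minimal, hypothetical sketch of how several of these required skills fit together: a PySpark Structured Streaming job (Python on Apache Spark) that reads events from a Kafka topic and lands them in Google Cloud Storage for downstream SQL modeling. The broker address, topic name, event schema, and bucket paths are illustrative placeholders, not details from this posting.

        from pyspark.sql import SparkSession
        from pyspark.sql.functions import from_json, col
        from pyspark.sql.types import StructType, StructField, StringType, TimestampType

        spark = SparkSession.builder.appName("kafka-etl-sketch").getOrCreate()

        # Hypothetical event schema; a real schema would come from the source systems.
        schema = StructType([
            StructField("event_id", StringType()),
            StructField("event_type", StringType()),
            StructField("event_ts", TimestampType()),
        ])

        # Read the raw event stream from Kafka (placeholder broker and topic).
        raw = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker:9092")
               .option("subscribe", "events")
               .load())

        # Parse the JSON payload into typed columns.
        events = (raw.selectExpr("CAST(value AS STRING) AS json")
                  .select(from_json(col("json"), schema).alias("e"))
                  .select("e.*"))

        # Land parsed events in a placeholder GCS bucket as Parquet for later analysis.
        query = (events.writeStream
                 .format("parquet")
                 .option("path", "gs://example-bucket/events/")
                 .option("checkpointLocation", "gs://example-bucket/checkpoints/events/")
                 .start())

        query.awaitTermination()

    Running something like this would also require the Spark Kafka connector and the GCS connector on the cluster; those setup details depend on the environment and are not specified here.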

    Skills Preferred:

    + Java, PowerShell, Data Acquisition, Data Analysis, Data Collection, Data Conversion, Data Integrity, Data/Analytics dashboards

    Experience Required:

    + Engineer 2 Experience: 4+ years of data engineering work experience

    Experience Preferred:

    + Data Pipeline Architecture & Development: Design, build, and maintain highly scalable, fault-tolerant, and performant data pipelines to ingest and process data from 10+ siloed sources, including both structured and unstructured formats.

    + ML-Driven ETL Implementation: Operationalize ETL pipelines for intelligent data ingestion, automated cataloging, and sophisticated normalization of diverse datasets.

    + Unified Data Model Creation: Architect and implement a unified data model capable of connecting all relevant data elements across various sources, optimized for efficient querying and insight generation by AI agents and chatbot interfaces.

    + Big Data Processing: Utilize advanced distributed processing frameworks (Apache Beam, Apache Spark, Google Cloud Dataflow) to handle large-scale data transformations and data flow.

    + Cloud-Native Data Infrastructure: Leverage GCP services to build and manage robust data storage, processing, and orchestration layers.

    + Data Quality, Governance & Security: Implement rigorous data quality gates, validation rules, bad record handling, and comprehensive logging. Ensure strict adherence to data security policies, IAM role management, and GCP perimeter security.

    + Automation & Orchestration: Develop shell scripts, Cloud Build YAMLs, and utilize Cloud Scheduler/PubSub for E2E automation of data pipelines and infrastructure provisioning.

    + Collaboration with AI/ML Teams: Work closely with AI/ML engineers, data scientists, and product managers to understand data requirements, integrate data solutions with multi-agentic systems, and optimize data delivery for chatbot functionalities.

    + Testing & CI/CD: Implement robust testing strategies, maintain high code quality through active participation in Git/GitHub, perform code reviews, and manage CI/CD pipelines via Cloud Build.

    + Performance Tuning & Optimization: Continuously monitor, optimize, and troubleshoot data pipelines and BigQuery performance using techniques like table partitioning, clustering, and sharding (see the sketch after this list).
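
    As one concrete illustration of the partitioning and clustering techniques mentioned above, the sketch below uses the google-cloud-bigquery Python client to create a day-partitioned, clustered events table. The project, dataset, table, and column names are hypothetical placeholders, not details from this posting.

        from google.cloud import bigquery

        client = bigquery.Client()

        # Hypothetical schema for an events fact table.
        schema = [
            bigquery.SchemaField("event_id", "STRING"),
            bigquery.SchemaField("customer_id", "STRING"),
            bigquery.SchemaField("event_type", "STRING"),
            bigquery.SchemaField("event_ts", "TIMESTAMP"),
        ]

        table = bigquery.Table("my-project.analytics.events", schema=schema)

        # Partition by day on the event timestamp so queries can prune old partitions.
        table.time_partitioning = bigquery.TimePartitioning(
            type_=bigquery.TimePartitioningType.DAY,
            field="event_ts",
        )

        # Cluster within each partition on common filter columns to reduce bytes scanned.
        table.clustering_fields = ["customer_id", "event_type"]

        client.create_table(table)

    Queries that filter on event_ts and these clustering columns then scan only the relevant partitions and blocks, which is the usual lever behind the BigQuery tuning described above.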

    Education Required:

    + Bachelor's Degree

    Education Preferred:

    + Certification Program

     

    Our success as a company is built on practicing inclusion and embracing diversity. HTC Global Services is committed to providing a work environment free from discrimination and harassment, where all employees are treated with respect and dignity. Together we work to create and maintain an environment where everyone feels valued, included, and respected. At HTC Global Services, our differences are embraced and celebrated. HTC is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills, and experiences within our workforce. HTC is proud to be recognized as a National Minority Supplier.

     
