Alerted.org

  • Senior Associate Data Engineering

    Publicis Groupe (Atlanta, GA)

    Company description

    Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting and customer obsession to accelerate our clients’ businesses through designing the products and services their customers truly value.

    Overview

    **Employer**: Sapient Corporation

    **Job Title**: Senior Associate Data Engineering

    **Job Requisition**: 6630.7898.6

    **Job Location**: 384 Northyards Blvd. NW, Atlanta, GA 30313. Will work from the company office in Atlanta, GA, and various unanticipated client sites and Sapient offices nationally; telecommuting available on a hybrid basis at company discretion.

    **Job Type**: Full Time

    **Rate of Pay**: $129,381.00 to $135,850.00 per year

    **Duties**:

      • Write scripts and troubleshoot them for performance using relevant programming languages.
      • Build a test automation framework for existing data ingestion code.
      • Work with large data sets and real-time/near real-time analytics on distributed big data platforms.
      • Perform data engineering functions including, but not limited to, data extraction, transformation, loading, and integration in support of the data warehouse.
      • Design and implement data ingestion, validation, and enrichment pipelines.
      • Build a performance testing framework for existing data ingestion pipelines.
      • Implement transaction history batch processing on a daily/weekly schedule.
      • Develop and implement ETL jobs using AWS Glue to extract, transform, and load data from various sources.
      • Demonstrate experience in data platforms involving implementation of end-to-end data pipelines; in data modelling, warehouse design, and fact/dimension implementations; in working with code repositories and continuous integration; in data modelling, querying, and optimization for relational, NoSQL, time-series, data warehouse, and data lake stores; and in implementing data pipelines for both streaming and batch integrations.
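
The ingestion, validation, and enrichment pipeline work described above can be sketched in plain Python. This is only an illustrative shape, not the employer's actual code: the record fields (`txn_id`, `amount`, `currency`) and the default currency are hypothetical, since the posting does not specify a schema.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class Transaction:
    """Hypothetical normalized record; the posting does not specify a schema."""
    txn_id: str
    amount: float
    currency: str


def validate(records: Iterable[dict]) -> Iterator[dict]:
    """Drop records missing an id or carrying a non-positive amount."""
    for rec in records:
        if rec.get("txn_id") and isinstance(rec.get("amount"), (int, float)) and rec["amount"] > 0:
            yield rec


def enrich(records: Iterable[dict], default_currency: str = "USD") -> Iterator[Transaction]:
    """Fill in a missing currency and normalize raw dicts into Transaction objects."""
    for rec in records:
        yield Transaction(
            txn_id=str(rec["txn_id"]),
            amount=float(rec["amount"]),
            currency=rec.get("currency", default_currency),
        )


def run_pipeline(raw: Iterable[dict]) -> list:
    """Ingest -> validate -> enrich: the three stages named in the duties."""
    return list(enrich(validate(raw)))
```

Given two raw records, one of which is missing its id, `run_pipeline` drops the invalid record and returns the surviving one as a normalized `Transaction` with the default currency filled in. The same staged shape scales up naturally to PySpark or AWS Glue, where each stage becomes a DataFrame transformation.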

    Qualifications

    Employer will accept a Bachelor’s degree in Computer Science, Information Technology, Engineering or a related field and three years of progressive, post-baccalaureate experience in the job offered or three years of progressive, post-baccalaureate experience in any occupation in which the required experience was gained.

    Position also requires three years of experience in each of the following:

    1. Design, develop, and manage data pipelines (both batch and streaming) using Python, PySpark, UNIX shell scripting, and SQL.

    2. Integrate AWS services including Lambda, DynamoDB, EC2, RDS, S3, Athena, Data Pipeline, API Gateway, Glue, and EMR to extract, transform, and load data from various sources and to improve accessibility and efficiency.

    3. Design, develop, and implement complex ETL processes using the Informatica PowerCenter ETL tool to pull sales/reporting/alerting data from source systems and perform various data transformations such as filtering, aggregation, sorting, routing, and joining.

    4. Use Git and Bitbucket for code versioning and check-ins to enable retrieval of any previous version of the code.

    5. Automate ETL testing using Informatica, Python, and Robot Framework, integrated with a Jira dashboard to reduce manual testing effort.

    6. Develop and optimize complex SQL queries in Oracle Database for performance improvement of the batch processes.

    7. Manage enterprise job scheduling using TWS (Tivoli Workload Scheduler), Autosys, Event Engine, and Control-M.

    8. Build CI/CD pipelines using Jenkins to automate build, test, and deployment processes, and integrate Jenkins with various tools including Git.

    9. Integrate XL Release with Jenkins, Git, and other CI/CD tools for an efficient release process.

    10. Develop and consume RESTful APIs using Python for system integrations and data exchange.

    11. Build scalable Python APIs using frameworks such as Flask and FastAPI for various data-driven applications.
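
The SQL optimization requirement (item 6) can be illustrated with Python's built-in `sqlite3` module. The posting names Oracle Database, so SQLite here is only a self-contained stand-in, and the table, column, and index names are hypothetical:

```python
import sqlite3

# In-memory database standing in for the Oracle environment named in the posting.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE txn_history (txn_id INTEGER PRIMARY KEY, account TEXT, amount REAL)"
)

# Hypothetical batch of 1,000 transaction rows spread across 100 accounts.
cur.executemany(
    "INSERT INTO txn_history (account, amount) VALUES (?, ?)",
    [(f"acct-{i % 100}", i * 0.5) for i in range(1000)],
)

# Without an index, filtering on `account` forces a full table scan.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM txn_history WHERE account = ?",
    ("acct-7",),
).fetchall()

# A covering index on (account, amount) lets the engine answer the query
# from the index alone, without touching the base table.
cur.execute("CREATE INDEX idx_txn_account ON txn_history (account, amount)")
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM txn_history WHERE account = ?",
    ("acct-7",),
).fetchall()

total = cur.execute(
    "SELECT SUM(amount) FROM txn_history WHERE account = ?", ("acct-7",)
).fetchone()[0]
```

On recent SQLite versions the first plan reports a full `SCAN` of `txn_history`, while the second reports a `SEARCH` using the covering index; the same scan-versus-seek distinction is what Oracle's `EXPLAIN PLAN` surfaces when tuning batch queries.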


© 2025 Alerted.org