ETL/Data Architect - MSys Inc. (Jersey City, NJ)
Job summary:
Title: ETL/Data Architect
Location: Jersey City, NJ, USA
Length and terms: Long term - W2 or C2C
Position created on 05/21/2025 01:45 pm
Job description:
*** Webcam interview *** Long term project *** Hybrid ***
Description:
About the Role:
We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate combines strong technical expertise, hands-on experience, and excellent communication skills to deliver enterprise-grade data solutions on Azure and Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.
Key Responsibilities:
+ Architect and implement enterprise metadata-driven data pipelines using ETL tools like Azure Data Factory (ADF) and Informatica.
+ Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring robust, scalable, and high-performing architecture.
+ Collaborate with stakeholders to integrate and optimize Guidewire CDA data within the data lake architecture, enabling advanced analytics and reporting.
+ Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
+ Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
+ Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
+ Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
+ Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
+ Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
+ Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
+ Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.
Required Qualifications:
+ 13+ years of experience in data architecture, data engineering, and/or ETL development roles.
+ Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
+ Experience leveraging Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
+ Ability to build end-to-end metadata-driven frameworks and continuously optimize existing workflows for improved performance, scalability, and efficiency.
+ Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
+ Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
+ Hands-on expertise in building and managing ODS systems from data lakes.
+ Demonstrated ability to design solutions for Azure Cloud Cost Optimization.
+ Excellent communication skills to engage with technical and business stakeholders effectively.
Contact the recruiter working on this position:
The recruiter working on this position is Sandeep (Shaji Team) Maraganti.
His/her contact email is [email protected].
Our recruiters will be more than happy to help you get this contract.