Lead Data Integration Engineer
- Raymond James Financial, Inc. (St. Petersburg, FL)
Job Description
_This position follows our hybrid workstyle policy: Expected to be in a Raymond James office location a minimum of 10-12 days a month._
_Please note: This role is not eligible for Work Visa sponsorship, either currently or in the future._
Requirements & Responsibilities:
+ Deep expertise in Microsoft SQL Server, SSIS, and SQL development.
+ Strong proficiency in writing and optimizing complex stored procedures, functions, and packages.
+ Hands-on experience with Python for data manipulation, automation, and pipeline development.
+ Familiarity with Oracle databases and PL/SQL development is required for cross-platform data integration.
+ Experience in implementing CI/CD pipelines and DevOps practices for data solutions.
+ Understanding of data warehousing concepts, ETL methodologies, and data modeling techniques.
+ Experience with Unix and shell scripting.
+ Experience with job scheduler tools such as BMC Control-M.
+ Proven track record working in both waterfall and agile SDLC frameworks.
+ Knowledge of the Financial Services industry, including middle- and back-office functions.
+ Experience collaborating with business counterparts to understand detailed requirements.
+ Excellent verbal and written communication skills.
+ Produce and maintain detailed technical documentation for all development efforts.
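To give candidates a concrete sense of the Python pipeline work described above, here is a minimal sketch of an extract-transform-load step with logging and error handling. It uses `sqlite3` from the standard library purely as a stand-in for SQL Server, and all table and column names (`staging_trades`, `trades`) are hypothetical illustrations, not a Raymond James schema.

```python
import sqlite3
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def extract(conn):
    """Pull raw rows from the (hypothetical) staging table."""
    return conn.execute(
        "SELECT trade_id, amount, currency FROM staging_trades"
    ).fetchall()

def transform(rows):
    """Normalize currency codes and drop non-positive amounts."""
    return [(tid, amt, cur.upper()) for tid, amt, cur in rows if amt > 0]

def load(conn, rows):
    """Upsert cleaned rows into the target table in one batch."""
    conn.executemany(
        "INSERT OR REPLACE INTO trades (trade_id, amount, currency) "
        "VALUES (?, ?, ?)",
        rows,
    )
    conn.commit()

def run_pipeline(conn):
    """Run one ETL pass; log and re-raise on database errors."""
    try:
        rows = extract(conn)
        clean = transform(rows)
        load(conn, clean)
        log.info("loaded %d of %d rows", len(clean), len(rows))
        return len(clean)
    except sqlite3.Error:
        log.exception("pipeline failed; target left unchanged")
        raise

# Demo setup with an in-memory database and sample data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_trades (trade_id INTEGER, amount REAL, currency TEXT)")
conn.execute("CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)")
conn.executemany(
    "INSERT INTO staging_trades VALUES (?, ?, ?)",
    [(1, 100.0, "usd"), (2, -5.0, "eur"), (3, 250.0, "gbp")],
)
loaded = run_pipeline(conn)
```

In a production SSIS-adjacent setting, the same extract/transform/load separation and batch `executemany` load would apply, with the connection pointed at SQL Server via an appropriate driver.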
Skills:
+ MS SQL Server & SQL Proficiency: Deep expertise in writing and optimizing complex SQL queries, stored procedures, functions, and triggers is fundamental.
+ SSIS Expertise: In-depth knowledge of designing, developing, deploying, and maintaining ETL (Extract, Transform, Load) processes and packages using SQL Server Integration Services (SSIS). This includes robust error handling and logging mechanisms.
+ ETL & Data Warehousing: Strong understanding of ETL methodologies, data warehousing concepts (e.g., Kimball methodology, star schemas), and data modeling techniques (normalization/denormalization).
+ Performance Tuning: Ability to identify, investigate, and resolve database and ETL performance issues, including capacity and scalability planning.
+ Programming Languages: Proficiency in additional programming/scripting languages, such as Python or PowerShell/Shell scripting, for automation, data manipulation, and pipeline development.
+ Cloud & DevOps (Desired): Familiarity with cloud platforms (e.g., Azure Data Factory, AWS Glue, Google Cloud) and experience implementing CI/CD pipelines and DevOps practices for data solutions is a strong advantage.
+ Streaming (Plus): Exposure to streaming technologies such as Kafka is a plus.
+ Domain Experience (Preferred): Experience in financial services or enterprise-scale applications is preferred.
+ Soft Skills: Excellent communication, analytical, and problem-solving skills.
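As a small illustration of the star-schema and data-modeling concepts listed above, the sketch below shows the Kimball-style surrogate-key assignment used when loading a fact table against a dimension. The class and table names are hypothetical examples, not part of any specific warehouse.

```python
class DimensionManager:
    """Assigns surrogate keys for a dimension, Kimball-style:
    each natural key seen in the source gets a stable integer key."""

    def __init__(self):
        self._keys = {}   # natural key -> surrogate key
        self.rows = []    # dimension rows: (surrogate_key, natural_key)

    def surrogate_key(self, natural_key):
        """Return the surrogate key, inserting a new dimension row on first sight."""
        if natural_key not in self._keys:
            sk = len(self._keys) + 1
            self._keys[natural_key] = sk
            self.rows.append((sk, natural_key))
        return self._keys[natural_key]

def build_fact_rows(source_rows, dim_customer):
    """Replace natural customer ids in fact rows with surrogate keys."""
    return [
        (dim_customer.surrogate_key(cust), amount)
        for cust, amount in source_rows
    ]

# Demo: three source rows referencing two distinct customers.
dim = DimensionManager()
facts = build_fact_rows([("C100", 50.0), ("C200", 75.0), ("C100", 25.0)], dim)
```

Repeat references to the same natural key resolve to the same surrogate key, so the fact table joins cleanly to the dimension on an integer column rather than the source system's identifier.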