Hadoop Developer
- Insight Global (Charlotte, NC)
Job Description
Insight Global is looking to hire a Hadoop Developer for one of our large financial clients in the Charlotte, NC area. We are looking for a mid-level to senior software developer to design, develop, maintain, and test software for the ECR RCF team, collaborating with the development team and business partners to ensure successful delivery and implementation of application solutions in a dynamic Agile environment. The candidate must be able to achieve subject matter expertise quickly on new applications. This role requires a proven track record of strong database skills on Oracle, SQL Server, and Exadata, along with performance tuning, ETL processes, and experience in Windows and Linux environments. Experience with multiple database platforms is preferred: SQL Server, Oracle, Exadata, and big data. Programming experience with Java, Hadoop/Big Data, Hadoop and Java scripting, Angular 5, React, and NodeJS is a plus.
Responsibilities include:
o Develop and support the Enterprise Credit Risk ETL platform
o Develop unit, integration, regression, and performance test scripts and test suites for the framework
o Follow the Agile development process: work refinement, estimation, retrospectives, and other rituals
o Interface with users and other team members to understand requirements and analyze and engineer solution options
o Communicate solutions effectively across teams and mentor junior team members
o Engage the Architecture team as needed during development and support
o Participate in POCs, evaluating tools and technologies as needed
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to [email protected]. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Skills and Requirements
Required Qualifications
• 5+ years of experience working with Hadoop and its ecosystem
• Proficiency in Hadoop and SQL
• Familiarity with big data frameworks such as Apache Spark and Kafka
• Strong understanding of Linux/Unix systems and shell scripting
• Knowledge of data security practices in Hadoop environments
• Excellent problem-solving and communication skills
• Ability to handle multiple tasks, lead the team through delivery, and adapt to a constantly changing environment
• Ability to learn quickly and work with minimal supervision