Position: DataStage Developer
Location: New York City, NY; Troy, MI; or Cleveland, OH (onsite from day one, 3 days a week)
Duration: Full Time
Job Responsibilities:
Technical Leadership - Lead data integration across the enterprise through the design, build, and implementation of large-scale, high-volume, high-performance data pipelines for both on-prem and cloud data lakes and data warehouses. Lead the development and documentation of technical best practices for ELT/ETL activities. Also, oversee the inception of a program to build a new product if needed.
Solution Design - Lead the design of technical solutions, including code, scripts, data pipelines, and processes/procedures, for the integration of data lake and data warehouse solutions in an operational IT environment.
Code Development - Ensure data engineering activities are aligned with scope, schedule, priority, and business objectives. Oversee code development and unit and performance testing activities. Responsible for coding and for leading the team to implement the solution.
Testing - Lead validation efforts by verifying data at the intermediate stages between source and destination and by assisting others in validating that the solution performs as expected. Meet or exceed all operational readiness requirements (e.g., operations engineering, performance, and risk management). Ensure compliance with applicable federal, state, and local laws and regulations. Complete all required compliance training. Maintain knowledge of and adhere to Flagstar's internal compliance policies and procedures. Take responsibility for keeping up to date with changing regulations and policies.
Job Requirements:
- 10 years of experience designing, developing, testing, and implementing Extract, Transform, and Load (ETL/ELT) solutions using DataStage.
- 10 years of experience developing and implementing data integration, data lake, and data warehouse solutions in on-premises and cloud environments.
- 10 years of experience with various Software Development Life Cycle methods such as Agile, Scrum, Waterfall, etc.
- 3+ years of experience in a 100+ TB data environment.
- Proven experience developing and maintaining data pipelines and ETL jobs using IBM DataStage, Informatica, Matillion, Fivetran, Talend, or dbt.
- Knowledge of AWS cloud services such as S3, EMR, Lambda, Glue, SageMaker, Redshift, and Athena, and/or Snowflake.
- Experience in data modeling for self-service business intelligence, advanced analytics, and user applications.
- Ability to communicate complex technical concepts by adjusting messaging to the audience: business partners, IT peers, external stakeholders, etc.
- Proven ability to design and build technical solutions using applicable technologies; ability to demonstrate exceptional data engineering skills.
- Ability to prioritize work by dividing time, attention, and effort between the current project workload and ongoing day-to-day activities.
- Demonstrated strength in adapting to changes in processes, procedures, and priorities.
- Proven ability to establish a high level of trust and confidence in both the business and IT communities.
- Strong teamwork and interpersonal skills at all management levels.
- Proven ability to manage to a project budget.
- Experience applying agile practices to solution delivery.
- Must be team-oriented and have excellent oral and written communication skills.
- Strong analytic and problem-solving skills.
- Good organizational and time-management skills.
- Experience in strategic thinking and solutioning.
- Must be a self-starter who can identify existing bottlenecks and come up with innovative solutions.
- Demonstrated ability to work with key stakeholders outside the project to understand requirements/resolve issues.
- Experience with data model design and writing complex SQL queries, with a good understanding of BI/DWH principles.
- Expertise in Relational Database Management System, Data Mart and Data Warehouse design.
- Expert-level SQL development skills in a multi-tier environment.
- Expertise in flat file formats, XML within PL/SQL, and file format conversion.
- Strong understanding of SDLC and Agile Methodologies.
- Strong understanding of model driven development.
- Strong understanding of ETL best practices.
- Proven strength in interpreting customer business needs and translating them into application and operational requirements.
- Strong problem-solving skills and analytic skills with proven strength in applying root cause analysis.
Ramu B., Talent Workforce Specialist
[email protected] 703-###-####