One of our clients is looking for a forward-thinking Data Engineer to build a robust Data Vault 2.0 architecture on Snowflake, delivering valuable data products to both internal and external customers.
Key Responsibilities:
Data Vault 2.0 Implementation:
- Design and implement Data Vault 2.0 on Snowflake.
- Develop and maintain hubs, links, and satellites.
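For candidates less familiar with the hubs/links/satellites terminology above: each hub row in Data Vault 2.0 is typically identified by a hash of its business key. The sketch below illustrates that computation in Python; the hardening rules (trim + upper-case) and the choice of MD5 are common conventions, not this client's actual standard.

```python
import hashlib

def hub_hash_key(business_key: str) -> str:
    """Compute a Data Vault 2.0-style hash key from a business key.

    Illustrative assumptions: keys are hardened by trimming whitespace
    and upper-casing before hashing, and MD5 is used as the hash.
    """
    hardened = business_key.strip().upper()
    return hashlib.md5(hardened.encode("utf-8")).hexdigest()

# A hub row would then carry this hash key alongside the raw business
# key, a load timestamp, and a record-source column.
```

The same idea extends to links (hash of the concatenated parent hub keys) and to satellites (hash diffs to detect changed attributes).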
Data Integration:
- Build data pipelines for various data sources (databases, APIs, streaming).
- Implement ETL/ELT processes for data loading, transformation, and storage.
Data Marts:
- Design data marts for business intelligence and analytics.
- Collaborate with stakeholders to deliver tailored data products.
Performance Optimization:
- Optimize data models, queries, and storage for performance and cost.
- Monitor and troubleshoot data pipeline and warehouse performance.
Collaboration & Communication:
- Work with analysts and stakeholders to gather requirements and deliver solutions.
Documentation & Standards:
- Maintain documentation in Confluence.
- Follow best practices and company data standards.
Required Skills and Qualifications:
Technical Expertise:
- Experience with Data Vault 2.0 and Snowflake.
- Proficient in SQL and ETL/ELT processes.
- Familiarity with streaming technologies such as Kafka or Kinesis.
Programming & Tools:
- Proficient in Python, Java, or similar languages.
- Experience with dbt is a plus.
Analytical Skills:
- Strong problem-solving skills and ability to design efficient data models.
If this role sounds like a good fit for you, I’d love to hear from you. Feel free to reach out to me directly or apply here!