Data Architect - Remote / Telecommute

Cynet Systems

Job Location: All cities, AK, USA

Posted on: 2024-11-16T07:36:12Z

Job Description:

Pay Range: $65/hr - $70/hr

Responsibilities:
  • Data Architecture Design: Design and develop scalable, high-performance data architecture solutions using PySpark, ADF, and Power BI to support business intelligence, analytics, and reporting needs.
  • Data Pipeline Development: Build and manage robust data pipelines using PySpark and Azure Data Factory, ensuring efficient data extraction, transformation, and loading (ETL) processes across various data sources.
  • Data Modeling: Develop and maintain data models that optimize query performance and support the needs of analytics and reporting teams.
  • Integration and Automation: Design and implement integration strategies to automate data flows between systems and ensure data consistency and accuracy.
  • Collaboration: Work closely with data engineers, data analysts, business intelligence teams, and other stakeholders to understand data requirements and deliver effective solutions.
  • Data Governance and Security: Ensure data solutions adhere to best practices in data governance, security, and compliance, including data privacy regulations and policies.
  • Performance Optimization: Continuously monitor and optimize data processes and architectures for performance, scalability, and cost-efficiency.
  • Reporting and Visualization: Utilize Power BI to design and develop interactive dashboards and reports that provide actionable insights for business stakeholders.
  • Documentation: Create comprehensive documentation for data architecture, data flows, ETL processes, and reporting solutions.
  • Troubleshooting and Support: Provide technical support and troubleshooting for data-related issues, ensuring timely resolution and minimal impact on business operations.
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Experience:
  • 15+ years of experience in data architecture and engineering, with a focus on PySpark, ADF, and Power BI.
  • Proven experience in designing and implementing data pipelines, ETL processes, and data integration solutions.
  • Strong experience in data modeling and data warehouse design.
Technical Skills:
  • Proficiency in PySpark for big data processing and transformation.
  • Extensive experience with Azure Data Factory (ADF) for data orchestration and ETL workflows.
  • Strong expertise in Power BI for data visualization, dashboard creation, and reporting.
  • Knowledge of Azure services (e.g., Azure Data Lake, Azure Synapse) and other relevant cloud-based data technologies.
  • Strong SQL skills and experience with relational databases.
Soft Skills:
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
  • Ability to manage multiple priorities in a fast-paced environment.
Preferred Qualifications / Certifications:
  • Microsoft certifications related to Azure, Power BI, or data engineering are a plus.
  • Experience in a similar role within a large enterprise environment is preferred.
Apply Now!
