GCP Data Engineer (MUST HAVE DASH + Healthcare experience)
Job Details:



BPO Recruit

Job Location: New York, NY, USA

Posted on: 2025-01-14T15:12:53Z

Job Description:

We are recruiting for an ICT company. The role is open to candidates anywhere in the US.

The candidate MUST have DASH & Healthcare experience.

As a Data Engineer, you will transform raw data into a usable format that other teams can analyze further. Cleansing, organizing, and manipulating data through pipelines are key responsibilities of the role. You will also apply data engineering principles on Google Cloud Platform to optimize its services, and create interactive dashboards and reports to present results to stakeholders.

Role and Responsibilities:

Create and maintain optimal data pipeline architecture.

Assemble large, complex data sets that meet functional / non-functional business requirements.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other data technologies (see the extract-load-transform sketch after this list).

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Design intuitive and visually appealing visualizations to communicate complex data insights to users, using Tableau, Streamlit, Dash Enterprise, and Power BI (see the Dash sketch after this list).

Use Flask to develop APIs that integrate data from various sources and facilitate automation of business processes (see the Flask sketch after this list).

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
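
For context on the extract-load-transform responsibility above, here is a minimal sketch using the google-cloud-bigquery client. The bucket, project, dataset, and table names are placeholders invented for illustration, not details from this posting.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Extract/load: ingest a raw CSV export from Cloud Storage into a staging table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/claims.csv",    # assumed source path
        "example_project.staging.claims_raw",    # assumed staging table
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",
        ),
    )
    load_job.result()

    # Transform: materialize a cleansed table with SQL.
    client.query(
        """
        CREATE OR REPLACE TABLE example_project.analytics.claims AS
        SELECT member_id,
               SAFE_CAST(claim_amount AS NUMERIC) AS claim_amount,
               claim_date
        FROM example_project.staging.claims_raw
        WHERE member_id IS NOT NULL
        """
    ).result()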
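
Since Dash experience is the headline requirement, here is a minimal, hypothetical Dash app showing the interactive-dashboard pattern (a dropdown filter driving a chart through a callback). The dataset and column names are illustrative assumptions.

    import pandas as pd
    import plotly.express as px
    from dash import Dash, dcc, html, Input, Output

    # Illustrative claims-style dataset (assumed for the example).
    df = pd.DataFrame({
        "month": ["Jan", "Feb", "Mar", "Jan", "Feb", "Mar"],
        "region": ["East", "East", "East", "West", "West", "West"],
        "claims": [120, 135, 128, 98, 101, 110],
    })

    app = Dash(__name__)
    app.layout = html.Div([
        html.H3("Monthly Claims by Region"),
        dcc.Dropdown(options=sorted(df["region"].unique()), value="East", id="region"),
        dcc.Graph(id="claims-chart"),
    ])

    @app.callback(Output("claims-chart", "figure"), Input("region", "value"))
    def update_chart(region):
        # Filter to the selected region and render a simple bar chart.
        return px.bar(df[df["region"] == region], x="month", y="claims")

    if __name__ == "__main__":
        app.run(debug=True)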
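
For the Flask responsibility, here is a minimal sketch of an API endpoint that could sit in front of integrated data. The route, payload shape, and in-memory data source are assumptions made for illustration.

    from flask import Flask, jsonify

    app = Flask(__name__)

    # Stand-in for data pulled from BigQuery or another integrated source.
    MEMBERS = [
        {"member_id": 1, "plan": "HMO"},
        {"member_id": 2, "plan": "PPO"},
    ]

    @app.get("/api/members")
    def list_members():
        # Expose the integrated dataset as JSON for dashboards or automation jobs.
        return jsonify(MEMBERS)

    if __name__ == "__main__":
        app.run(port=5000, debug=True)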

Candidate Profile:

Required Qualification and Skills:

Expert in Dash Enterprise for developing user-friendly UIs, with strong data storytelling experience.

Ability to conduct real-time data exploration, integrate predictive modeling, and seamlessly deploy machine learning modules to GCP.

Overall 5-8 years of experience with ETL technologies

3+ years of experience with data engineering technologies such as SQL, Hadoop, BigQuery, Dataproc, and Composer

Strong Python/PySpark data engineering skills (a brief PySpark sketch follows this list)

Hands-on experience with data visualization tools such as Tableau, Power BI, Dash, and Streamlit

Experience in building interactive dashboards, visualizations, and custom reports for business users.

Knowledge of Flask for developing APIs and automating data workflows

Experience with data automation and implementing workflows in a cloud environment.

Strong SQL ETL skills

GCP Data Engineer certification preferred.

Ability to understand and design the underlying data/schema

Strong communication skills to deliver client updates effectively.
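
As a flavor of the Python/PySpark skills listed above, here is a minimal, hypothetical Dataproc-style cleansing job. The bucket paths and column names are placeholders, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("claims-cleanse").getOrCreate()

    # Read raw CSV records, standardize types, and drop rows missing a member id.
    raw = spark.read.option("header", True).csv("gs://example-bucket/raw/claims/")
    clean = (
        raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
           .filter(F.col("member_id").isNotNull())
    )

    # Write the cleansed data to a curated zone for BigQuery ingestion or analytics.
    clean.write.mode("overwrite").parquet("gs://example-bucket/curated/claims/")

    spark.stop()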

Apply Now!
