The Role
Bottomline is looking for a Data Engineer to grow with us in a hybrid environment including occasionally working out of our Portsmouth, NH office!
The data engineer is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They play a crucial role in building and managing the data pipelines that enable efficient and reliable data integration, transformation, and delivery for all data users across the enterprise.
This person must be based in the United States.
How you'll contribute:
- Design and develop data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems.
- Collaborate with data scientists and analysts to optimize models and algorithms for data quality, security, and governance.
- Integrate data from different sources, including databases, data warehouses, APIs, and external systems.
- Ensure data consistency and integrity during the integration process, performing data validation and cleaning as needed.
- Transform raw data into a usable format by applying data cleansing, aggregation, filtering, and enrichment techniques.
- Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.
- Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance.
- Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data.
- Take authority, responsibility, and accountability for exploiting the value of enterprise information assets and the analytics used to render insights for decision making, automated decisions, and augmentation of human performance.
- Collaborate with leaders to establish the vision for managing data as a business asset.
- Establish the governance of data and algorithms used for analysis, analytical applications, and automated decision making.
What will make you successful:
- A bachelor's degree in computer science, data science, software engineering, information systems, or related quantitative field; master's degree preferred.
- At least six years of work experience in data management disciplines, including data integration, modeling, optimization, and data quality, or other areas directly relevant to data engineering responsibilities and tasks.
- Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake)
- Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data to support AI, ML, and BI
- Strong proficiency in programming languages such as Java, Python, or C/C++, or other scripting languages
- Previous experience with languages/tools such as SQL
- Significant experience working with ETL processes.
- Experience working in a structured development environment (i.e., one that follows a standard SDLC process).
- Proficiency in the design and implementation of modern data architectures and concepts such as cloud services (AWS, Azure, GCP) and modern data warehouse tools (Snowflake, Databricks)
- Experience with database technologies such as SQL, NoSQL, Oracle, Hadoop, or Teradata
- Knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines (nice to have).
- Ability to collaborate within and across teams of different technical knowledge to support delivery and educate end users on data products.
- Expert problem-solving skills, including debugging skills, allowing the determination of sources of issues in unfamiliar code or systems, and the ability to recognize and solve repetitive problems.
- Excellent business acumen and interpersonal skills; able to work across business lines at a senior level to influence and effect change to achieve common goals.
- Ability to describe business use cases/outcomes, data sources and management concepts, and analytical approaches/options.
- Ability to translate among the languages used by executive, business, IT, and quant stakeholders.
#LifeAtBottomline