Tasks
In this role, you will collaborate with customers on diverse data engineering projects, designing and maintaining scalable data pipelines. Ideal for those with a strong data engineering foundation and a passion for operational excellence, this position involves working closely with data scientists, analysts, and stakeholders to deliver impactful data solutions.
Responsibilities:
- Develop and manage scalable efficient data pipelines.
- Improve data quality, accuracy, and consistency.
- Automate repetitive tasks to enhance efficiency.
- Build and maintain data integrations across files, messaging, and APIs.
- Design data models aligned with business needs.
- Implement infrastructure-as-code solutions.
- Collaborate on data architecture and governance models.
- Optimize pipeline performance for faster data retrieval.
- Enforce security measures to protect sensitive data.
- Ensure compliance with data governance and regulatory policies.
Requirements
- A strong desire to learn and a foundational understanding of working with data sets.
- Experience with ETL workflows and an eagerness to refine your skills in optimizing these processes.
- A grasp of big data concepts, cloud technologies, and data modeling.
- Solid SQL and Python skills, with a focus on writing efficient queries.
- Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, and motivation to deepen your expertise.
- Technical skills: SQL, Python, Cloud Computing, Data Pipelines
- Soft skills: Teamwork, Communication, Problem Solving, Adaptability, Eagerness to Learn