We're looking for a skilled and proactive Data Engineer to join our Data Engineering & BI team.
In this role, you will contribute to the design, development, and maintenance of our data infrastructure, working closely with both technical and business teams. You'll play an important part in building reliable, scalable data solutions with a strong focus on Snowflake, DBT, and SQL to support analytics and decision-making across the organization.
Your responsibilities will include:
- Design, implement, and maintain Snowflake architecture, including data modeling, security, performance tuning, and cost optimization.
- Build and maintain DBT models with clear structure, testing, and documentation, aligned with business logic and transformation best practices.
- Write, optimize, and maintain SQL queries across Snowflake and traditional RDBMS (Oracle, PostgreSQL, etc.).
- Develop and automate data pipelines using Python for data extraction, transformation, and loading.
- Use Git for version control and participate in CI/CD workflows.
- Collaborate with business stakeholders to understand data needs and deliver scalable self-service data solutions.
- Support reporting and BI platforms (such as Tableau), ensuring data quality, performance, and usability.
- Contribute to technical standards, best practices, and continuous improvement within the data team.
Qualifications:
Required:
- 3-5 years of experience in data engineering or a similar role.
- Strong SQL skills with the ability to write clear, efficient, and maintainable queries.
- Hands-on experience with Snowflake including data modeling and performance considerations.
- Practical experience using DBT for data transformations and model organization.
- Good understanding of relational databases and their performance characteristics.
- Proficiency in Python for scripting and automation.
- Experience with Git and collaborative development workflows.
- Good communication skills and the ability to work effectively with both technical and business teams.
Nice to have:
- Experience with Tableau or similar BI/reporting tools.
- Familiarity with data governance concepts (cataloging, lineage, RBAC).
- Exposure to event-based or streaming architectures (Kafka, Confluent, etc.).
- Experience working in Agile environments (Scrum, Kanban).
Additional Information:
SQ2
Remote Work:
No
Employment Type:
Full-time