Purpose
The Senior Data Engineer plays a pivotal role in driving the design, development, and optimization of robust, scalable, and high-performance data products and pipelines on Databricks (Spark/Delta Lake/Unity Catalog). This role is critical in enabling a domain-oriented data mesh architecture, ensuring exceptional data quality, governance, reliability, and cost efficiency.
As a Senior Data Engineer, you will not only build and maintain complex data solutions but also provide technical guidance and support to junior engineers, contribute to best practices across the team, and act as a bridge to business stakeholders, translating complex business requirements into technical solutions and effectively communicating technical concepts and challenges to non-technical audiences. You will actively contribute to the technical evolution of our Data Mesh, driving business impact through AI-ready data. Additionally, you will be critical in onboarding emerging technologies such as SAP Business Data Cloud and Data Sphere.
Accountabilities
- Design, develop, and optimize complex data pipelines and model AI-ready data products using Databricks (and potentially, later, SAP Business Data Cloud), ensuring scalability, reliability, and performance (see the illustrative pipeline sketch after this list).
- Implement and refine advanced ETL/ELT processes, adhering to best practices and optimizing for efficiency and data integrity.
- Act as a liaison with business stakeholders, proactively engaging to understand complex requirements, clarify definitions, identify issues, and effectively communicate technical challenges and solutions in an accessible manner.
- Conduct thorough code reviews, provide constructive feedback, and ensure adherence to high software engineering standards and technical documentation practices.
- Provide technical guidance and support to junior and mid-level data engineers, fostering a culture of continuous learning and technical excellence.
- Champion data quality and governance standards, implementing robust monitoring and validation frameworks.
- Actively participate in architectural discussions and contribute to the technical evolution of the data mesh and data product strategy, including evaluating and integrating new technologies.
- Design and implement data integration solutions with systems such as SAP S/4HANA, BW/4HANA, ECC, Workday, ONIT, etc., ensuring seamless data flow and accessibility.
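To make the pipeline and data quality accountabilities above more concrete, here is a minimal, purely illustrative PySpark/Delta Lake sketch of the kind of work involved on Databricks. All table, schema, and column names (raw.orders, finance.curated.orders, order_id, and so on) are hypothetical placeholders, not part of this role description.

```python
# Minimal illustrative sketch only. Assumes a Databricks runtime with Delta Lake
# and Unity Catalog; all table and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest a hypothetical raw landing table.
raw = spark.read.table("raw.orders")

# Curate: deduplicate, normalize types, derive a simple business measure.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)

# Simple data quality gate: fail the run if the business key is ever null.
null_keys = curated.filter(F.col("order_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"Data quality check failed: {null_keys} rows with null order_id")

# Publish the curated data product as a Delta table governed by Unity Catalog
# (the three-level name finance.curated.orders is a placeholder).
curated.write.format("delta").mode("overwrite").saveAsTable("finance.curated.orders")
```

In practice, steps like these would be orchestrated with Databricks Workflows or Delta Live Tables rather than run ad hoc, and the quality checks would typically live in a shared validation framework.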
Professional Experience:
- 9 years of progressive experience in data engineering, with a significant portion in a senior individual contributor capacity delivering data products.
- Proven track record of successfully delivering complex data engineering projects from inception to production.
- Experience engaging with non-technical audiences, as well as guiding and supporting junior engineers and contributing to technical initiatives.
- Multiple years of business or IT experience, ideally related to corporate functions (Finance, HR, Procurement, Legal).
Technical Expertise:
- Expert-level proficiency in SQL and Python (including PySpark), with a strong emphasis on modular code, testing, and robust software engineering practices.
- Deep hands-on expertise with Databricks in the cloud: Spark, Delta Lake, Unity Catalog, Delta Live Tables, Workflows, Repos/CLI, Asset Bundles (see the illustrative Delta Live Tables sketch after this list). Familiarity with MLflow is a plus.
- Extensive experience with data warehousing and lakehouse patterns, advanced dimensional and domain-oriented data modeling, and complex ETL/ELT design and orchestration using platforms like Fivetran, SnapLogic, or Glue.
- Demonstrated experience extracting and transforming data from SAP S/4HANA, SAP BW/4HANA, ECC, and related systems (IDocs, BAPIs, OData services).
- Direct experience with SAP Business Data Cloud and Data Sphere is highly desirable.
- Proficiency with version control systems (Git, preferably GitLab) and advanced branching/merge request workflows.
- Expertise in CI/CD pipelines and deployment automation for data platforms.
- Strong understanding and practical application of data quality principles, data governance, and data security.
- Experience with cloud platforms (AWS preferred) and their data-related services.
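As an illustrative companion to the Databricks stack listed above, the sketch below shows how Delta Live Tables expectations can encode data quality rules declaratively. The table names, the expectation rule, and the source table are hypothetical, and the code only runs inside a Databricks DLT pipeline, where the `spark` session is provided by the runtime.

```python
# Illustrative Delta Live Tables sketch; runs only as part of a Databricks DLT
# pipeline (the `spark` session is supplied by the DLT runtime).
# All table names and the expectation rule are hypothetical.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Hypothetical bronze ingest of supplier master data")
def suppliers_bronze():
    return spark.read.table("raw.suppliers")

@dlt.table(comment="Cleaned supplier records for downstream data products")
@dlt.expect_or_drop("valid_supplier_id", "supplier_id IS NOT NULL")
def suppliers_silver():
    # Rows failing the expectation are dropped; country codes are standardized.
    return (
        dlt.read("suppliers_bronze")
           .withColumn("country", F.upper(F.col("country")))
    )
```

Equivalent checks can also be enforced outside DLT with plain PySpark assertions or a dedicated data quality framework, depending on how the pipeline is orchestrated.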
Qualifications:
Bachelor's or Master's degree in Computer Science, Information Systems, or a related Data & Analytics field.
Additional Information:
Note: Syngenta is an Equal Opportunity Employer and does not discriminate in recruitment, hiring, training, promotion, or any other employment practices for reasons of race, color, religion, gender, national origin, age, sexual orientation, gender identity, marital or veteran status, disability, or any other legally protected status.
Follow us on: LinkedIn
Remote Work:
No
Employment Type:
Full-time