Role Summary
Syngenta is looking for a proactive and driven Data Architect to join our Cloud and Data Ops team. In this role, you will design the system architecture and solution, ensure the platform is scalable and performant, and build automated data pipelines.
Responsibilities:
- Design and lead implementation of end-to-end Databricks Lakehouse platforms using Delta Lake, Delta Live Tables, and MLflow.
- Architect the Medallion Architecture (Bronze/Silver/Gold) for structured, semi-structured, and streaming workloads.
- Implement governed Lakehouse patterns using Unity Catalog for access control, lineage, data classification, and secure sharing.
- Build scalable ETL/ELT pipelines using Databricks Notebooks, Workflows, SQL Warehouses, and Spark-based transformations.
- Develop real-time streaming pipelines with Auto Loader, Structured Streaming, and event-driven platforms (Kafka, Kinesis, Pub/Sub).
- Integrate Databricks with cloud-native services such as AWS Glue, Azure Data Factory, and GCP Dataform.
- Define distributed integration patterns using REST APIs, microservices, and event-driven architectures.
- Enforce data governance, RBAC/ABAC, encryption, secret management, and compliance controls.
- Optimize Delta Lake tables, Spark workloads, and cluster configurations using Photon and autoscaling patterns.
- Drive cloud cost optimization across storage, compute, and workflow orchestration.
- Participate in architecture reviews, set standards, and support engineering teams throughout execution.
- Stay current on Databricks capabilities, including Unity Catalog updates, Lakehouse Federation, serverless compute, and AI/ML features.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 10 years of experience in enterprise software, cloud architecture, or data engineering roles.
- Strong hands-on experience with Databricks, Apache Spark, Delta Lake, and Lakehouse platform design.
- Experience implementing and administering Unity Catalog for governance, lineage, and fine-grained access control.
- Experience designing the Medallion Architecture for analytics and engineering workloads.
- Hands-on experience with cloud platforms such as AWS, Azure, or GCP, including storage, compute, and networking services.
- Experience with streaming technologies such as Kafka, Kinesis, or Pub/Sub.
- Strong understanding of data modeling, workflow orchestration (Airflow, Databricks Workflows, dbt), and pipeline automation.
- Familiarity with Scala-based Spark workloads in addition to PySpark and SQL pipelines.
- Skilled in performance tuning, Spark optimization, cluster policies, and cloud cost management.
- Excellent communication skills for technical leadership and stakeholder collaboration.
- Certifications in Databricks, AWS Solutions Architect, or TOGAF are a plus.
Additional Information:
Note: Syngenta is an Equal Opportunity Employer and does not discriminate in recruitment, hiring, training, promotion, or any other employment practices for reasons of race, color, religion, gender, national origin, age, sexual orientation, gender identity, marital or veteran status, disability, or any other legally protected status.
Follow us on: Twitter & LinkedIn
Work: No
Employment Type: Full-time