Azure Data Engineer – Databricks, Lakehouse Architecture & Cloud Data Security

Synechron


Job Location:

Pune - India

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

Synechron is seeking a highly skilled Azure Data Engineer with extensive expertise in Databricks and Lakehouse architecture to lead the development of scalable, high-performance data pipelines supporting enterprise analytics. In this role, you will design, implement, and optimize large-scale data workflows across cloud platforms, ensuring compliance, security, and operational efficiency. Your contributions will enable data-driven decision-making and support our organization's digital transformation objectives through innovative data management practices.


Software Requirements

Required:

  • Proven experience with Azure Cloud Services, including Azure Data Factory, Azure Data Lake, and Azure Databricks (latest runtime versions preferred) (5 years)

  • Deep expertise in Databricks components: Delta Lake, Unity Catalog, Lakehouse architecture, Delta Live Pipelines, and Table Triggers

  • Strong hands-on knowledge of Spark and PySpark for large-scale data processing and transformation

  • Experience developing and managing Databricks Asset Bundles for deployment automation

  • Experience with SQL (Azure SQL Data Warehouse or relational databases) for data modeling and validation

  • Familiarity with GitLab or similar version control tools for collaboration and artifact management
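The Spark/PySpark and Delta Lake skills above can be sketched as a minimal bronze-to-silver pipeline step. This is an illustrative sketch, not a production pipeline: the storage path, column names, and table name are all hypothetical, and it assumes a Databricks runtime (or local Spark with the delta-spark package) where Delta is an available table format.

```python
from pyspark.sql import SparkSession, functions as F

# Assumes a Databricks cluster (or local Spark with delta-spark configured).
spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Hypothetical ADLS Gen2 landing path for raw JSON events.
bronze = spark.read.json("abfss://landing@mystorageacct.dfs.core.windows.net/orders/")

# Basic cleanup: de-duplicate, cast timestamps, drop invalid amounts.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("amount") > 0)
)

# Persist as a managed Delta table for downstream analytics.
silver.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_silver")
```

On Databricks, the same transformation could equally be expressed as a Delta Live Tables pipeline; the DataFrame logic stays the same.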


Preferred:

  • Experience with streaming frameworks like Spark Streaming or Structured Streaming

  • Knowledge of advanced Databricks runtime optimizations and configuration tuning

  • Exposure to end-to-end data governance and security compliance frameworks (e.g., GDPR, HIPAA)


Overall Responsibilities

  • Lead design and implementation of data pipelines supporting enterprise analytics, reporting, and AI integrations across the Azure cloud ecosystem

  • Build, test, and deploy Lakehouse solutions using Databricks, Delta Lake, and related components

  • Optimize big data workflows for performance, cost-efficiency, and scalability in cloud environments

  • Develop automated deployment strategies using Databricks Asset Bundles and CI/CD pipelines for continuous integration and delivery

  • Manage data security, access controls, and compliance using tools like Unity Catalog and Azure security features

  • Collaborate with analytics, data science, and security teams to incorporate AI/ML models into data workflows

  • Conduct performance tuning, troubleshoot issues, and continuously improve data quality and pipeline reliability

  • Maintain detailed documentation of architecture, data flows, security policies, and system configurations

  • Lead efforts to migrate legacy systems to cloud-native architectures supporting scalability and operational resilience
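The deployment-automation responsibility above centers on Databricks Asset Bundles. A minimal bundle configuration might look like the following sketch; the bundle name, workspace host, and notebook path are hypothetical:

```yaml
# databricks.yml — minimal Asset Bundle sketch (all names and hosts hypothetical)
bundle:
  name: orders-pipeline

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net

resources:
  jobs:
    orders_silver_job:
      name: orders-silver-job
      tasks:
        - task_key: build_silver
          notebook_task:
            notebook_path: ./notebooks/build_silver
```

Deployment is then driven from the Databricks CLI (e.g. `databricks bundle deploy -t dev`), which makes the bundle a natural unit for CI/CD pipelines.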


Technical Skills (By Category)

Programming and Data Processing (Essential):

  • PySpark, Spark SQL, Delta Lake (latest runtime preferred)

  • Python for data scripting and automation

  • SQL: Azure SQL Data Warehouse or equivalent
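As a toy illustration of the SQL data-validation work listed above, the check below uses SQLite from Python's standard library as a stand-in for Azure SQL; the `orders` table and the positive-amount rule are hypothetical:

```python
import sqlite3

# SQLite stands in for Azure SQL here; table and rule are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, -5.0), (3, 7.5)])

# Validation rule: no order may have a non-positive amount.
bad_rows = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount <= 0"
).fetchone()[0]

print(bad_rows)  # count of rows violating the rule
```

In a real pipeline the same rule would typically run against the warehouse itself and feed a data-quality dashboard or alert rather than a print statement.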


Frameworks & Libraries:

  • Databricks Delta Lake, Unity Catalog, Delta Live Pipelines

  • Spark Structured Streaming (preferred)

  • Data governance and metadata management tools


Cloud Technologies:

  • Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Synapse (preferred)

  • Cloud security best practices and access management


Data Management & Governance:

  • Data lineage, data quality tools, and security policies compliant with applicable regulations

DevOps & Automation:

  • CI/CD pipelines using Azure DevOps, GitLab, or Jenkins

  • Infrastructure as Code: Terraform, Azure Resource Manager (ARM) templates
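For the Infrastructure-as-Code item, a Terraform sketch of provisioning a Databricks workspace with the `azurerm` provider might look like this; the resource names and region are hypothetical:

```hcl
# Sketch only: assumes the azurerm provider is configured with valid credentials.
resource "azurerm_resource_group" "data" {
  name     = "rg-data-dev"
  location = "westeurope"
}

resource "azurerm_databricks_workspace" "this" {
  name                = "dbw-analytics-dev"
  resource_group_name = azurerm_resource_group.data.name
  location            = azurerm_resource_group.data.location
  sku                 = "premium"
}
```

Keeping workspace provisioning in Terraform lets the same CI/CD pipeline that deploys Asset Bundles also manage the environments they deploy into.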


Experience Requirements

  • 5 years of experience designing, developing, and supporting large-scale data pipelines within cloud environments, preferably Azure

  • Proven expertise in Databricks and Lakehouse architecture in enterprise settings

  • Demonstrated experience integrating AI/ML workflows within data pipelines (preferred)

  • Sound knowledge of data governance, security, and compliance practices supporting enterprise standards

  • Experience supporting migration from legacy data platforms to cloud-native architectures


Day-to-Day Activities

  • Architect and develop scalable data workflows supporting enterprise analytics, reporting, and AI initiatives

  • Build, tune, and optimize Delta Lake and Lakehouse data solutions for performance and reliability

  • Automate data pipeline deployments and infrastructure using Asset Bundles and CI/CD tools

  • Manage data security, privacy, and compliance leveraging Unity Catalog and cloud security features

  • Troubleshoot and resolve production issues, analyze system bottlenecks, and optimize data workflows

  • Collaborate with data scientists, BI teams, and enterprise architects to refine infrastructure and data models

  • Document architecture, data lineage, and security controls for audit and compliance readiness

  • Support data migration, platform upgrades, and cloud infrastructure provisioning


Qualifications

  • Bachelor's or Master's degree in Data Engineering, Computer Science, or a related field

  • 5 years supporting large-scale data pipelines, cloud data architectures, and analytics platforms

  • Certifications such as Azure Data Engineer Associate or equivalent are preferred

  • Hands-on experience supporting regulated and secure data environments in enterprise organizations


Professional Competencies

  • Strong analytical and troubleshooting skills for complex data workflows and pipelines

  • Excellent stakeholder communication and collaboration skills

  • Leadership qualities for mentoring team members and guiding best practices

  • Adaptability to evolving data technologies, security standards, and industry regulations

  • Results-oriented focus on operational excellence, security, and scalability in data environments

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that, as a global company, a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.



Required Experience:

Staff IC


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity and innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...
