Job Description:
Key Responsibilities
Design, develop, and maintain data ingestion pipelines using Databricks and Lakeflow Connect
Integrate data from various structured and unstructured sources into Delta Lake and other data storage systems
Implement real-time and batch ingestion workflows to support analytics and reporting needs (see the illustrative sketch after this list)
Optimize data ingestion performance, ensuring scalability, reliability, and cost efficiency
Collaborate with data architects, analysts, and business stakeholders to define data requirements and ingestion strategies
Ensure data quality, lineage, and governance compliance across the ingestion process
Automate data ingestion monitoring and error-handling mechanisms
Stay up to date with emerging Databricks Lakehouse and data integration technologies and best practices
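
As a rough illustration of the batch side of the pipeline work described above, here is a minimal PySpark sketch that lands a batch of raw files in a Delta Lake table. The app name, landing path, and table name are hypothetical placeholders, not details from this posting.

# Illustrative sketch only -- app name, paths, and table name are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-ingestion-example").getOrCreate()

# Read a batch of raw CSV files from a (hypothetical) landing zone.
raw_orders = (
    spark.read
    .option("header", "true")
    .csv("/mnt/landing/orders/")
)

# Append the batch to a Delta Lake table for downstream analytics and reporting.
raw_orders.write.format("delta").mode("append").saveAsTable("bronze_orders")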
Required Qualifications
Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field
4 years of experience in data engineering or ETL development
Hands-on experience with Databricks SQL, PySpark, and Delta Lake
Proficiency with Lakeflow Connect for building and managing data ingestion workflows
Strong understanding of data integration patterns, data modeling, and data lakehouse architectures
Experience with cloud platforms (Azure, AWS, or GCP) and associated data services
Knowledge of CI/CD, version control (Git), and infrastructure-as-code practices
Familiarity with data governance, security, and compliance standards
Preferred Skills
Experience with streaming technologies such as Kafka or Event Hubs (a minimal sketch follows this list)
Knowledge of REST APIs and connector-based ingestion
Exposure to machine learning data pipelines in Databricks
Strong problem-solving, communication, and collaboration skills
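
As a rough illustration of the streaming experience mentioned above, here is a minimal Spark Structured Streaming sketch that reads events from Kafka and writes them to a Delta table. The broker address, topic, checkpoint path, and table name are hypothetical placeholders.

# Illustrative sketch only -- broker, topic, checkpoint path, and table name are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("streaming-ingestion-example").getOrCreate()

# Continuously read events from a Kafka topic.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value")
)

# Stream the events into a Delta table, tracking progress with a checkpoint.
(
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")
    .outputMode("append")
    .toTable("bronze_events")
)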
Skills
Mandatory Skills: Databricks