Position: Senior Data Engineer
Location: Chicago, IL (Hybrid; 3 days onsite per week)
Duration: 1 year
Databricks, ADF
Role Overview
The Senior Data Engineer will design, build, and optimize scalable data pipelines and solutions leveraging Databricks, Python, ETL, and DevOps. This role requires strong experience with cloud-based data platforms and advanced data engineering practices to support analytics and business intelligence initiatives.
Key Responsibilities
- Architect and implement data pipelines using Databricks and Delta Lake for batch and streaming data (see the sketch after this list).
- Develop and maintain ETL/ELT workflows integrating diverse data sources into cloud data lakes and warehouses.
- Implement data governance, security, and compliance standards.
- Automate workflows using Python scripting and CI/CD pipelines.
- Troubleshoot and resolve issues in data ingestion, transformation, and storage.
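To make the Databricks and Delta Lake responsibility concrete, here is a minimal illustrative sketch of a streaming ingest into a Delta table with PySpark. The paths, schema, and app name are hypothetical placeholders, not part of this role's actual environment.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# Stream JSON events from a hypothetical landing path with an explicit schema.
events = (
    spark.readStream
    .format("json")
    .schema("id STRING, amount DOUBLE, ts TIMESTAMP")
    .load("/mnt/landing/events/")  # hypothetical mount point
)

# Light transformation, then an append-only write to a Delta table,
# with checkpointing so the stream can recover after a restart.
query = (
    events
    .withColumn("ingested_at", current_timestamp())
    .filter(col("amount") > 0)
    .writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events/")  # hypothetical path
    .outputMode("append")
    .start("/mnt/delta/events/")  # hypothetical target path
)
```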
Required Skills & Experience
- 8 years in data engineering or related roles.
- Expert-level Python programming for data processing and automation.
- Hands-on experience with Databricks and Delta Lake (see the sketch after this list).
- Strong knowledge of Azure Data Factory, Data Lake, and Synapse (or equivalent cloud platforms).
- Proficiency in SQL and performance tuning.
- Familiarity with DevOps practices, CI/CD, and infrastructure-as-code tools.
- Experience with data migration and cloud-native architectures.
- Excellent problem-solving and communication skills.
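As an illustration of the hands-on Delta Lake experience the role calls for, here is a minimal sketch of an idempotent upsert using the Delta Lake Python API. The table paths and join key are hypothetical, and the sketch assumes a Databricks runtime where Delta is available.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("upsert-sketch").getOrCreate()

# Hypothetical staged updates and Delta target table.
updates = spark.read.parquet("/mnt/staging/customers/")
target = DeltaTable.forPath(spark, "/mnt/delta/customers/")

# MERGE keeps the load idempotent: rerunning it updates matching rows
# instead of duplicating them, and inserts only genuinely new rows.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```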