Do you love a career where you can Experience, Grow & Contribute at the same time while earning at least 10% above the market? If so, we are excited to have bumped into you.
If you are a Databricks Workstream Lead looking for excitement, challenge, and stability in your work, you will be glad you came across this page.
We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
Check whether you are up for maximizing your earning and growth potential by leveraging our Disruptive Talent Solution.
Role: Databricks Workstream Lead
Location: Hyderabad (Hybrid)
Experience: 3-8 Years
Hybrid role
Requirements
We are seeking an experienced Offshore Delivery Lead to oversee and manage the end-to-end technical delivery of large-scale data engineering solutions. This role requires strong hands-on expertise in PySpark, Apache Spark, AWS, and Databricks, along with proven leadership in managing offshore delivery teams.
The ideal candidate will act as a technical leader, delivery owner, and client-facing coordinator, ensuring scalable, high-performance data solutions are delivered on time and within scope while maintaining quality and engineering best practices.
Key Responsibilities
Delivery & Program Management
Lead offshore data engineering teams and oversee end-to-end technical delivery.
Own sprint planning, delivery milestones, and execution within Agile frameworks.
Ensure on-time, high-quality delivery aligned with client expectations and SLAs.
Identify delivery risks and proactively implement mitigation strategies.
Technical Leadership
Provide hands-on technical guidance in PySpark, Spark, AWS, and Databricks.
Review solution designs, code, and implementations for performance, scalability, and reliability.
Define and enforce data engineering standards, coding guidelines, and best practices.
Drive reusable frameworks, performance optimization, and cloud-native designs.
Architecture & Solution Design
Design and guide implementation of scalable data pipelines and data platforms.
Architect ETL/ELT workflows using PySpark and Spark on Databricks (an illustrative sketch follows this list).
Ensure optimal use of AWS services.
Support batch and streaming architectures where applicable.
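For orientation only, here is a minimal sketch of the kind of batch ETL pipeline this role designs and reviews on Databricks. The bucket paths, column names, and schema are hypothetical placeholders, not a client deliverable or a prescribed implementation.

    # Illustrative only: a minimal PySpark batch ETL of the kind this role oversees.
    # Paths, columns, and the output location are hypothetical placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

    # Extract: raw JSON landed in object storage (placeholder path).
    raw = spark.read.json("s3://example-bucket/raw/orders/")

    # Transform: de-duplicate, enforce types, derive a partition column.
    orders = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)
    )

    # Load: write partitioned Delta output (Delta is the default table format on Databricks).
    (
        orders.write.format("delta")
              .mode("overwrite")
              .partitionBy("order_date")
              .save("s3://example-bucket/curated/orders/")
    )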
Stakeholder & Client Engagement
Act as the primary offshore point of contact for onsite leads and client stakeholders.
Translate business and functional requirements into clear technical deliverables.
Provide regular delivery status updates, metrics, and risk reports.
Collaborate closely with onsite teams for seamless execution.
Quality Assurance & Governance
Ensure adherence to data security, compliance, and governance standards.
Drive automated testing, data validation, and monitoring strategies.
Ensure robustness of production deployments and smooth handovers to support teams.
Team Leadership & Mentoring
Mentor and coach data engineers, helping them grow technically and professionally.
Conduct technical reviews, performance feedback, and knowledge-sharing sessions.
Build a high-performing, scalable offshore delivery team.
Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
8 years of overall experience in data engineering / big data solutions.
3 years of experience leading offshore delivery teams.
Strong hands-on expertise in PySpark, Apache Spark, AWS, and Databricks.
Strong experience designing and building ETL/ELT pipelines.
Solid understanding of data modeling, partitioning, and performance tuning.
Proven experience working in Agile / Scrum delivery models.
Excellent communication, leadership, and stakeholder management skills.
Nice-to-Have Skills
Experience with streaming technologies (Kafka, Spark Streaming, Kinesis).
Exposure to data governance, data quality, and metadata tools.
CI/CD pipeline experience for data platforms.
Knowledge of orchestration tools such as Airflow or AWS Step Functions.
Prior experience working with global or onsite-offshore delivery models.
Experience in domains such as finance, insurance, healthcare, or retail.
Benefits
Required Skills:
Offshore Delivery Lead to oversee the technical delivery. Experience with PySpark, Spark, AWS, and Databricks.
Required Education:
Bachelor's/Master's degree in Computer Science, Engineering, Information Technology, or a related field.