ETL Specialist | Apache Hop | Azure & Databricks | Kafka, Spark, Hadoop | Python | Airflow

Synechron


Job Location: Pune, India

Monthly Salary: Not Disclosed
Posted on: 5 days ago
Vacancies: 1 Vacancy

Job Summary

The ETL Specialist at Synechron is a seasoned data engineering professional responsible for designing, developing, and maintaining robust ETL and data integration workflows. The role focuses on extracting, transforming, and loading data from diverse sources into target systems while maintaining high standards of data quality, security, and reliability. By leveraging expertise in modern data technologies and platforms, this position ensures scalable, efficient solutions that support business intelligence and analytics initiatives. The ETL Specialist plays a strategic role in advancing Synechron's data capabilities and enabling informed business decision-making.

Software Requirements


Required:

  • Apache Hop for ETL workflow design and management (proficient)

  • Experience with data streaming and processing platforms such as Kafka, Spark, and Hadoop

  • Version control using Git for source code management and collaborative workflows

  • Python programming for scripting and automation within data pipelines

  • Azure and Databricks platform experience for cloud-based data engineering

  • Airflow for orchestration and pipeline scheduling


Preferred:

  • Familiarity with additional cloud platforms or big data tools compatible with ETL processes

  • Advanced monitoring and error-handling tools for pipeline stability
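
The monitoring and error-handling requirement above is often met with a retry-with-backoff wrapper around flaky pipeline steps. The sketch below is a minimal, illustrative example using only Python's standard library; the decorator name, parameters, and the `flaky_extract` step are hypothetical, and production pipelines would typically rely on the retry and alerting features built into tools like Airflow or Apache Hop instead.

```python
import functools
import logging
import time


def with_retries(max_attempts=3, base_delay=0.1):
    """Retry a pipeline step with exponential backoff, logging each failure."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:
                    logging.warning("step %s failed (attempt %d/%d): %s",
                                    func.__name__, attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise  # exhausted retries: surface the error for alerting
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator


# Hypothetical extract step that succeeds on the third attempt.
attempts = {"n": 0}

@with_retries(max_attempts=3, base_delay=0)
def flaky_extract():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("source unavailable")
    return ["row1", "row2"]


print(flaky_extract())  # succeeds after two logged retries
```

The same pattern generalises to any extract or load step whose failures are transient (network sources, rate-limited APIs).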


Overall Responsibilities

  • Design, develop, and maintain ETL and data integration workflows using Apache Hop.

  • Extract data from multiple heterogeneous sources, transform it according to business rules, and load it into target systems effectively.

  • Optimize ETL pipelines to improve performance and ensure data integrity.

  • Implement monitoring, error-handling, and alerting mechanisms to maintain pipeline reliability.

  • Collaborate with data architects, analysts, and business stakeholders to gather and refine data requirements.

  • Leverage cloud platforms such as Azure and Databricks to build scalable data solutions.

  • Lead complex data engineering initiatives and provide technical guidance to peers.

  • Conduct code reviews and enforce best practices in data engineering and pipeline development.

  • Follow Git-based version control and continuous integration practices to maintain code quality and traceability.

Performance outcomes include delivering robust scalable ETL pipelines that meet business needs with minimal downtime and high data accuracy.
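
The extract/transform/load cycle described above can be sketched end to end in a few lines. This is an illustrative toy using only Python's standard library (`csv` and an in-memory SQLite target); the table, column names, and cleaning rule are invented for the example, and a real Synechron pipeline would implement the same stages in Apache Hop or Spark against production sources.

```python
import csv
import io
import sqlite3


def run_pipeline(raw_csv: str) -> int:
    """Toy ETL: extract rows from CSV text, clean them, load into SQLite."""
    # Extract: parse the source (here a string; in practice a file, API, or topic).
    rows = list(csv.DictReader(io.StringIO(raw_csv)))

    # Transform: apply a business rule -- normalise names, drop invalid rows.
    cleaned = [
        (int(r["id"]), r["name"].strip().title())
        for r in rows
        if r["id"].isdigit() and r["name"].strip()
    ]

    # Load: write into the target table, de-duplicating on the primary key.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    con.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?)", cleaned)
    con.commit()
    loaded = con.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
    con.close()
    return loaded


raw = "id,name\n1, alice \n2,BOB\nx,bad\n"
print(run_pipeline(raw))  # -> 2 (the row with non-numeric id is rejected)
```

Keeping each stage a separate, testable step is what makes pipelines like this easy to monitor and optimise.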


Technical Skills (By Category)

Programming Languages:

  • Essential: Python (proficient)

  • Preferred: Additional scripting or programming languages related to data engineering


Databases/Data Management:

  • Essential: Proficient in database querying and data manipulation through SQL

  • Preferred: Experience working with big data storage and query engines


Cloud Technologies:

  • Essential: Azure cloud platform and Databricks

  • Preferred: Experience with other cloud providers or hybrid-cloud architectures


Frameworks and Libraries:

  • Essential: Apache Hop, Apache Airflow, Apache Spark

  • Preferred: Experience with additional big data frameworks such as Hadoop


Development Tools and Methodologies:

  • Essential: Git for version control, code reviews, and collaboration

  • Preferred: Familiarity with CI/CD pipelines and DevOps practices


Security Protocols:

  • Preferred: Understanding of data governance, security, and compliance best practices


Experience Requirements

  • Minimum of 6 years of experience in ETL development, data integration, or related data engineering roles.

  • Proven track record designing and maintaining complex data pipelines using Apache Hop or equivalent tools.

  • Experience integrating with data platforms including Kafka, Spark, and Hadoop.

  • Demonstrated expertise with cloud technologies, particularly Azure and Databricks.

  • Hands-on experience implementing workflow orchestration and automation using Airflow.

  • Experience leading technical initiatives and collaborating across multidisciplinary teams.

  • Alternative pathways include significant contributions in big data or cloud engineering projects.


Day-to-Day Activities

  • Develop, test, and deploy ETL pipelines and data integration workflows.

  • Collaborate with business and technical teams to understand data requirements and translate them into ETL solutions.

  • Monitor pipeline performance and troubleshoot issues promptly.

  • Participate in daily stand-ups, sprint planning, and retrospective meetings as part of Agile teams.

  • Conduct code reviews, enforce coding standards, and contribute to documentation.

  • Lead efforts to optimize data processing and storage strategies for scalability and efficiency.

  • Provide technical mentorship and guidance to team members.

Decision-making focuses on technical design and operational reliability, with input from cross-functional stakeholders.

Qualifications

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience.

  • Professional certifications in data engineering, cloud platforms, or ETL tools are advantageous.

  • Ongoing professional development through training courses or certifications is encouraged.

Professional Competencies

  • Strong analytical and problem-solving skills applied to complex data challenges.

  • Leadership qualities with the ability to coach, mentor, and collaborate effectively in teams.

  • Clear, concise communication skills to engage technical and business audiences.

  • Adaptability to evolving technologies, methodologies, and business needs.

  • Innovative thinking to improve processes and pioneer data solutions.

  • Effective time management and prioritization in fast-paced environments.

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that, as a global company, a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.


All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.

Candidate Application Notice


Required Experience: IC


About Company


At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity with innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...
