Enterprise Data Engineer | Cloud (AWS, Azure) | Big Data (Spark, Hadoop) | ETL & Data Pipelines | SQL & NoSQL
Job Summary
Synechron is seeking a proficient Data Engineer to support the design, development, and maintenance of scalable, efficient data pipelines and enterprise data solutions. The role involves collaborating with cross-functional teams to gather requirements, implement data management strategies, and ensure data quality, security, and availability. The Data Engineer will leverage experience in cloud platforms, big data tools, and modern development practices to enable data-driven decision-making and operational excellence across the organization.
Software Requirements
Required:
Strong understanding of data management concepts, cloud platforms (preferably AWS or Azure), and scalable architectures.
Hands-on experience with programming languages such as Python or Java.
Practical experience with big data tools such as Apache Spark, Hadoop, Flink, or similar frameworks.
Working knowledge of relational SQL databases (MySQL, SQL Server, PostgreSQL) and NoSQL databases (e.g., MongoDB, DynamoDB).
Experience with data orchestration and pipeline tools such as Apache Airflow, Luigi, or comparable frameworks.
Familiarity with version control systems such as Git and collaboration tools like JIRA and Confluence.
Preferred:
Knowledge of containerization (Docker, Kubernetes) and infrastructure as code (Terraform, CloudFormation).
Experience in deploying and managing data pipelines on cloud services such as AWS Glue, Azure Data Factory, or GCP Dataflow.
Familiarity with stream processing tools like Kafka or Kinesis.
Exposure to data security protocols and compliance standards (GDPR, HIPAA, etc.).
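As an illustrative sketch of the ETL and pipeline skills listed above (not a company standard — the table name, schema, and cleaning rules here are hypothetical), a minimal extract-transform-load pass in Python using only the standard library:

```python
import csv
import io
import sqlite3


def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, normalize them, and load them into SQLite.

    Hypothetical target schema: orders(id INTEGER, customer TEXT, amount REAL).
    Returns the number of rows loaded.
    """
    # Extract: parse the raw CSV input into dictionaries keyed by header.
    rows = list(csv.DictReader(io.StringIO(csv_text)))

    # Transform: trim whitespace, normalize case, and cast types,
    # skipping rows with a missing or non-numeric amount.
    cleaned = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, TypeError, ValueError):
            continue
        cleaned.append((int(row["id"]), row["customer"].strip().title(), amount))

    # Load: create the target table if needed and insert the cleaned rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, customer TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)
```

In practice the same extract/transform/load shape would be expressed as Airflow tasks or Spark jobs against cloud storage rather than in-memory SQLite; the sketch only shows the structure of the work.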
Overall Responsibilities
Design, develop, and maintain large-scale data pipelines, ETL workflows, and data integrations to support analytics, reporting, and operational needs.
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver reliable solutions.
Optimize and monitor data pipelines for performance, scalability, and data quality.
Implement data governance, validation, and cataloging processes to ensure data integrity and security.
Automate deployment, testing, and data infrastructure changes using CI/CD practices.
Participate in architecture discussions, technical reviews, and documentation to support data ecosystem growth.
Stay informed of emerging data technologies, industry standards, and best practices, and incorporate relevant innovations.
Expected outcomes:
Reliable, scalable, secure, and high-performing data pipelines that support organizational analytics and business intelligence initiatives.
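The data-quality and validation duties above can be pictured with a minimal sketch (the field names and rules are hypothetical placeholders, not Synechron's actual checks): a gate that splits incoming records into valid rows and rule violations before they reach downstream consumers.

```python
def validate_records(records, required_fields=("id", "amount")):
    """Split records into valid rows and data-quality violations.

    A minimal data-quality gate: every record must carry the required
    fields (non-null), and 'amount' must be non-negative. Violations are
    returned with the record index so they can be logged or quarantined.
    """
    valid, errors = [], []
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if rec.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
        elif rec["amount"] < 0:
            errors.append((i, "negative amount"))
        else:
            valid.append(rec)
    return valid, errors
```

Production pipelines would typically express such rules declaratively in a validation framework and emit metrics for monitoring rather than returning lists, but the pass/quarantine split is the same.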
Technical Skills (By Category)
Programming Languages:
Essential: Python or Java
Preferred: Spark (PySpark, Spark with Scala); SQL for data manipulation
Databases/Data Management:
Essential: SQL database management (MySQL, PostgreSQL, SQL Server)
Preferred: NoSQL databases (MongoDB, DynamoDB)
Cloud Technologies:
Preferred: AWS (Glue, S3, EMR), Azure Data Factory, GCP Dataflow
Frameworks & Libraries:
Essential: Apache Spark, Kafka, Hadoop ecosystem components
Preferred: Dask, Flink
Development Tools & Methodologies:
Essential: Git, Jenkins, CI/CD pipelines, Agile/Scrum practices
Preferred: Terraform, Docker, Kubernetes, DataOps tools
Security & Compliance:
Awareness of data encryption, access controls, and compliance frameworks such as GDPR and HIPAA, as well as data masking best practices.
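To make the data masking mentioned above concrete, here is one common approach sketched in Python (the salt handling is deliberately simplified and the function is a hypothetical illustration, not a prescribed tool): deterministic pseudonymization of an identifier with a salted hash, so masked data stays joinable across tables without exposing the original value.

```python
import hashlib


def mask_email(email: str, salt: str = "per-environment-secret") -> str:
    """Pseudonymize an email address for non-production use.

    The local part is replaced by a salted SHA-256 digest (truncated for
    readability) while the domain is kept, so test data stays realistic.
    The hard-coded salt is a placeholder; real deployments would manage
    it as a secret outside the code.
    """
    local, _, domain = email.partition("@")
    digest = hashlib.sha256((salt + local).encode("utf-8")).hexdigest()[:12]
    return f"{digest}@{domain}"
```

Because the masking is deterministic for a given salt, the same input always maps to the same pseudonym, which preserves referential integrity in masked datasets.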
Experience Requirements
Minimum of 5 years developing and maintaining enterprise data pipelines and big data solutions.
Proven experience in designing scalable ETL workflows, integrating cloud data services, and optimizing data processes.
Demonstrable success in deploying data solutions that support reporting, analytics, and machine learning initiatives.
Industry experience in the finance, healthcare, retail, or enterprise sectors is highly desirable; relevant open-source or academic projects are also acceptable.
Day-to-Day Activities
Develop, test, and deploy scalable data pipelines and ETL workflows.
Collaborate with business and data science teams to gather requirements and deliver data solutions.
Monitor data pipelines and optimize for performance, reliability, and security.
Troubleshoot technical issues, perform root cause analysis, and apply fixes.
Automate deployment and infrastructure provisioning procedures.
Maintain detailed documentation of data architecture, workflows, and operational guidelines.
Proactively research emerging data tools and platforms to recommend innovation.
Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or related disciplines.
5 years of experience supporting enterprise data ecosystems, especially on cloud platforms.
Experience with big data frameworks, cloud data services, and automation tools.
Certifications in cloud platforms (AWS Data Analytics, Azure Data Engineer, GCP Data Engineer) are advantageous.
Strong problem-solving, analytical thinking, and communication skills.
Professional Competencies
Critical thinking to design innovative and scalable data architectures.
Leadership skills to mentor junior staff and guide data projects.
Effective stakeholder management for cross-team collaboration.
Adaptability to rapidly evolving data technologies and organizational needs.
Ownership of data quality, security, and compliance standards.
Time management skills to effectively prioritize tasks and meet project deadlines.
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity and inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. We strongly believe that a diverse workforce helps us build stronger, more successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.
Required Experience:
IC
About Company
At Synechron, we believe in the power of digital to transform businesses for the better. Our global consulting firm combines creativity with innovative technology to deliver leading digital solutions. Progressive technologies and strategies ...