Job Summary
Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure a robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.
Software Requirements
Required:
- Proficiency in data engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x)
- Extensive experience with Google Cloud Platform (GCP) offerings, including Dataflow, BigQuery, Cloud Storage, and Pub/Sub (see the pipeline sketch after this list)
- Familiarity with Git, Jira, and Confluence for version control and collaboration
Preferred:
- Experience with additional GCP services such as Dataproc, Data Studio, or Cloud Composer
- Exposure to other programming languages such as Java or Scala
- Knowledge of data security best practices and tools
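As a rough illustration of the tooling above, here is a minimal Apache Beam (Python) sketch of a streaming pipeline that reads JSON messages from Pub/Sub and appends them to BigQuery; submitted with the DataflowRunner, it is the kind of job this role would own. The project, subscription, and table names are placeholders, and the target table is assumed to already exist with a schema matching the incoming messages.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # streaming=True enables unbounded reads; passing
        # --runner=DataflowRunner (plus project/region/temp_location flags)
        # would execute this on Dataflow instead of the local runner.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    # Placeholder subscription path.
                    subscription="projects/my-project/subscriptions/events-sub"
                )
                # Messages arrive as bytes; parse each one into a dict
                # whose keys are assumed to match the BigQuery schema.
                | "ParseJson" >> beam.Map(json.loads)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:my_dataset.events",  # placeholder table
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    # CREATE_NEVER assumes the table already exists,
                    # so no schema needs to be supplied here.
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )

    if __name__ == "__main__":
        run()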
Overall Responsibilities
- Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs
- Collaborate with cross-functional teams to translate business requirements into technical solutions
- Build and maintain data models, ensuring data quality, integrity, and security
- Participate actively in code reviews, adhering to best practices and standards
- Develop automated and efficient data workflows to improve system performance
- Stay updated with emerging data engineering trends and continuously improve technical skills
- Provide technical guidance and support to team members, fostering a collaborative environment
- Ensure work is delivered on time and aligned with project milestones
Technical Skills (By Category)
Programming Languages:
- Essential: Python
- Preferred: Java, Scala
Data Management & Databases:
- Experience with Hive, BigQuery, and relational databases
- Knowledge of data warehousing concepts and proficiency in SQL
Cloud Technologies:
- Extensive hands-on experience with GCP services, including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Cloud Composer
- Ability to build and optimize data pipelines leveraging GCP offerings
Frameworks & Libraries:
- Spark (PySpark preferred); Hadoop ecosystem experience is advantageous (see the PySpark sketch at the end of this section)
Development Tools & Methodologies:
- Agile/Scrum methodologies; version control with Git; project tracking via JIRA; documentation on Confluence
Security Protocols:
- Understanding of data security, privacy, and compliance standards
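To make the Spark and BigQuery expectations above concrete, the sketch below reads a BigQuery table with PySpark, aggregates it, and writes the result back. It assumes the spark-bigquery-connector is on the classpath (as it is on Dataproc clusters by default); the table names and the event_ts column are placeholders.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Assumes the spark-bigquery-connector jar is available,
    # e.g. when the job is submitted to a Dataproc cluster.
    spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

    # Read a BigQuery table into a DataFrame (placeholder table name).
    events = (
        spark.read.format("bigquery")
        .option("table", "my-project.my_dataset.events")
        .load()
    )

    # Aggregate: count events per day, assuming a timestamp
    # column named event_ts exists in the source table.
    daily_counts = events.groupBy(
        F.to_date("event_ts").alias("event_date")
    ).count()

    # Write the result back to BigQuery via the direct write method.
    (
        daily_counts.write.format("bigquery")
        .option("table", "my-project.my_dataset.daily_event_counts")
        .option("writeMethod", "direct")
        .mode("append")
        .save()
    )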
Experience Requirements
- 6-8 years of experience in data or software engineering roles, with a focus on data pipeline development
- Proven experience designing and implementing data solutions on cloud platforms, particularly GCP
- Prior experience working in Agile teams, participating in code reviews, and delivering end-to-end data projects
- Experience working with cross-disciplinary teams and understanding varied stakeholder requirements
- Exposure to industry best practices for data security, governance, and quality assurance is desirable
Day-to-Day Activities
- Attend daily stand-up meetings and contribute to project planning sessions
- Collaborate with business analysts, data scientists, and other stakeholders to understand data needs
- Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability
- Perform regular code reviews, provide constructive feedback, and uphold coding standards
- Document technical solutions and maintain clear records of data workflows
- Troubleshoot and resolve technical issues in data processing environments
- Participate in continuous learning initiatives to stay abreast of technological developments
- Support team members by sharing knowledge and resolving technical challenges
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory
- Demonstrable experience in data engineering and cloud technologies
Professional Competencies
- Strong analytical and problem-solving skills with a focus on outcome-driven solutions
- Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders
- Ability to work independently with minimal supervision and manage multiple priorities effectively
- Adaptability to evolving technologies and project requirements
- Demonstrated initiative in driving tasks forward and a continuous-improvement mindset
- Strong organizational skills with a focus on quality and attention to detail
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative, Same Difference, is committed to fostering an inclusive culture, promoting equality and diversity, and creating an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants from diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.
Candidate Application Notice