Role Overview:
We are seeking a skilled Data Engineer with 3 years of experience on cloud platforms such as GCP or AWS. The ideal candidate will have strong proficiency in Python, experience with Kafka for real-time data streaming, and expertise in DBT for data transformation. This role involves working on complex data projects.
Key Responsibilities:
Design, develop, and deploy scalable data pipelines and ETL processes on cloud platforms (GCP/AWS).
Implement and maintain real-time data streaming solutions using Kafka.
Collaborate with cross-functional teams to understand data requirements and deliver robust data solutions.
Optimize and tune data pipelines for performance and reliability.
Ensure data quality and integrity throughout the data lifecycle.
Work on data modeling and schema design using DBT.
Provide technical guidance and support to junior team members as needed.
Required Skills and Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related field.
3 years of professional experience as a Data Engineer or in a similar role.
Proficiency in Python programming and experience with its data libraries.
Hands-on experience with cloud platforms such as Google Cloud Platform (GCP) or Amazon Web Services (AWS).
Strong knowledge of Kafka for real-time data streaming.
Experience with DBT (Data Build Tool) for data modeling and transformation.
Ability to work on complex data projects independently and as part of a team.
Excellent problem-solving and analytical skills.
Effective communication skills with the ability to collaborate across teams.
Preferred Qualifications:
Master's degree in Computer Science, Engineering, or a related field.
Certifications in relevant cloud technologies (GCP/AWS).
Employment Type: Full Time