We are an IT Solutions Integrator/Consulting Firm helping our clients hire the right professional for an exciting long-term project. Here are a few details.
We are looking for a skilled and versatile ETL Developer with strong expertise in ETL/ELT pipeline development, data design, cloud technologies (especially AWS), and automation practices (CI/CD). The ideal candidate will have experience working across all stages of the data lifecycle, from ingestion and transformation to orchestration and deployment, using tools such as IICS (Informatica Intelligent Cloud Services), Python/PySpark, and shell scripting.
Design, build, and maintain robust ETL/ELT data pipelines using IICS, Python, or PySpark to support large-scale data processing and analytics.
Collaborate with data architects and analysts to design scalable data models and processing solutions.
Develop and maintain shell scripts for task automation, job orchestration, and system monitoring.
Work closely with DevOps teams to implement CI/CD pipelines for data solutions, ensuring fast and reliable deployments.
Deploy and manage data workflows and infrastructure on AWS cloud services (e.g., S3, Lambda, Glue, EMR, Redshift, Athena).
Ensure data quality, integrity, and compliance through testing, validation, and monitoring frameworks.
Participate in performance tuning and optimization of ETL jobs and data processing applications.
Troubleshoot data pipeline failures and perform root cause analysis and resolution.
ETL/ELT Tools: Hands-on experience with Informatica IICS or similar platforms.
Programming: Strong proficiency in Python and/or PySpark for data transformation and processing.
Scripting: Advanced knowledge of Shell scripting in Unix/Linux environments.
Cloud: Experience working with AWS services such as S3, EC2, Glue, Redshift, and Lambda.
CI/CD: Familiarity with tools like Jenkins, GitLab CI, or AWS CodePipeline.
Data Modeling & Design: Ability to translate business requirements into scalable and efficient data architectures.
Strong problem-solving and communication skills, with the ability to collaborate across technical and business teams.
Exposure to data governance, metadata management, or data cataloging tools.
Knowledge of SQL tuning and performance optimization techniques.
Experience with monitoring tools (e.g., CloudWatch, DataDog).
Understanding of Agile/Scrum methodologies.
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
Education: B.E/
Full Time