Oracle Advanced Engineering - Glue, AWS EMR, Redshift, PySpark, S3, Airflow - Senior

Job Location

Mumbai - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY-Consulting AWS Staff-Senior

The opportunity

We are looking for a skilled AWS Data Engineer to join our growing data team. This role involves building and managing scalable data pipelines that ingest, process, and store data from various sources using modern AWS technologies. You will work with both batch and streaming data and contribute to a robust, scalable data architecture that supports analytics, BI, and data science use cases. As a problem-solver with a keen ability to diagnose a client's unique needs, you should be able to see the gap between where clients currently are and where they need to be, and be capable of creating a blueprint to help them achieve their end goal.

Key Responsibilities:

  • Design and implement data ingestion pipelines from various sources, including on-premises Oracle databases, batch files, and Confluent Kafka.
  • Develop Python producers and AWS Glue jobs for batch data processing.
  • Build and manage Spark streaming applications on Amazon EMR.
  • Architect and maintain Medallion Architecture-based data lakes on Amazon S3.
  • Develop and maintain data sinks in Redshift and Oracle.
  • Automate and orchestrate workflows using Apache Airflow.
  • Monitor, debug, and optimize data pipelines for performance and reliability.
  • Collaborate with cross-functional teams, including data analysts, data scientists, and DevOps.
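The Medallion (bronze/silver/gold) layering named in the responsibilities above can be sketched in plain Python. This is a minimal illustration only: the record shapes and field names are hypothetical, and in a real pipeline each layer would be a PySpark/Glue job writing to separate S3 prefixes rather than in-memory lists.

```python
# Stdlib-only sketch of medallion layering: bronze = raw, silver = cleaned,
# gold = aggregated for reporting. Field names here are hypothetical.
from collections import defaultdict

def to_silver(bronze_records):
    """Clean raw (bronze) records: drop malformed rows, normalize types."""
    silver = []
    for rec in bronze_records:
        if rec.get("order_id") is None or rec.get("amount") is None:
            continue  # a real pipeline would quarantine these rows instead
        silver.append({
            "order_id": str(rec["order_id"]),
            "region": str(rec.get("region", "unknown")).lower(),
            "amount": float(rec["amount"]),
        })
    return silver

def to_gold(silver_records):
    """Aggregate cleaned (silver) records into a reporting (gold) view."""
    totals = defaultdict(float)
    for rec in silver_records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

bronze = [
    {"order_id": 1, "region": "EMEA", "amount": "10.5"},
    {"order_id": 2, "region": "emea", "amount": 4.5},
    {"order_id": None, "amount": 99},  # malformed: dropped at the silver layer
]
print(to_gold(to_silver(bronze)))  # {'emea': 15.0}
```

The same shape carries over to PySpark: `to_silver` becomes a filter-and-cast transformation, `to_gold` a `groupBy().agg()` over the silver table.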

Required Skills and Experience:

  • Good programming skills in Python and Spark (PySpark).
  • Hands-on experience with Amazon S3, Glue, and EMR.
  • Good SQL knowledge of Amazon Redshift and Oracle.
  • Proven experience in handling streaming data with Kafka and building real-time pipelines.
  • Good understanding of data modeling, ETL frameworks, and performance tuning.
  • Experience with workflow orchestration tools like Airflow.
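The orchestration skill above boils down to running tasks in dependency order. A stdlib-only sketch of that core idea follows; the task names are hypothetical, and this is deliberately not Airflow's API, just the DAG-scheduling concept behind it.

```python
# Stdlib sketch of DAG-ordered task execution, the core idea behind
# orchestrators like Airflow. Task names and dependencies are hypothetical.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Run callables in an order respecting 'task -> set of upstream tasks'."""
    order = list(TopologicalSorter(deps).static_order())
    executed = []
    for name in order:
        tasks[name]()  # in Airflow this would be an operator/task execution
        executed.append(name)
    return executed

tasks = {
    "extract_oracle": lambda: None,  # e.g. a DMS or JDBC pull
    "glue_batch_job": lambda: None,  # e.g. an AWS Glue transform
    "load_redshift": lambda: None,   # e.g. a COPY into Redshift
}
deps = {
    "glue_batch_job": {"extract_oracle"},
    "load_redshift": {"glue_batch_job"},
}
print(run_pipeline(tasks, deps))
# ['extract_oracle', 'glue_batch_job', 'load_redshift']
```

In Airflow the same graph would be declared with operators and `>>` dependencies, with the scheduler handling retries, backfills, and parallelism on top of this ordering.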

Nice-to-Have Skills:

  • Infrastructure as Code using Terraform.
  • Experience with AWS services such as SNS, SQS, DynamoDB, DMS, Athena, and Lake Formation.
  • Familiarity with AWS DataSync for file movement and the medallion architecture for data lakes.
  • Monitoring and alerting using CloudWatch, Datadog, or Splunk.

Qualifications:

  • BTech / MTech / MCA / MBA

EY | Building a better working world



EY exists to build a better working world, helping to create long-term value for clients, people, and society, and to build trust in the capital markets.



Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate.



Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Employment Type

Part-Time
