AWS + Snowflake + DBT

Employer Active

1 Vacancy

Job Location

Bengaluru, India

Monthly Salary drjobs

Not Disclosed

Vacancy

1 Vacancy

Job Description

Industry/Sector

Not Applicable

Specialism

Data Analytics & AI

Management Level

Senior Associate

Job Description & Summary

At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth.

In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration and data transformation solutions.

At PwC, we connect people with diverse backgrounds and skill sets to solve important problems together and lead with purpose, for our clients, our communities and the world at large. It is no surprise, therefore, that 429 of the Fortune Global 500 companies engage with PwC.

Acceleration Centers (ACs) are PwC's diverse global talent hubs focused on enabling growth for the organization and value creation for our clients. The PwC Advisory Acceleration Center in Bangalore is part of our Advisory business in the US. The team is focused on developing a broader portfolio with solutions for Risk Consulting, Management Consulting, Technology Consulting, Strategy Consulting and Forensics, as well as vertical-specific solutions.

PwC's high-performance culture is based on a passion for excellence, with a focus on diversity and inclusion. You will collaborate with and receive support from a network of people to achieve your goals. We will also provide you with global leadership development frameworks and the latest in digital technologies to learn and excel in your career. At the core of our firm's philosophy is a simple construct: we care for our people.

Globally, PwC is ranked as the 3rd most attractive employer, according to Universum. Our commitment to Responsible Business Leadership, Diversity & Inclusion, work-life flexibility, career coaching and learning & development makes our firm one of the best places to work, learn and excel.

Apply to us if you believe PwC is the place to be. Now and in the future!

JOB OVERVIEW

At PwC AC, as an AWS Developer the candidate will interact with the Offshore Manager / Onsite Business Analyst to understand the requirements, and will be responsible for the end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake or Data Hub in AWS.

Years of Experience: Candidates with 2-4 years of hands-on experience.

Position Requirements:

Must Have:

  • Experience in architecting and delivering highly scalable, distributed, cloud-based enterprise data solutions

  • Strong expertise in the end-to-end implementation of cloud data engineering solutions such as an Enterprise Data Lake or Data Hub in AWS

  • Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe), ETL data pipelines and Big Data modeling techniques using Python / Java

  • Experience in loading disparate data sets and translating complex functional and technical requirements into detailed design

  • Should be aware of deploying Snowflake features such as data sharing, events and lake-house patterns

  • Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake dimensional modeling)

  • Strong AWS hands-on expertise with a programming background preferably Python/Scala

  • Good knowledge of Big Data frameworks and related technologies - Experience in Hadoop and Spark is mandatory

  • Strong experience in AWS compute services like AWS EMR, Glue and SageMaker, and storage services like S3, Redshift & DynamoDB

  • Good experience with any one of the AWS streaming services, like AWS Kinesis, AWS SQS and AWS MSK

  • Troubleshooting and performance tuning experience in the Spark framework - Spark Core, Spark SQL and Spark Streaming

  • Experience in one of the workflow tools like Airflow, NiFi or Luigi

  • Good knowledge of application DevOps tools (Git, CI/CD frameworks) - experience in Jenkins or GitLab, with rich experience in source code management tools like CodePipeline, CodeBuild and CodeCommit

  • Experience with AWS CloudWatch, AWS CloudTrail, AWS Config and AWS Config Rules

  • Strong understanding of cloud data migration processes, methods and the project lifecycle

  • Good analytical & problem-solving skills

  • Good communication and presentation skills
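The "star and snowflake dimensional modeling" requirement above can be illustrated with a minimal, self-contained sketch using SQLite: one fact table referencing two dimension tables, queried with a typical star-schema aggregation. All table names, column names and data here are hypothetical, for illustration only — they are not taken from the posting.

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimensions.
# All table/column names and rows are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    product_id  INTEGER REFERENCES dim_product(product_id),
    amount      REAL
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme", "APAC"), (2, "Globex", "EMEA")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(10, "Widget", "Hardware"), (20, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, 250.0), (101, 1, 20, 100.0), (102, 2, 10, 75.0)])

# Typical star-schema query: aggregate the fact table by a dimension attribute.
cur.execute("""
SELECT c.region, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer c ON f.customer_id = c.customer_id
GROUP BY c.region
ORDER BY c.region
""")
totals = dict(cur.fetchall())
print(totals)  # {'APAC': 350.0, 'EMEA': 75.0}
```

The same star-vs-snowflake distinction mentioned in the bullet would show up here as whether `dim_product` keeps `category` inline (star) or normalizes it into its own table (snowflake).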

Desired Knowledge / Skills:

  • Experience in building stream-processing systems using solutions such as Storm or Spark Streaming

  • Experience in Big Data ML toolkits such as Mahout, SparkML or H2O

  • Knowledge in Python

  • Worked in Offshore / Onsite Engagements

  • Experience in AWS services like Step Functions & Lambda
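The Step Functions & Lambda skill listed above can be sketched with a minimal Python handler of the kind a state machine might invoke. The event shape, field names and function body here are hypothetical assumptions for illustration, not a real PwC or AWS workload.

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda-style handler sketch.

    Assumes a hypothetical event of the form
    {"records": [{"amount": ...}, ...]} and returns a small summary,
    mimicking a transformation step a Step Functions state machine
    might invoke. Field names are illustrative only.
    """
    records = event.get("records", [])
    total = sum(r.get("amount", 0) for r in records)
    return {
        "statusCode": 200,
        "body": json.dumps({"count": len(records), "total": total}),
    }

# Local invocation with a sample event; context is unused here, so None is fine.
result = lambda_handler({"records": [{"amount": 40.0}, {"amount": 2.0}]}, None)
print(result["body"])  # {"count": 2, "total": 42.0}
```

The `(event, context)` signature matches the standard AWS Lambda Python handler contract, which is what lets the same function be unit-tested locally, as above, before deployment.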

Professional and Educational Background:

  • BE / MCA / M.E / MBA

Additional Information:

  • Travel Requirements: Travel to client locations may be required, as per project needs.

  • Line of Service: Advisory

  • Horizontal: Technology Consulting

  • Designation: Associate

  • Location: Bangalore, India

Travel Requirements

Not Specified

Job Posting End Date

Employment Type

Full-Time

Company Industry
