
Lead Data Engineer

Job Location

Newark - USA

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Job Classification:

Technology Engineering & Cloud

Lead Data Engineer (The Prudential Insurance Company of America, Newark, NJ):

  • Design and develop highly scalable and efficient data architectures that align with business requirements and support analytical use cases.
  • Provide technical expertise to the data engineering team, offering guidance on best practices, coding standards, and architectural designs.
  • Design and implement robust ETL/ELT pipelines leveraging AWS Cloud, Big Data technologies (Spark and Python), and Snowflake for seamless data ingestion, transformation, and loading from diverse sources.
  • Demonstrate expertise in AWS data services, including S3, Glue, Redshift, EMR, Kinesis, and Lambda, and showcase strong knowledge of Snowflake services such as external stage, security, assisted engineering, and the Snowflake marketplace.
  • Create automations and CI/CD pipelines using tools like Bitbucket, Jenkins, and CloudFormation to streamline development and deployment processes.
  • Optimize data storage, query performance, and costs on AWS and Snowflake resources to enhance operational efficiency.
  • Ensure the implementation of data security best practices, encryption measures, and access controls to meet industry standards and regulations.
  • Collaborate with data scientists, analysts, and other stakeholders to understand business requirements and deliver actionable insights.
  • Maintain comprehensive documentation of data engineering processes, data models, and infrastructure configurations to ensure transparency and ease of knowledge transfer.
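
For illustration only (this sketch is not part of the employer's job description): a minimal PySpark example of the kind of ETL/ELT flow described above, reading raw data from S3, applying a simple transformation, and loading the result into Snowflake via the Spark-Snowflake connector. Bucket names, table names, column names, and connection options are hypothetical placeholders.

```python
# Hypothetical ETL sketch: S3 -> PySpark transform -> Snowflake.
# All paths, table names, and connection options below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy_etl_sketch").getOrCreate()

# Extract: raw records previously landed in S3 (placeholder bucket/prefix)
raw = spark.read.parquet("s3://example-raw-bucket/policies/")

# Transform: deduplicate, normalize a date column, and aggregate
cleaned = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("issue_date", F.to_date("issue_date"))
)
summary = cleaned.groupBy("product_line").agg(
    F.count("policy_id").alias("policy_count"),
    F.sum("annual_premium").alias("total_premium"),
)

# Load: write the curated table to Snowflake using the Spark-Snowflake connector
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",  # placeholder account
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
    "sfUser": "etl_user",
    "sfPassword": "********",  # use a secrets manager in practice
}
(
    summary.write.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "POLICY_SUMMARY")
    .mode("overwrite")
    .save()
)
```

In a production setting the same flow would typically be parameterized, scheduled by an orchestrator, and deployed through the CI/CD tooling named above.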

Telecommuting permitted up to 3 day(s) per week.

Full-time employment, Monday through Friday, 40 hours per week.

MINIMUM REQUIREMENTS:

Must have a Bachelor's degree or foreign equivalent in Computer Science, Information Technology, or a related field, and 5 years of progressive, post-baccalaureate, related work experience. Alternatively, the employer will accept a Master's degree or foreign equivalent in Computer Science, Information Technology, or a related field, and 3 years of related work experience.


Must have 3 years of experience in each of the following:

  • Extract, Transform, Load (ETL) experience;
  • Python;
  • Spark, Hadoop, Hive, and PySpark;
  • Databases, including SQL Server, Snowflake, and Redshift;
  • Job orchestration, including Step Functions and Airflow (a brief illustrative sketch follows this list);
  • Version control tools such as Bitbucket, GitHub, or GitLab;
  • CI/CD pipelines, specifically Jenkins or Flyway pipelines; and
  • Infrastructure as Code, including CloudFormation, CDK, or Terraform.
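
As a second hypothetical illustration, tied to the job orchestration item above: a minimal Airflow DAG chaining daily extract, transform, and load tasks. The DAG id, task names, and callables are placeholders, not the employer's actual jobs (assumes Airflow 2.x).

```python
# Hypothetical daily ETL orchestration sketch (Airflow 2.x).
# DAG id, task names, and the work inside each callable are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # e.g., pull raw source files from S3
    print("extract step")


def transform():
    # e.g., submit a Spark transformation job on EMR
    print("transform step")


def load():
    # e.g., load curated tables into Snowflake
    print("load step")


with DAG(
    dag_id="daily_policy_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run the tasks strictly in sequence
    t_extract >> t_transform >> t_load
```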


TO APPLY: Please click the Apply button. Should you have any difficulty applying for this position through our website, please contact us for assistance with the application process.

What we offer you:

Eligibility to participate in a discretionary annual incentive program is subject to the rules governing the program, whereby an award, if any, depends on various factors including, without limitation, individual and organizational performance. To find out more about our Total Rewards package, visit Work Life Balance Prudential Careers. Some of the above benefits may not apply to part-time employees scheduled to work less than 20 hours per week.

Employment Type

Full Time
