AWS Data Engineer
with DevOps Experience
The CDO EDP Foundation platform team at Prudential Financial is looking for a Data Engineer to join a diverse team dedicated to providing a best-in-class data platform to our customers, stakeholders, and partners.
Prudential's Chief Data Office is focused on building a centralized Enterprise Data Platform (EDP) in AWS to drive meaningful insights from data in a fast, secure, and reliable manner. The Enterprise Data Platform will be an integrated repository with aggregated data in a consistent structure. The platform will enable one-click onboarding of on-prem data, one-click onboarding of metadata from other consumer (EDP and non-EDP) accounts, and one-click data sharing with multiple consumer accounts; it will streamline accessibility, increase reusability, and minimize data redundancy while remaining secure and auditable.
Qualifications
Bachelor's degree in Computer Science, Software Engineering, MIS, or an equivalent combination of education and experience
8 years of experience as a Data Engineer on the AWS stack, with hands-on experience with DevOps tools
AWS Solutions Architect or AWS Developer Certification required
Solid experience with AWS services such as CloudFormation, S3, Athena, Glue, Glue DataBrew, EMR/Spark, RDS, Redshift, DataSync, DMS, DynamoDB, Lambda, Step Functions, IAM, KMS, Secrets Manager, EventBridge, EC2, SQS, SNS, Lake Formation, CloudWatch, and CloudTrail
Implement high-velocity streaming solutions and orchestration using Amazon Kinesis, Amazon Managed Workflows for Apache Airflow (MWAA), and Amazon Managed Streaming for Apache Kafka (MSK; preferred); see the Kinesis producer sketch after this list
Solid experience building data lake and data warehouse solutions on AWS
Analyze, design, develop, and implement data ingestion pipelines in AWS
Knowledge of implementing ETL/ELT for data solutions
End-to-end data solutions (ingestion, storage, integration, processing, access) on AWS
Knowledge of implementing RBAC strategies and solutions using AWS IAM and the Redshift RBAC model
Build & implement CI/CD pipelines for EDP Platform using CloudFormation and Jenkins
Programming experience with Python, shell scripting, and SQL
Knowledge of analyzing data using SQL and stored procedures
Build automated data pipelines to ingest data from relational database systems, file systems, and NAS shares into AWS relational databases such as Amazon RDS, Aurora, and Redshift
Build automated data pipelines to ingest data from REST APIs into the AWS data lake (S3) and relational databases such as Amazon RDS, Aurora, and Redshift (see the REST-to-S3 sketch after this list)
Good experience with DevOps practices, including:
Experience with modern source code management and software repository systems (Bitbucket)
Experience with programming languages (Python)
Experience with scripting languages (Shell, Groovy)
Experience with API deployment for tooling integration
Experience using Jenkins/CloudBees (Pipeline as Code, Shared Libraries)
Ability to document exceptions, issues, action plans, meeting minutes, and lessons learned accurately and in a timely fashion
Experience administering DevOps tools delivered as SaaS
Experience using DevOps tools (SonarQube, Artifactory, etc.)
Experience using build tools (Maven, MSBuild, and Gradle)
Experience using containers (Docker)
Experience using the Atlassian suite (Jira, Confluence)
Experience with Infrastructure as Code using CloudFormation
Experience creating Jenkins CI pipelines that integrate SonarQube/security scans and test automation scripts
Work as part of the DevOps, QA, and AWS teams focusing on building CI/CD pipelines
Responsible for writing and maintaining Jenkins Pipelines
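As a rough illustration of the streaming qualification above, here is a minimal Python sketch of a Kinesis producer. The stream name, event shape, and partition-key field are all hypothetical, and a production producer would add batching (put_records), retries, and monitoring on top of this.

    # Sketch: publish JSON events to a Kinesis data stream (hypothetical names).
    import json

    import boto3

    kinesis = boto3.client("kinesis")

    def publish_event(stream_name: str, event: dict) -> None:
        """Send one event; records sharing a PartitionKey land on the same shard."""
        kinesis.put_record(
            StreamName=stream_name,
            Data=json.dumps(event).encode("utf-8"),
            PartitionKey=str(event["order_id"]),  # hypothetical key field
        )

    if __name__ == "__main__":
        publish_event("edp-orders-stream", {"order_id": 42, "amount": 9.99})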
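For the REST-API ingestion qualification, a minimal sketch of one ingestion step that lands records as JSON Lines in an S3 raw zone. The endpoint, bucket, and prefix are hypothetical; a real pipeline would add pagination, retries, schema validation, and credential handling (e.g. via Secrets Manager).

    # Sketch: pull records from a REST API and land them in the S3 data lake.
    import json
    from datetime import datetime, timezone

    import boto3
    import requests

    API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint
    BUCKET = "edp-raw-zone"                        # hypothetical bucket
    PREFIX = "orders/ingest_date="

    def ingest_once() -> str:
        """Pull one page of records and land them as JSON Lines in S3."""
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        records = resp.json()

        # Partition by ingest date so Athena/Glue can prune on it later.
        today = datetime.now(timezone.utc).strftime("%Y-%m-%d")
        key = f"{PREFIX}{today}/orders.jsonl"
        body = "\n".join(json.dumps(r) for r in records)

        boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
        return f"s3://{BUCKET}/{key}"

    if __name__ == "__main__":
        print("landed:", ingest_once())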
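And for the CI/CD-with-CloudFormation qualification, a sketch of the deploy step a Jenkins stage might call: create the stack if it does not exist, otherwise update it, then wait for completion. The stack name and template path are hypothetical.

    # Sketch: idempotent CloudFormation deploy suitable for a CI stage.
    import boto3
    from botocore.exceptions import ClientError

    cfn = boto3.client("cloudformation")

    def deploy_stack(stack_name: str, template_path: str) -> None:
        """Create the stack if absent, otherwise update it, then wait."""
        with open(template_path) as f:
            body = f.read()
        try:
            cfn.create_stack(StackName=stack_name, TemplateBody=body,
                             Capabilities=["CAPABILITY_NAMED_IAM"])
            waiter = cfn.get_waiter("stack_create_complete")
        except ClientError as err:
            if err.response["Error"]["Code"] != "AlreadyExistsException":
                raise
            try:
                cfn.update_stack(StackName=stack_name, TemplateBody=body,
                                 Capabilities=["CAPABILITY_NAMED_IAM"])
            except ClientError as err2:
                # CloudFormation reports "no changes" as an error; treat it as a no-op.
                if "No updates are to be performed" in str(err2):
                    return
                raise
            waiter = cfn.get_waiter("stack_update_complete")
        waiter.wait(StackName=stack_name)

    if __name__ == "__main__":
        deploy_stack("edp-ingestion-dev", "templates/ingestion.yaml")  # hypothetical names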
Responsibilities:
Design, build, and maintain efficient, reusable, and reliable code
Ensure the best possible performance and quality of high-scale data applications and services
Participate in system design discussions
Independently perform hands-on development and unit testing of applications
Collaborate with the development team and build individual components into the enterprise data platform
Work in a team environment with product, QE/QA, and cross-functional teams to deliver a project throughout the whole software development life cycle
Identify and resolve any performance issues
Keep up to date with new technology developments and implementations
Participate in code reviews to ensure standards and best practices are met
Project management: Agile developers take responsibility for estimating, planning, and managing all tasks, and report on progress
Software quality: Agile developers are responsible for the quality of the software they produce; the team owns the quality of its work rather than handing code off to a separate, independent group for testing
Teamwork: Collaborate with all other team members, taking shared responsibility for the overall effort
Understanding user needs: Interact with users as necessary to clarify requirements