Data Engineer (AWS, Python)
Role Description:
We are seeking an experienced ETL/Data Engineer with strong cloud expertise to design, develop, and maintain data pipelines. The ideal candidate will have hands-on experience with AWS services, large-scale data processing, and deployment pipelines.
Responsibilities:
Data Pipeline Development:
- Design and implement data pipelines on cloud platforms (AWS preferred).
- Handle large volumes of data from multiple sources.
- Perform data cleansing, data validation, and transformation (illustrative PySpark sketch below).
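As a rough illustration of the cleansing/validation work described above, here is a minimal PySpark sketch. The bucket paths, column names, and 5% rejection threshold are hypothetical placeholders, not details from this posting.

```python
# Minimal PySpark cleansing/validation sketch; paths, columns, and
# thresholds are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleansing").getOrCreate()

raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                          # remove duplicate records
       .filter(F.col("order_id").isNotNull())                 # drop rows missing the key
       .withColumn("amount", F.col("amount").cast("double"))  # enforce numeric type
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
)

# Simple validation gate: fail fast if too many rows were rejected.
total = raw.count()
rejected = total - cleaned.count()
if total > 0 and rejected > 0.05 * total:
    raise ValueError(f"Rejected {rejected} of {total} rows, above the 5% threshold")

cleaned.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```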
ETL & Cloud Development:
- Hands-on ETL development using Python and SQL.
- Utilize AWS services such as Glue, Glue Crawlers, Lambda, Redshift, Athena, S3, EC2, and IAM.
- Implement monitoring and logging mechanisms using AWS CloudWatch and set up alerts (see the alarm sketch below).
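One way the CloudWatch alerting above might look in practice is sketched below with boto3: the job publishes a custom failure metric and an alarm notifies an SNS topic when failures appear. The namespace, metric name, region, and SNS topic ARN are assumptions for illustration only.

```python
# Hedged boto3 sketch: emit a custom pipeline metric and alarm on failures.
# Namespace, metric name, region, and the SNS topic ARN are hypothetical.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Publish a failure count from the ETL job (e.g. called in an except block).
cloudwatch.put_metric_data(
    Namespace="ExampleETL",
    MetricData=[{"MetricName": "FailedRecords", "Value": 12, "Unit": "Count"}],
)

# Alarm that notifies an SNS topic when failures appear in a 5-minute window.
cloudwatch.put_metric_alarm(
    AlarmName="example-etl-failed-records",
    Namespace="ExampleETL",
    MetricName="FailedRecords",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:example-alerts"],
    TreatMissingData="notBreaching",
)
```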
Deployment & CI/CD:
- Deploy solutions on the cloud.
- Integrate CI/CD pipelines to build artifacts and deploy changes across environments (deployment step sketch below).
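As one illustrative deployment step a CI/CD pipeline might run, the sketch below uploads a freshly built artifact and points an existing Lambda function at it using boto3. The bucket, key, local artifact path, and function name are hypothetical.

```python
# Hedged sketch of one CI/CD deployment step: upload a built artifact and
# point an existing Lambda function at it. Names and paths are hypothetical.
import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

bucket, key = "example-artifacts", "builds/etl_handler-1.4.2.zip"

# Upload the build artifact produced by the CI stage.
s3.upload_file("dist/etl_handler.zip", bucket, key)

# Deploy: update the function code and publish an immutable version.
response = lambda_client.update_function_code(
    FunctionName="example-etl-handler",
    S3Bucket=bucket,
    S3Key=key,
    Publish=True,
)
print("Deployed version:", response["Version"])
```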
Scheduling & Orchestration:
- Work with scheduling frameworks like Airflow and AWS Step Functions to manage workflows (example DAG sketch below).
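For the orchestration responsibility above, a minimal Airflow DAG sketch is shown below. The dag_id, schedule, and task bodies are placeholders standing in for real extract/transform/load steps.

```python
# Minimal Airflow DAG sketch; dag_id, schedule, and task bodies are
# hypothetical placeholders for the actual pipeline steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    print("pull data from source systems")


def transform(**_):
    print("cleanse and validate the extracted data")


def load(**_):
    print("write curated data to the warehouse")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```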
Collaboration & Communication:
- Communicate effectively with stakeholders and work collaboratively with cross-functional teams.
Skills Required:
- Cloud Computing: Amazon Web Services (AWS)
- Databases: Microsoft SQL Server 2019
- Big Data & Analytics: PySpark
- Programming: Python, SQL
- Workflow Orchestration: Airflow, AWS Step Functions
- Monitoring & Logging: AWS CloudWatch
Experience: 6-8 years