Job Title: SR. AWS DATA ENGINEER
Location: Reston, VA
Duration: 12 Months
Visa: USC, GC, H1B, and EAD
Contract Type: W2
Job Description:
This position is for an AWS Data Engineer.
5-7 years of experience with AWS data engineering, ETL/ELT pipelines, and cloud-native data processing.
Looking for proficiency in the following areas:
- AWS Data Services (Glue, PySpark, Lambda, Step Functions, S3, DynamoDB, Redshift, RDS, CloudWatch, IAM)
- Building and maintaining ETL/ELT data pipelines using AWS Glue, PySpark, Lambda, and orchestration frameworks
- Python scripting for ETL transformations, automation, and data quality checks
- Experience with distributed data processing (Spark/PySpark)
- Experience integrating data from APIs, AWS services, and external systems
- Familiarity with monitoring and logging tools (CloudWatch, Splunk)
- Experience working with commercial off-the-shelf (COTS) or enterprise platforms as data sources
Other Requirements:
- Exposure to Agile methodology
- Good communication, documentation, and teamwork skills
- Ability to support off-hours production releases and provide on-call support for critical data pipelines
Education:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field
- Relevant professional certifications