Candidates must be visa-independent and local to PA.
Job Title: Data Engineer
Location: PA (Hybrid)
Job Summary:
We are looking for a skilled Data Engineer with strong experience building scalable ETL pipelines and working with AWS data services. The ideal candidate will handle large-scale data integration, ensure data quality, and support analytics needs.
Key Skills:
- AWS: Glue, S3, Lambda, Athena, IAM, SNS, SQS, SageMaker
- Python, SQL, PySpark
- Data Modeling
- GitHub, Jira, Confluence
Responsibilities:
- Develop and maintain ETL pipelines for real-time and batch processing
- Integrate and process large datasets from multiple sources
- Ensure data accuracy, consistency, and integrity
- Translate business requirements into technical solutions
- Write optimized queries, programs, and reports
- Collaborate with stakeholders and support data-driven decisions
Requirements:
- Strong hands-on experience with the AWS data ecosystem
- Solid programming skills in Python, SQL, and PySpark
- Experience with data warehousing and modeling
- Excellent problem-solving and communication skills