Role: GCP Data Engineer.
Location: Dallas, TX (Onsite).
Duration: Long-Term Contract.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines on Google Cloud Platform (GCP).
- Develop and optimize data workflows using Cloud Composer (Apache Airflow); see the sketch after this list.
- Implement data ingestion and transformation processes using BigQuery, Cloud Storage (GCS), and Cloud Functions.
- Write efficient and complex SQL queries for data analysis, transformation, and reporting.
- Develop reusable and modular Python scripts for data processing and automation.
- Collaborate with data analysts and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all stages of the pipeline.
- Monitor and troubleshoot data workflows and infrastructure performance.
- Document technical solutions and maintain best practices for GCP data engineering.
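For context, a minimal sketch of the kind of Cloud Composer (Airflow) workflow these responsibilities describe: a daily DAG loading files from GCS into BigQuery via the standard apache-airflow-providers-google operator. The DAG ID, bucket, object path, and table names are hypothetical placeholders, not part of this role description.

```python
# Minimal sketch: Cloud Composer (Airflow) DAG loading GCS files into BigQuery.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",      # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",                 # hypothetical bucket
        source_objects=["events/*.json"],                # hypothetical path
        destination_project_dataset_table="example_ds.events",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",                # append each daily run
        autodetect=True,                                 # infer schema from data
    )
```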
Required Skills & Qualifications:
- Strong hands-on experience with GCP services: BigQuery, GCS, Cloud Composer, Cloud Functions, and Airflow.
- Proficiency in SQL for querying and manipulating large datasets (see the sketch after this list).
- Intermediate to advanced Python programming skills for data engineering tasks.
- Experience with orchestration tools such as Apache Airflow.
- Familiarity with CI/CD pipelines and version control (e.g., Git).
- Understanding of data modeling, ETL/ELT processes, and cloud-native architecture.
- Excellent problem-solving and communication skills.
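As a concrete illustration of the SQL-plus-Python combination listed above, a hedged sketch using the google-cloud-bigquery client to run a parameterized aggregation query. The dataset, table, and column names are hypothetical, and the client assumes application-default credentials are configured.

```python
# Sketch: running a parameterized BigQuery SQL query from Python with the
# google-cloud-bigquery client. Dataset, table, and columns are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # picks up application-default credentials

query = """
    SELECT user_id, COUNT(*) AS event_count
    FROM `example_ds.events`              -- hypothetical table
    WHERE event_date >= @start_date       -- bound below as a query parameter
    GROUP BY user_id
    ORDER BY event_count DESC
    LIMIT 100
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

# result() blocks until the query job finishes, then yields rows.
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.event_count)
```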