Position: GCP/API - Data Engineer
Location: Danbury, CT (Onsite)
Contract
Skills:
Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery.
Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.
Optimize BigQuery performance through partitioning, clustering, and query tuning.
Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.
Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.
Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).
Proficiency in SQL, Python, and Java for data processing and automation.
Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.
Strong understanding of data modeling, warehousing, and distributed computing.
Experience with real-time and batch processing architectures.
Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).
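To give a concrete feel for the batch-transform work the skills above describe, here is a minimal sketch in plain Python of the kind of validate-and-aggregate step a Dataflow/Beam or Spark job might perform before loading rows into a warehouse table. All field names and the record shape are hypothetical, not taken from the posting; a real pipeline would express this as Beam transforms or BigQuery SQL.

```python
# Illustrative only: a tiny batch ETL step of the kind a Dataflow/Beam or
# Spark job might run. Field names ("user_id", "event_type") are assumptions.

def transform_events(raw_events):
    """Filter malformed records, then aggregate event counts per user."""
    counts = {}
    for event in raw_events:
        # Basic validation: drop records missing required fields -- a common
        # early step in any ingestion pipeline.
        if not event.get("user_id") or "event_type" not in event:
            continue
        key = (event["user_id"], event["event_type"])
        counts[key] = counts.get(key, 0) + 1
    # Emit rows shaped for a warehouse load (e.g., a BigQuery table).
    return [
        {"user_id": u, "event_type": t, "count": c}
        for (u, t), c in sorted(counts.items())
    ]

if __name__ == "__main__":
    sample = [
        {"user_id": "u1", "event_type": "click"},
        {"user_id": "u1", "event_type": "click"},
        {"user_id": "u2", "event_type": "view"},
        {"event_type": "click"},  # malformed: no user_id, dropped
    ]
    print(transform_events(sample))
```

In Beam the same logic would typically become a `ParDo` for validation followed by a `CombinePerKey` for the count; the pure-Python version above just makes the data flow easy to read.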
Strong API Skills:
- Strong Core Java & Spring Boot
- Apigee & API security patterns
- Swagger/OpenAPI design
- Microservice architecture and patterns
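As a hedged illustration of the "Swagger/OpenAPI design" skill above, here is a minimal OpenAPI 3.0 document for a single hypothetical endpoint, built as a plain Python dict. The API title, path, and parameter names are invented for illustration; in practice the spec would live in YAML and be authored with Swagger tooling or generated from Spring Boot annotations.

```python
import json

# Hypothetical minimal OpenAPI 3.0 spec for one endpoint. All names
# ("Orders API", /orders/{orderId}) are assumptions for illustration.
openapi_spec = {
    "openapi": "3.0.3",
    "info": {"title": "Orders API", "version": "1.0.0"},
    "paths": {
        "/orders/{orderId}": {
            "get": {
                "summary": "Fetch a single order",
                "parameters": [{
                    "name": "orderId",
                    "in": "path",
                    "required": True,
                    "schema": {"type": "string"},
                }],
                "responses": {
                    "200": {"description": "The requested order"},
                    "404": {"description": "Order not found"},
                },
            }
        }
    },
}

if __name__ == "__main__":
    # Serialize as JSON; equally valid as YAML in real Swagger workflows.
    print(json.dumps(openapi_spec, indent=2))
```

Designing the contract first in this form is what lets Apigee policies, client SDKs, and Spring Boot controllers all share one source of truth.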
Preferred Qualifications:
GCP certifications (e.g., Professional Data Engineer, Associate Cloud Engineer).
Exposure to Kafka, NiFi, or other streaming technologies.
Experience with containerization and orchestration (Docker, Kubernetes, GKE).