Data Architect
Duration - 6 Months
Location - Melville NY
Description
- Experience architecting solutions on cloud data engineering platforms (e.g., GCP, AWS, Azure).
- Design end-to-end data architectures on GCP leveraging BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, Cloud Storage, and Looker.
- Develop modern data lake, data warehouse, and lakehouse architectures using best practices and well-architected frameworks in GCP; experience with AWS, Azure, and Databricks is preferred.
- Create logical and physical data models, data flow diagrams, integration patterns, and reference architectures.
- Architect scalable ETL/ELT pipelines using Dataflow (Apache Beam), Dataproc (Spark), and/or Cloud Composer orchestration.
- Lead cloud-native modernization initiatives, including migration from legacy platforms to GCP.
- Provide expert guidance to data engineering teams on standards, patterns, and reusable frameworks.
- Optimize BigQuery performance, including partitioning, clustering, materialized views, BI Engine, and storage optimizations.
- Implement data quality, metadata management, observability, and lineage frameworks using tools such as Dataplex and Data Catalog.
- Ensure CI/CD adoption using Cloud Build, GitHub/GitLab pipelines, and infrastructure as code (Terraform).
- Help define data governance standards, including access models, encryption, retention, and data lifecycle management.
- Implement IAM policies, VPC Service Controls, organizational policy constraints, and secure data sharing patterns.
- Ensure compliance with GDPR, HIPAA, PCI, and internal corporate policies.
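To illustrate the data quality responsibility above: on GCP these rules would typically be enforced with a Dataplex data quality scan, but the core idea can be sketched in plain Python. The column names, thresholds, and the `null_rate` / `check_null_rates` helpers below are illustrative assumptions, not part of any GCP API.

```python
# Minimal sketch of a column-level data quality check, similar in spirit
# to the null-rate rules a data quality framework might enforce.
# All table names, columns, and thresholds here are illustrative.

def null_rate(rows, column):
    """Fraction of rows where `column` is None or missing."""
    if not rows:
        return 0.0
    missing = sum(1 for row in rows if row.get(column) is None)
    return missing / len(rows)

def check_null_rates(rows, thresholds):
    """Return the columns whose null rate exceeds the allowed threshold."""
    return {
        col: rate
        for col, limit in thresholds.items()
        if (rate := null_rate(rows, col)) > limit
    }

rows = [
    {"order_id": 1, "customer_id": "a1", "amount": 10.00},
    {"order_id": 2, "customer_id": None, "amount": 5.50},
    {"order_id": 3, "customer_id": "c3", "amount": None},
    {"order_id": 4, "customer_id": None, "amount": 7.25},
]

# order_id must never be null; customer_id tolerates up to 25% nulls.
violations = check_null_rates(rows, {"order_id": 0.0, "customer_id": 0.25})
print(violations)  # {'customer_id': 0.5}
```

In a real deployment, the equivalent of `check_null_rates` would run as a scheduled scan over BigQuery tables, with violations surfaced through lineage and observability tooling rather than a return value.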