Job Title: GCP Data Architect
Location: Toronto, ON
Work Model: Hybrid (4 days onsite per week)
Employment Type: Full-Time / Contract
We are actively hiring a Google Cloud Platform (GCP) Data Architect to design, implement, and manage scalable, secure, and high-performing data solutions on GCP. The ideal candidate will have strong experience building modern data platforms leveraging services such as BigQuery, Dataflow, Dataproc, and Cloud Storage.
Key Responsibilities
- Partner with business and functional teams to gather requirements and evaluate integration impacts on the overall enterprise architecture.
- Define and document technical architecture, clarify solution requirements, and address design gaps.
- Develop and implement solutions aligned with established architectural standards and quality frameworks.
- Establish development standards, reusable templates, and best practices to ensure consistency and scalability.
- Automate routine development and operational activities using scripts and reusable frameworks.
- Design comprehensive end-to-end data architectures on GCP, focusing on scalability, performance, reliability, and cost optimization.
- Define and implement best practices for data governance, metadata management, security, and compliance.
Data Management, Governance & Big Data ETL/ELT Architecture
- Design and optimize data ingestion and transformation pipelines using GCP tools such as Dataflow, Dataproc, BigQuery, Pub/Sub, and Cloud Composer.
- Ensure efficient processing and storage of large-scale structured and unstructured datasets.
- Architect and manage Databricks environments for advanced analytics and machine learning use cases.
- Optimize Spark-based processing workloads for performance and cost efficiency.
Required Qualifications
- Demonstrated experience as a Cloud Architect, preferably within GCP environments.
- Strong understanding of data architecture principles and governance models.
- Hands-on expertise with big data technologies, including Spark, Hadoop, Kafka, and Databricks.
- Proven experience designing and optimizing ETL/ELT pipelines.
- In-depth knowledge of core GCP services, including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, IAM, and Cloud Composer.
- Strong analytical, communication, and stakeholder engagement skills.
Preferred Qualifications
- Google Cloud Professional Cloud Architect certification.
- Experience with hybrid or multi-cloud environments.
- Familiarity with data privacy and compliance standards (e.g., GDPR, HIPAA).
- Understanding of DevOps methodologies and CI/CD practices for data engineering solutions.