AuxoAI is hiring a Data Architect (GCP) to lead enterprise data platform design, architecture modernization, and solution delivery across global clients. In this client-facing role, you will architect scalable data platforms using GCP-native services, guide onshore/offshore data engineering teams, and define best practices across the ingestion, transformation, governance, and consumption layers.
This role is ideal for someone who combines deep GCP platform expertise with leadership experience and is confident working with both engineering teams and executive stakeholders.
Responsibilities:
Design and implement enterprise-scale data architectures using GCP services, with BigQuery as the central analytics platform
Lead end-to-end implementation of medallion architecture (Raw, Processed, Curated) patterns (a minimal sketch follows this list)
Oversee data ingestion pipelines using Cloud Composer, Dataflow (Apache Beam), Pub/Sub, and Cloud Storage (see the streaming sketch after this list)
Implement scalable ELT workflows using Dataform and modular SQLX transformations
Optimize BigQuery workloads through advanced partitioning, clustering, and materialized views (see the DDL sketch after this list)
Lead architectural reviews, platform standardization, and stakeholder engagements across engineering and business teams
Implement data governance frameworks leveraging tools like Atlan, Collibra, and Dataplex
Collaborate with ML teams to support Vertex AI-based pipeline design and model deployment
Enable downstream consumption through Power BI, Looker, and optimized data marts
Drive adoption of Infrastructure-as-Code (Terraform) and promote reusable architecture templates
Manage a distributed team of data engineers; set standards, review code, and ensure platform stability
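For illustration, here is a minimal Python sketch of the medallion promotion step, assuming the google-cloud-bigquery client and hypothetical project, dataset, and column names; in practice this transform would typically live in Dataform or a Composer DAG rather than a standalone script:

    from google.cloud import bigquery

    client = bigquery.Client(project="demo-project")  # hypothetical project ID

    # Promote raw events into the processed layer: deduplicate, drop bad
    # records, and cast string fields to typed columns.
    sql = """
    CREATE OR REPLACE TABLE processed.orders AS
    SELECT DISTINCT
      order_id,
      customer_id,
      TIMESTAMP(order_ts) AS order_ts,
      CAST(amount AS NUMERIC) AS amount
    FROM raw.orders
    WHERE order_id IS NOT NULL
    """
    client.query(sql).result()  # blocks until the transform job completes

The Curated layer would apply business logic on top of Processed tables in the same way.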
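The streaming sketch referenced above: an Apache Beam (Python) pipeline that lands Pub/Sub events in a raw BigQuery table. The topic and table names are hypothetical placeholders, and a real deployment would add error handling and schema management:

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Streaming mode is required for the unbounded Pub/Sub source.
    opts = PipelineOptions(streaming=True)

    with beam.Pipeline(options=opts) as p:
        (
            p
            # Hypothetical topic; messages arrive as raw bytes.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/demo-project/topics/orders")
            # Decode each message into a dict matching the target schema.
            | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append into the Raw layer of the medallion layout.
            | "WriteRaw" >> beam.io.WriteToBigQuery(
                "demo-project:raw.orders",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )

Run locally with the DirectRunner for testing, or pass --runner=DataflowRunner to execute on Dataflow.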
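And the DDL sketch for partition and cluster design, again with illustrative names, issued here through the Python client:

    from google.cloud import bigquery

    client = bigquery.Client(project="demo-project")  # hypothetical project ID

    # Partitioning prunes scanned bytes (and cost) for date-bounded queries;
    # clustering co-locates rows for selective customer_id filters.
    ddl = """
    CREATE TABLE IF NOT EXISTS curated.orders_mart (
      order_id STRING,
      customer_id STRING,
      order_ts TIMESTAMP,
      amount NUMERIC
    )
    PARTITION BY DATE(order_ts)
    CLUSTER BY customer_id
    """
    client.query(ddl).result()

Queries that filter on DATE(order_ts) then read only the matching partitions, which is typically where the largest BigQuery cost savings come from.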
Requirements:
10 years of experience in data architecture and engineering
4 years of hands-on GCP experience including BigQuery, Dataflow, Cloud Composer, Dataform, and Cloud Storage
Deep understanding of streaming and batch data patterns, event-driven ingestion, and modern warehouse design
Proven leadership of cross-functional distributed teams in client-facing roles
Strong programming skills in Python and SQL
Experience working with data catalog tools (Atlan, Collibra), Dataplex, and enterprise source connectors
Excellent communication and stakeholder management skills
Preferred Qualifications:
GCP Professional Data Engineer or Cloud Architect certification
Experience with Vertex AI Model Registry, Feature Store, or ML pipeline integration
Familiarity with AlloyDB, Cloud Spanner, Firestore, and enterprise integration tools (e.g., Salesforce, SAP, Oracle)
Background in legacy platform migration (Oracle, Azure SQL Server)
Required Skills:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience
Proven experience as a DevOps Engineer or similar role, with hands-on expertise in AWS and Azure cloud environments
Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation
Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms
Deep understanding of cloud-native architectures, microservices, and serverless computing
Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics
Solid grasp of infrastructure-as-code (IaC) tools like Terraform, CloudFormation, or ARM templates
Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications
Excellent troubleshooting skills and the ability to resolve complex technical issues in production environments