AuxoAI is seeking a Senior Data Engineer to lead the design, development, and optimization of modern data pipelines and cloud-native platforms on Google Cloud Platform (GCP). This role is ideal for someone with deep experience building scalable batch and streaming data workflows, strong hands-on engineering skills, and a drive to mentor junior engineers.
You'll work closely with cross-functional teams to build production-grade pipelines using tools like BigQuery, Dataflow, Pub/Sub, Cloud Composer, and Dataform, enabling high-quality data delivery and analytics at scale.
Responsibilities:
Build and optimize batch and streaming data pipelines using Apache Beam (Dataflow); see the sketch after this list
Design and maintain BigQuery datasets using best practices in partitioning, clustering, and materialized views
Develop and manage Airflow DAGs in Cloud Composer for workflow orchestration
Implement SQL-based transformations using Dataform (or dbt)
Leverage Pub/Sub for event-driven ingestion and Cloud Storage for raw/lake layer data architecture
Drive engineering best practices across CI/CD, testing, monitoring, and pipeline observability
Partner with solution architects and product teams to translate data requirements into technical designs
Mentor junior data engineers and support knowledge-sharing across the team
Contribute to documentation, code reviews, sprint planning, and agile ceremonies
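As a rough illustration of the streaming work described above, here is a minimal Apache Beam sketch that reads events from Pub/Sub and appends them to BigQuery. The project, subscription, and table names are hypothetical, and runner configuration, error handling, and schema management are omitted:

# Minimal Beam streaming sketch; all resource names below are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # Pass --runner=DataflowRunner (plus project/region flags) to run on Dataflow.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(json.loads)  # Pub/Sub payloads arrive as bytes
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )

if __name__ == "__main__":
    run()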
Requirements:
5 years of hands-on experience in data engineering, with at least 2 years on GCP
Proven expertise in BigQuery, Dataflow (Apache Beam), and Cloud Composer (Airflow); see the DAG sketch after this list
Strong programming skills in Python and/or Java
Experience with SQL optimization, data modeling, and pipeline orchestration
Familiarity with Git, CI/CD pipelines, and data quality monitoring frameworks
Exposure to Dataform, dbt, or similar tools for ELT workflows
Solid understanding of data architecture, schema design, and performance tuning
Excellent problem-solving and collaboration skills
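To make the orchestration expectation concrete, here is a minimal Airflow DAG sketch of the kind that runs in Cloud Composer, scheduling a BigQuery transformation that also exercises partitioning and clustering. The DAG id, dataset, and query are hypothetical placeholders:

# Minimal Cloud Composer (Airflow 2) DAG sketch; all names are hypothetical.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_transform",
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Rebuild a date-partitioned, clustered summary table from raw events.
    aggregate_events = BigQueryInsertJobOperator(
        task_id="aggregate_events",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE analytics.daily_events "
                    "PARTITION BY event_date CLUSTER BY user_id AS "
                    "SELECT DATE(ts) AS event_date, user_id, COUNT(*) AS events "
                    "FROM analytics.raw_events GROUP BY 1, 2"
                ),
                "useLegacySql": False,
            }
        },
    )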
Bonus Skills:
GCP Professional Data Engineer certification
Experience with Vertex AI, Cloud Functions, Dataproc, or real-time streaming architectures (see the sketch after this list)
Familiarity with data governance tools (e.g., Atlan, Collibra, Dataplex)
Exposure to Docker/Kubernetes, API integration, and infrastructure-as-code (Terraform)
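For the Cloud Functions item above, a minimal sketch of an event-driven Python function triggered by a Pub/Sub message via the Functions Framework; the function name and payload handling are illustrative assumptions:

# Minimal Pub/Sub-triggered Cloud Function sketch; names are illustrative.
import base64
import functions_framework

@functions_framework.cloud_event
def handle_event(cloud_event):
    # Pub/Sub delivers the payload base64-encoded inside the CloudEvent envelope.
    payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
    print(f"Received event: {payload}")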
Required Skills:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience
Proven experience as a DevOps Engineer or in a similar role, with hands-on expertise in AWS and Azure cloud environments
Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation
Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms
Deep understanding of cloud-native architectures, microservices, and serverless computing
Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics
Solid grasp of infrastructure-as-code (IaC) tools like Terraform, CloudFormation, or ARM templates
Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications
Excellent troubleshooting skills and the ability to resolve complex technical issues in production environments