AuxoAI is seeking a Senior GenAI Data Engineer with strong fundamentals in data engineering and end-to-end solution design. In this role you will design and develop production-grade pipelines, leverage GenAI tools (Copilot, Claude, Gemini) to boost development productivity, and define engineering best practices across complex data environments.
This is a highly collaborative, cross-functional role, ideal for someone who thrives at the intersection of data engineering excellence and GenAI-powered innovation.
Responsibilities:
Architect and develop end-to-end data pipelines, from ingestion through transformation to consumption
Lead solutioning and integration for complex data workflows (batch and streaming)
Use AI-assisted coding tools (e.g., GitHub Copilot, Claude, Gemini) to accelerate code development, refactoring, and debugging
Implement robust data quality, testing, lineage, and governance frameworks
Drive best practices across pipeline performance, reusability, and scalability
Mentor junior engineers and contribute to capability building within the data team
Requirements:
6 years of experience in data engineering with expertise in:
End-to-end pipeline development (batch and streaming)
Data modeling (dimensional, Data Vault, OBT)
ETL/ELT design patterns, performance tuning, and optimization
SQL (Advanced) and Python (Advanced)
Apache Spark for large-scale data processing
Proficiency using AI coding tools (e.g., Copilot, Claude, Gemini) to enhance productivity and code quality
Strong understanding of data quality frameworks, unit testing, and CI/CD for data workflows
Preferred Qualifications:
Experience with Google Cloud Platform services
Exposure to finance or sales data domains
Familiarity with Databricks, Delta Lake, or Apache Iceberg
GCP Professional Data Engineer certification is a plus
What We Offer:
Opportunity to work on modern data platforms with GenAI integration
Access to professional development support and cloud certification sponsorship
Competitive compensation and flexible work arrangements
A fast-paced, high-impact environment where innovation is valued
Required Skills:
Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience
Proven experience as a DevOps Engineer or similar role, with hands-on expertise in AWS and Azure cloud environments
Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation
Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms
Deep understanding of cloud-native architectures, microservices, and serverless computing
Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics
Solid grasp of infrastructure-as-code (IaC) tools such as Terraform, CloudFormation, or ARM templates
Experience with monitoring tools (e.g., Prometheus, Grafana) and logging solutions for cloud-based applications
Excellent troubleshooting skills and the ability to resolve complex technical issues in production environments