Data Engineer GCP

Job Location: Bengaluru Urban - India
Monthly Salary: INR 3500000 - 3500000
Experience Required: 1-3 years
Posted on: 3 hours ago
Vacancies: 1

Job Summary

AuxoAI is hiring a Data Engineer to join our growing data engineering team focused on building production-grade pipelines on Google Cloud Platform (GCP). This is a hands-on role ideal for someone early in their data career who's eager to learn fast, work with modern cloud-native tools, and support the development of scalable data systems.

You'll work alongside experienced engineers and analysts on real client projects across industries, helping implement ELT processes, BigQuery pipelines, orchestration workflows, and foundational MLOps capabilities.


Responsibilities:
  • Assist in developing scalable data pipelines using GCP tools such as Dataflow, BigQuery, Cloud Composer, and Pub/Sub

  • Write and maintain SQL and Python scripts for data ingestion, cleaning, and transformation

  • Support the creation and maintenance of Airflow DAGs in Cloud Composer for orchestration (see the illustrative sketch after this list)

  • Collaborate with senior data engineers and data scientists to implement data validation and monitoring checks

  • Participate in code reviews, sprint planning, and cross-functional team meetings

  • Help with documentation and knowledge base creation for data workflows and pipeline logic

  • Gain exposure to medallion architecture, data lake design, and performance tuning on BigQuery
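
For context on the orchestration work above, here is a minimal Cloud Composer (Airflow) DAG sketch. It is illustrative only; the DAG id, schedule, and task logic are placeholders and are not taken from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system (e.g. Pub/Sub or GCS).
    pass


def load_to_bigquery():
    # Placeholder: load cleaned records into a BigQuery staging table.
    pass


with DAG(
    dag_id="daily_ingest_example",  # placeholder name, assumed for illustration
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load_to_bigquery)

    # Run the extract step before the load step.
    extract_task >> load_task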



Requirements

  • 2-4 years of relevant experience in data engineering, backend development, or analytics engineering

  • Strong knowledge of SQL and working-level proficiency in Python

  • Exposure to cloud platforms (GCP preferred; AWS/Azure acceptable)

  • Familiarity with data pipeline concepts, version control (Git), and basic workflow orchestration

  • Strong communication and documentation skills

  • Eagerness to learn, take feedback, and grow under mentorship


Bonus Skills:

  • Hands-on experience with GCP tools like BigQuery, Dataflow, or Cloud Composer

  • Experience with dbt, Dataform, or Apache Beam

  • Exposure to CI/CD pipelines, Terraform, or containerization (Docker)

  • Knowledge of basic data modeling and schema design concepts




Required Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience
  • Proven experience as a DevOps Engineer or in a similar role, with hands-on expertise in AWS and Azure cloud environments
  • Strong proficiency in Azure DevOps, Git, GitHub, Jenkins, and CI/CD pipeline automation
  • Experience deploying and managing Kubernetes clusters (EKS, AKS) and container orchestration platforms
  • Deep understanding of cloud-native architectures, microservices, and serverless computing
  • Familiarity with Azure Synapse, ADF, ADLS, and AWS data services (EMR, Redshift, Glue) for data integration and analytics
  • Solid grasp of infrastructure as code (IaC) tools such as Terraform, CloudFormation, or ARM templates
  • Experience with monitoring tools (e.g. Prometheus, Grafana) and logging solutions for cloud-based applications
  • Excellent troubleshooting skills and the ability to resolve complex technical issues in production environments


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala