Calix provides the cloud software platforms, systems, and services required for communications service providers to simplify their businesses, excite their subscribers, and grow their value.
The Cloud Platform Engineering team is responsible for the platforms, tools, and CI/CD pipelines at Calix.
We are looking for a GCP Cloud Platform Engineer to design, implement, and manage cloud infrastructure and data pipelines using Google Cloud Platform (GCP) services such as Datastream, Dataflow, Apache Flink, Apache Spark, and Dataproc. The ideal candidate will have a strong background in DevOps practices, cloud infrastructure automation, and big data technologies. You will collaborate with data engineers, developers, and operations teams to ensure seamless deployment, monitoring, and optimization of data solutions.
Responsibilities:
- Design and implement cloud infrastructure using Infrastructure as Code (IaC) tools such as Terraform.
- Automate provisioning and management of Dataproc clusters, Dataflow jobs, and other GCP resources.
- Build and maintain CI/CD pipelines for deploying data pipelines, streaming applications, and cloud infrastructure.
- Integrate tools like GitLab CI/CD or Cloud Build for automated testing and deployment.
- Deploy and manage real-time and batch data pipelines using Dataflow, Datastream, and Apache Flink.
- Ensure seamless integration of data pipelines with other GCP services like BigQuery, Cloud Storage, and Kafka or Pub/Sub.
- Implement monitoring and alerting solutions using Cloud Monitoring, Cloud Logging, and Prometheus.
- Monitor the performance, reliability, and cost of Dataproc clusters, Dataflow jobs, and streaming applications.
- Optimize cloud infrastructure and data pipelines for performance, scalability, and cost-efficiency.
- Implement security best practices for GCP resources, including IAM policies, encryption, and network security.
- Ensure observability is an integral part of the infrastructure platforms, providing adequate visibility into their health, utilization, and cost.
- Collaborate extensively with cross-functional teams to understand their requirements; educate them through documentation and training, and improve adoption of the platforms and tools.
Education and Experience:
- 10 years of overall experience in DevOps, cloud engineering, or data engineering.
- 3 years of experience in DevOps, cloud engineering, or data engineering.
- Proficiency in Google Cloud Platform (GCP) services, including Dataflow, Datastream, Dataproc, BigQuery, and Cloud Storage.
- Strong experience with Apache Spark and Apache Flink for distributed data processing.
- Knowledge of real-time data streaming technologies (e.g., Apache Kafka, Pub/Sub).
- Familiarity with data orchestration tools like Apache Airflow or Cloud Composer.
- Expertise in Infrastructure as Code (IaC) tools like Terraform or Cloud Deployment Manager.
- Experience with CI/CD tools like Jenkins, GitLab CI/CD, or Cloud Build.
- Knowledge of containerization and orchestration tools like Docker and Kubernetes.
- Strong scripting skills for automation (e.g., Bash, Python).
- Experience with monitoring tools like Cloud Monitoring, Prometheus, and Grafana.
- Familiarity with logging tools like Cloud Logging or the ELK Stack.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced, agile environment.
Location:
- India (flexible hybrid work model: work from the Bangalore office 20 days per quarter)
Required Experience:
Staff IC