Project: the aim you'll have
Join a new strategic data-transformation project where we're moving analytics from on-premise to GCP and building our data architecture and data model from the ground up, with a strong focus on business value creation and customer experience (CX).
We work with technologies like GCP, Spark, Python, Kubernetes, BigQuery, Vertex AI, Terraform, and Looker. We integrate diverse high-volume data sources, design streaming and batch processing layers, implement data governance, lineage, data quality, and data security, and set up CI/CD and monitoring/SLOs to shorten the path from question to answer for our business and to create a solid foundation for AI/LLM-driven solutions.
Position: how you'll contribute
- Collaborate with data engineering, analytics, and operations teams to streamline data applications, including big data and operational workflows.
- Provide documentation of infrastructure, processes, and compliance controls.
- Monitor infrastructure health, performance, and security, and resolve issues promptly.
- Conduct regular reviews and audits of systems to ensure ongoing compliance and drive remediation as needed.
- Implement and enforce privacy and security requirements in line with organizational and regulatory standards.
- Lead the technical implementation of access controls, encryption, data retention, and security monitoring.
- Automate and document recurring operational and compliance procedures to ensure reliability and transparency.
Qualifications:
Expectations: the experience you need
- 3 years of experience as a DevOps Engineer, with hands-on expertise in data management in GCP, Kubernetes, and Spark.
- Prior experience supporting data infrastructure or analytics platforms.
- Experience with Infrastructure-as-Code tools.
- Skilled in scripting languages for automation tasks.
- Familiarity with cloud monitoring tools.
- Strong understanding of networking, security, and cloud infrastructure best practices.
- Excellent problem-solving skills, a proactive mindset, and strong communication abilities.
Additional Information:
Our offer: professional development, personal growth
- Flexible employment and remote work
- International projects with leading global clients
- International business trips
- Non-corporate atmosphere
- Language classes
- Internal & external training
- Private healthcare and insurance
- Multisport card
- Well-being initiatives
Remote Work:
Yes
Employment Type:
Full-time