Full Time: Permanent
Remote-first, with 2 days at the office per month (Oxford Street, London)
Overview
We are seeking a highly experienced DevOps Engineer with a strong background in Google Cloud Platform (GCP) and a proven track record of delivering complex data analytics projects for clients. In this full-time, permanent role you will be responsible for designing, implementing, and managing the infrastructure and deployment processes that drive successful client engagements. You will work as part of a consultancy team, ensuring that each client engagement benefits from a robust, scalable, and secure cloud environment.
Responsibilities
- Design and implement scalable, reliable GCP infrastructure tailored to each client's unique project requirements, ensuring high performance, availability, and security.
- Work closely with client stakeholders, full-stack developers, data engineers, and data scientists to define and execute efficient data ingestion, processing, and storage solutions that meet project deliverables.
- Implement and automate client-specific deployment processes using CI/CD pipelines and configuration management tools, enabling rapid and reliable software releases in a consultancy environment.
- Develop processes around release management, testing, and automation to ensure successful project delivery, adhering to client timelines and quality standards.
- Implement and manage real-time and batch data processing frameworks (e.g. Apache Kafka, Apache Spark, Google Cloud Dataproc) in line with project needs.
- Build and maintain robust monitoring, logging, and alerting systems for client projects, ensuring system health and performance are continuously optimised and cost-efficient.
- Ensure each client's project complies with data privacy regulations by implementing appropriate access controls and data encryption measures.
- Troubleshoot and resolve complex technical challenges related to infrastructure, data pipelines, and overall application performance during client engagements.
- Stay up to date on industry trends and best practices in DevOps, data engineering, and cloud technologies, with a particular focus on GCP, to provide cutting-edge solutions to our clients.
Experience & Qualifications
- Proven experience as a DevOps Engineer/Consultant with a history of successful client project delivery.
- Extensive hands-on experience with GCP services such as BigQuery, Cloud Storage, Dataflow, Pub/Sub, Dataproc, and Cloud Composer.
- Strong programming and scripting skills in languages like Python, Bash, or Go to automate tasks and build necessary tools.
- Expertise in designing and optimising data pipelines using frameworks like Apache Airflow or equivalent.
- Demonstrated experience with real-time and batch data processing frameworks, including Apache Kafka, Apache Spark, or Google Cloud Dataflow.
- Proficiency in CI/CD tools such as Jenkins, GitLab CI/CD, or Cloud Build, along with a strong command of version control systems like Git.
- Solid understanding of data privacy regulations and experience implementing robust security measures.
- Familiarity with infrastructure-as-code tools such as Terraform or Deployment Manager.
- Excellent problem-solving and analytical skills, with the ability to architect and troubleshoot complex systems across diverse client projects.
- Strong communication skills, enabling effective collaboration with both technical and non-technical client stakeholders.
Why BIPROCSI
We started this company with one goal: to be the very best. We don't just believe it; we know our team is our biggest asset. We're a group of passionate innovators (*nerds) obsessed with personal growth who believe in challenging the status quo to ensure we come up with the best solutions.
We have a phenomenal culture and unparalleled drive, and every single person on our team is very carefully selected to make sure we maintain this. We are diverse, and we celebrate that. We are whole people with families, hobbies, and lives outside of work, and we make sure we have a healthy work-life balance.
We are rapidly expanding and on a growth trajectory. We are continuously hiring at all levels across Business Intelligence, Analytics, Data Warehousing, Data Science, and Data Engineering.
Our Mission Statement
To be the benchmark for Excellence and Quality of Service in everything we do.
For more information, please visit our website -