About Infinitive:
Infinitive is a data and AI consultancy that enables its clients to modernize, monetize, and operationalize their data to create lasting and substantial value. We possess deep industry and technology expertise to drive and sustain adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable high return on investment.
Infinitive has been named one of the Best Small Firms to Work For by Consulting Magazine 7 times, most recently in 2024. Infinitive has also been named a Washington Post Top Workplace, a Washington Business Journal Best Place to Work, and a Virginia Business Best Place to Work.
About the Role:
We are seeking a skilled DevOps Engineer with data engineering experience to join our dynamic team. The ideal candidate will have expertise in ElasticSearch, CI/CD, Git, and Infrastructure as Code (IaC), along with experience in data engineering. You will be responsible for designing, automating, and optimizing infrastructure, deployment pipelines, and data workflows. This role requires close collaboration with data engineers, software developers, and operations teams to build scalable, secure, and high-performance data platforms.
Key Responsibilities:
DevOps & Infrastructure Management:
- Design, deploy, and manage ElasticSearch clusters, ensuring high availability, scalability, and performance for search and analytics workloads.
- Develop and maintain CI/CD pipelines for automating build, test, and deployment processes using tools like Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
- Manage and optimize version control workflows using Git, ensuring best practices for branching, merging, and release management.
- Implement Infrastructure as Code (IaC) solutions using Terraform, CloudFormation, or Ansible for cloud and on-prem infrastructure.
- Automate system monitoring, alerting, and incident response using tools such as Prometheus, Grafana, the Elastic Stack (ELK), or Datadog.
Data Engineering & Pipeline Automation:
- Collaborate with data engineering teams to design and deploy scalable ETL/ELT pipelines using Apache Kafka, Apache Spark, Kinesis, Pub/Sub, Dataflow, Dataproc, or AWS Glue.
- Optimize data storage and retrieval for large-scale analytics and search workloads using ElasticSearch, BigQuery, Snowflake, Redshift, or ClickHouse.
- Ensure data pipeline reliability and performance by implementing monitoring, logging, and alerting for data workflows.
- Automate data workflows and infrastructure scaling for high-throughput real-time and batch processing environments.
- Implement data security best practices, including access controls, encryption, and compliance with industry standards such as GDPR, HIPAA, or SOC 2.
Required Skills & Qualifications:
- 3 years of experience in DevOps, Data Engineering, or Infrastructure Engineering.
- Strong expertise in ElasticSearch, including cluster tuning, indexing strategies, and scaling.
- Hands-on experience with CI/CD pipelines using Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
- Proficiency in Git for version control, branching strategies, and code collaboration.
- Experience with Infrastructure as Code (IaC) using Terraform, CloudFormation, Ansible, or Pulumi.
- Solid experience with cloud platforms (AWS, GCP, or Azure) and cloud-native data engineering tools.
- Proficiency in Python, Bash, or Scala for automation, data processing, and infrastructure scripting.
- Hands-on experience with containerization and orchestration (Docker, Kubernetes, Helm).
- Experience with data engineering tools, including Apache Kafka, Spark Streaming, Kinesis, Pub/Sub, or Dataflow.
- Strong understanding of ETL/ELT workflows and distributed data processing frameworks.
Preferred Qualifications:
- Experience working with data warehouses and data lakes (BigQuery, Snowflake, Redshift, ClickHouse, S3, GCS).
- Knowledge of monitoring and logging solutions for data-intensive applications.
- Familiarity with security best practices for data storage, transmission, and processing.
- Understanding of event-driven architectures and real-time data processing frameworks.
- Certifications such as AWS Certified DevOps Engineer, Google Cloud Professional Data Engineer, or Certified Kubernetes Administrator (CKA).
Required Experience:
Manager