We're seeking a DataOps Engineer who thrives at the intersection of data engineering, DevOps, and workflow orchestration. You'll be instrumental in designing, automating, and optimizing the data pipelines that power analytics, machine learning, and operational intelligence across the organization.
If you're passionate about building resilient data systems, streamlining deployments, and enabling data teams to move faster with confidence, this role is for you.
Responsibilities
- Administer a large-scale MongoDB cluster using a combination of Bash, Python, and Linux administration skills.
- Work with data engineers to design and maintain scalable, automated data pipelines using tools like Apache Airflow, dbt, and Terraform.
- Create abstractions around data workflows that enable development teams to build data products in a self-service manner.
- Implement CI/CD workflows for data infrastructure and analytics code.
- Monitor and optimize data workflows for performance, reliability, and cost-efficiency.
- Integrate cloud-native services (e.g., S3, Redshift, BigQuery, Databricks) into unified data workflows.
- Develop disaster recovery strategies and backup automation for critical data assets.
- Champion DataOps best practices across teams, including version control, testing, and observability.
- Participate in the team's emergency on-call rotation to ensure 24/7 uptime of our systems.
Qualifications:
- 3 years of experience in data engineering, DevOps, or cloud infrastructure roles.
- Proficiency in automating administrative workflows using Bash and Python, with an emphasis on writing clean, maintainable code.
- Intermediate knowledge of Linux system administration.
- Basic proficiency writing queries in both relational (SQL) and NoSQL paradigms.
- Experience administering Big Data querying engines like Hadoop, Apache Spark, or Google BigQuery.
- Experience with data orchestration tools (Airflow, Prefect, Dagster).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and infrastructure-as-code (Terraform, CloudFormation).
- Strong understanding of data lake and data warehouse architectures.
- Experience working with containers.
Additional Information:
Headquartered in Mountain View, California, with over 220 team members across the United States and Europe, DNAnexus is experiencing rapid growth and market adoption. With the support of leading investors, including Google Ventures and Blackstone, and trusted by hundreds of the world's biomedical leaders, the company is at the forefront of innovation with its precision health data cloud, driving scientific breakthroughs. If you are interested in joining our team, please apply today!
Remote Work:
Yes
Employment Type:
Full-time