About us:
We are proudly owned by Auto & General Insurance Company. The A&G group is made up of trusted and well-known brands such as iSelect, Compare the Market and Budget Direct. In late 2024, iSelect and Compare the Market formed a new Aggregation business within the A&G group, and now support millions of Australians in comparing and buying their personal finance and household products, such as insurance, energy and loans.
iSelect
At iSelect, we're passionate about making Australians' lives easier by saving them time, money and effort. With over 20 years' experience, we are a one-stop destination for comparing and purchasing insurance, utilities and personal finance products. We compare a wide range of brands, and our trained comparison experts help customers compare, select and save more, putting them in control.
Compare the Market
At CTM, our Noble Purpose is to make a difference in the lives of Australians by simplifying the process of making sound financial decisions. This Noble Purpose is our cultural north star, guiding our direction, prioritisation and decision making.
How you fit:
The DataOps Engineer is responsible for automating, monitoring and optimising data pipelines and infrastructure to ensure efficient, scalable and reliable data workflows. This role focuses on DataOps best practices, CI/CD automation, cloud infrastructure, observability and security, enabling seamless data delivery across the organisation.
The ideal candidate has expertise in DevOps, cloud data platforms, infrastructure automation and data pipeline orchestration. They will work closely with Data Engineers, Data Scientists, Platform Engineers and Security teams to improve data availability, governance and operational efficiency.
The primary objectives of the role are:
- Streamlining data pipeline deployment through CI/CD
- Ensuring observability, performance and security in cloud data environments
- Automating infrastructure management and data workflow orchestration
- Driving best practices for DataOps, governance and cost optimisation
What you do:
Data Pipeline Automation & Orchestration
- Design, implement and maintain CI/CD pipelines for automated data workflow deployment.
- Develop infrastructure-as-code (IaC) templates for provisioning cloud-based data environments and resources.
- Automate ETL/ELT workflow management using orchestration tools such as Apache Airflow, Prefect or AWS Step Functions (a brief sketch follows this list).
- Optimise data ingestion, transformation and processing pipelines for scalability and efficiency.
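To give a feel for the day-to-day work, here is a minimal sketch of the kind of workflow this role automates, assuming Apache Airflow 2.x and its TaskFlow API; the DAG id, task logic and sample data are hypothetical illustrations, not an existing iSelect or CTM pipeline.

```python
# A minimal daily ETL DAG, assuming Apache Airflow 2.x (TaskFlow API).
# The DAG id, task logic and sample data are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2},  # retry transient failures automatically
    tags=["dataops", "example"],
)
def example_daily_etl():
    @task
    def extract() -> list[dict]:
        # Placeholder: a real pipeline would pull from a source API or database.
        return [{"policy_id": 1, "premium_monthly": 120.0}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Keep transformations small and testable.
        return [{**r, "premium_annual": r["premium_monthly"] * 12} for r in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to the warehouse (e.g. Snowflake or BigQuery).
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


example_daily_etl()
```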
Cloud Infrastructure & Security
- Manage data infrastructure in AWS, Azure or GCP, ensuring high availability and security.
- Implement role-based access control (RBAC), encryption and data security best practices (see the example after this list).
- Collaborate with Security teams to ensure compliance with industry regulations (GDPR, HIPAA, SOC 2).
- Monitor cloud costs and optimise resource usage for storage, compute and networking.
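As one concrete illustration of the encryption bullet above, a short sketch using boto3 to enforce default server-side encryption on an S3 bucket, assuming AWS; the bucket name is a hypothetical placeholder, and in practice this would usually be codified in Terraform rather than a one-off script.

```python
# Sketch: enforce default server-side encryption on an S3 bucket with boto3.
# "example-data-lake" is a hypothetical bucket name.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_encryption(
    Bucket="example-data-lake",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    # Omitting KMSMasterKeyID falls back to the account's
                    # AWS-managed KMS key for S3.
                    "SSEAlgorithm": "aws:kms",
                }
            }
        ]
    },
)

# Read the configuration back to confirm it was applied.
resp = s3.get_bucket_encryption(Bucket="example-data-lake")
print(resp["ServerSideEncryptionConfiguration"]["Rules"])
```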
Monitoring, Observability & Incident Response
- Develop real-time monitoring dashboards and alerts for data pipeline performance and failures.
- Use observability tools such as Datadog, Prometheus, Grafana or CloudWatch to track latency, errors and throughput (a brief sketch follows this list).
- Establish incident response workflows and automated rollback strategies for failures.
- Work with Site Reliability Engineers (SREs) and DevOps teams to maintain high system uptime and resilience.
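A brief sketch of the telemetry these dashboards sit on, assuming AWS CloudWatch custom metrics via boto3; the namespace, metric name and run_pipeline() entry point are hypothetical, and the same pattern applies to Prometheus or Datadog clients.

```python
# Sketch: record pipeline run latency and outcome as a custom CloudWatch
# metric so dashboards and alarms can track it. The namespace, metric name
# and run_pipeline() stub are hypothetical.
import time

import boto3

cloudwatch = boto3.client("cloudwatch")


def run_pipeline(name: str) -> None:
    # Placeholder for the real workflow invocation.
    time.sleep(0.1)


def timed_pipeline_run(pipeline_name: str) -> None:
    start = time.monotonic()
    status = "Failure"
    try:
        run_pipeline(pipeline_name)
        status = "Success"
    finally:
        # Emit the metric whether the run succeeded or failed, so alerts
        # can fire on failure counts as well as latency spikes.
        cloudwatch.put_metric_data(
            Namespace="DataOps/Pipelines",
            MetricData=[
                {
                    "MetricName": "RunLatencySeconds",
                    "Dimensions": [
                        {"Name": "Pipeline", "Value": pipeline_name},
                        {"Name": "Status", "Value": status},
                    ],
                    "Value": time.monotonic() - start,
                    "Unit": "Seconds",
                }
            ],
        )


timed_pipeline_run("example_daily_etl")
```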
DataOps & DevOps Best Practices
- Implement version control, automated testing and release management for data pipelines (see the example after this list).
- Promote DataOps methodologies to improve collaboration between Data Engineering, IT and Business teams.
- Ensure proper data governance, metadata tracking and auditability in data workflows.
- Optimise data processing efficiency.
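To ground the automated-testing bullet, here is a minimal pytest-style data quality check of the sort a CI/CD pipeline could run on every commit; the transformation under test and its expected schema are hypothetical.

```python
# Sketch: a data quality test that a CI/CD pipeline could run on every
# commit. The transformation and its expected schema are hypothetical.
import pandas as pd


def add_annual_premium(df: pd.DataFrame) -> pd.DataFrame:
    # Example transformation under test: derive an annual premium.
    out = df.copy()
    out["premium_annual"] = out["premium_monthly"] * 12
    return out


def test_annual_premium_schema_and_values() -> None:
    df = pd.DataFrame({"policy_id": [1, 2], "premium_monthly": [100.0, 250.0]})
    result = add_annual_premium(df)

    # Schema check: expected columns present and no rows dropped.
    assert {"policy_id", "premium_monthly", "premium_annual"} <= set(result.columns)
    assert len(result) == len(df)

    # Value check: the derived column is consistent with its source.
    assert (result["premium_annual"] == result["premium_monthly"] * 12).all()
```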
Collaboration & Stakeholder Engagement
- Work closely with Data Engineers, Data Scientists and Platform Engineers to improve data workflows.
- Act as a DataOps champion, providing guidance on best practices for pipeline reliability and security.
- Support business teams by ensuring data availability, accessibility and lineage tracking.
What you need:
Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Cloud Computing or a related field, or equivalent relevant experience.
Experience & Skills:
- 3 years of experience in DevOps, Data Engineering or Cloud Infrastructure roles.
- Experience with CI/CD tools (GitHub Actions, Jenkins, Azure DevOps, etc.).
- Proficiency in orchestration tools (Apache Airflow, Prefect, Dagster, AWS Step Functions, etc.).
- Strong understanding of ETL/ELT pipelines, workflow scheduling and automation.
- Hands-on experience with cloud data platforms (AWS, Azure, GCP).
- Proficiency in Terraform for infrastructure-as-code (IaC).
- Experience with observability tools (Datadog, Prometheus, Grafana, CloudWatch, Splunk, etc.).
- Strong troubleshooting skills for data pipeline failures, latency issues and performance optimisation.
- Proficiency in Python, Bash or PowerShell for automation and scripting.
- Experience with SQL and NoSQL databases (PostgreSQL, Snowflake, BigQuery, DynamoDB, etc.).
- Ability to work with cross-functional teams (Data Engineering, IT, Security, Business Intelligence).
- Strong documentation skills to maintain runbooks, architecture diagrams and pipeline workflows.
What's in it for you:
Career Opportunities
- A competitive salary plus a generous uncapped commission scheme
- The opportunity to work with a well-known household brand and publicly listed company
- Work with a team of highly energised, motivated and driven professionals
- 3 recharge days per year (yes, in addition to your annual leave!)
- A free coffee each day, a funky café and a state-of-the-art work space
- An impressive Employee Benefits Program
If you are someone who fosters team spirit, has resilience and aspires to keep learning and developing your skills to drive commercial outcomes, you will have tremendous success in helping us achieve our progressive growth agenda. High energy levels and a love of having fun are also important in this energetic business.