We are looking for a Senior Data Engineer to join our Technology & Innovation Group. As a Senior Data Engineer, you will play a key role in shaping the future of Atlassian Williams Racing's data platform. You will be responsible for designing, implementing, and evolving scalable, secure, and high-performance data infrastructure and pipelines, while also contributing to the strategic direction of our data architecture.
This role bridges deep technical expertise with business context, ensuring the data engineering layer effectively supports performance, operations, and innovation across the organisation. As we scale up our data platform, you will also mentor junior engineers, influence architectural decisions, and help integrate new technologies that support Williams' broader transformation journey.
Main duties:
Lead the design, development, and optimisation of modern, cloud-native data pipelines and infrastructure to support large-scale, high-value data workloads across the organisation.
Work closely with Data Architects and the Head of Data & AI to support the development and implementation of our Data Strategy and the build-out of our data platform.
Evaluate and implement best-in-class technologies, frameworks, and tools for the ingestion, processing, governance, observability, and storage of structured and unstructured data.
Collaborate across business and technical teams to identify requirements, develop solutions, and ensure that data products support analytics, AI/ML, and operational reporting use cases.
Champion data quality, observability, lineage, and metadata management to ensure data is trusted, discoverable, and reliable.
Drive cloud migration efforts, including the deployment of scalable services in AWS (or other cloud environments), infrastructure as code, and automation of data operations.
Provide guidance and mentorship to other engineers, helping to grow a culture of high-performance engineering and continuous improvement.
Create and maintain robust documentation of pipelines, architecture, and best practices to ensure sustainability and knowledge sharing.
Skills and experience required:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Proven hands-on experience in data engineering or software engineering, with strong exposure to modern data platforms.
Proven expertise in building and maintaining data pipelines and ETL/ELT workflows using tools such as Apache Airflow, dbt, or custom frameworks.
Strong experience with cloud data platforms (e.g. AWS, Azure, GCP) and distributed data systems (e.g. Spark, Kafka, Flink).
Proficiency in Python (or similar languages), with solid software engineering fundamentals (testing, modularity, version control).
Hands-on experience with SQL and NoSQL data stores such as PostgreSQL, Redshift, DynamoDB, or MongoDB.
Good understanding of data warehousing and modern architectures (e.g. data lakehouse, data mesh).
Familiarity with DevOps/CI-CD practices, infrastructure-as-code (Terraform, CloudFormation), and containerisation (Docker/Kubernetes).
Understanding of data quality, observability, lineage, and metadata management practices.
Desirable:
Experience with event-driven architectures and real-time data processing.
Prior exposure to data governance, cataloguing, and security frameworks (e.g. IAM, encryption, GDPR).
Experience in a fast-paced environment such as automotive, motorsport, or high-performance computing.
A track record of mentoring junior engineers and contributing to engineering culture and team standards.
Additional Information:
Atlassian Williams Racing is an equal opportunity employer that values diversity and inclusion. We are happy to discuss reasonable job adjustments.
Remote Work:
No
Employment Type:
Full-time