Spark Developer

Skillfield


Job Location:

Melbourne - Australia

Monthly Salary: Not Disclosed
Posted on: 6 hours ago
Vacancies: 1 Vacancy

Job Summary

About Skillfield

Skillfield is a specialist technology consultancy operating at the intersection of cyber, data and AI. We partner with enterprise clients to protect, transform and enable their organisations through thoughtful, outcome-focused solutions. Our reputation is built on outcomes, not effort.


The Opportunity

Contract: 6 months, with potential for extension
Location: Melbourne (hybrid)

We're looking for a Spark Developer Consultant to support the design, development and ongoing improvement of distributed data processing solutions across enterprise environments. This role contributes to the ingestion, transformation and delivery of scalable data pipelines across cloud and on-prem platforms.


You'll work closely with data architects, engineers and delivery teams to build reliable, well-documented data solutions using Apache Spark and related technologies. This role values collaboration, clarity and continuous improvement, with a strong focus on quality, reliability and maintainability.


What You'll Do


Data Engineering & Development

  • Design, develop and maintain scalable ETL and ELT pipelines using Apache Spark (batch and streaming).
  • Implement data transformation logic aligned with business rules and data quality standards.
  • Optimise Spark workloads for performance, reliability and cost efficiency, including partitioning, caching, executor configuration, shuffle tuning and Spark SQL optimisation.
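As an illustration of the executor and shuffle tuning this involves, a hypothetical `spark-submit` invocation might look like the following. All values here are placeholders chosen for illustration, not recommendations; real settings depend on cluster size and workload:

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.sql.adaptive.enabled=true \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  my_pipeline.py
```

Adaptive query execution (`spark.sql.adaptive.enabled`) lets Spark coalesce shuffle partitions at runtime, which often reduces the amount of manual partition tuning needed.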


Data Ingestion & Integration

  • Build and manage data ingestion flows using Apache NiFi, including processors, FlowFiles, controller services and environment promotion practices.
  • Support integration across cloud and hybrid environments, working with a range of data sources and platforms.


Supporting Services & Automation

  • Develop supporting utilities, automation scripts and microservices using Python and Go.
  • Contribute to improving reliability and repeatability through CI/CD pipelines, testing practices and automation.
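For the automation side, a minimal sketch of the kind of supporting utility this might involve: a retry wrapper for a flaky ingestion call, in plain Python. The names (`retry`, `fetch_batch`) and parameters are purely illustrative, not part of any existing codebase:

```python
import time
from functools import wraps

def retry(attempts=3, delay=0.1, exceptions=(Exception,)):
    """Retry a function up to `attempts` times, sleeping `delay` seconds between tries."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            last_error = None
            for _ in range(attempts):
                try:
                    return func(*args, **kwargs)
                except exceptions as err:
                    last_error = err
                    time.sleep(delay)
            raise last_error  # all attempts exhausted
        return wrapper
    return decorator

# Hypothetical "ingestion" call that fails twice with a transient error, then succeeds.
calls = {"count": 0}

@retry(attempts=3, delay=0.01, exceptions=(ConnectionError,))
def fetch_batch():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("transient failure")
    return ["record-1", "record-2"]

result = fetch_batch()
```

In practice the same idea is usually pulled from a library rather than hand-rolled, but the pattern — bounded retries with backoff around an unreliable I/O boundary — is the common building block.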


Collaboration & Problem Solving

  • Partner with data architects on solution design, schemas, data models and integration patterns.
  • Investigate and resolve pipeline failures, performance issues, data inconsistencies and distributed system behaviours.


Governance, Documentation & Delivery

  • Ensure security, governance and compliance requirements are met across data processes.
  • Track work in Jira and maintain clear, up-to-date technical documentation and design artefacts in Confluence.
  • Contribute to shared standards, patterns and continuous improvement across the data engineering practice.


What You'll Bring

  • Hands-on experience with Apache Spark (Scala or PySpark) for large-scale data processing.
  • Experience building and managing data flows with Apache NiFi.
  • Proficiency in Python for ETL, automation and data manipulation.
  • Working experience with Go for backend utilities or supporting services.
  • Understanding of distributed systems, cluster configuration and performance tuning principles.


Platforms & Delivery Practices

  • Experience working with cloud data platforms such as AWS, Azure or GCP, including hybrid or on-prem integrations.
  • Familiarity with CI/CD pipelines, Git-based workflows and modern engineering practices.
  • Exposure to containerisation and orchestration tools such as Docker and Kubernetes.
  • Understanding of data modelling, schema evolution and data quality approaches.


Tools

  • Jira for work and task tracking.
  • Confluence for documentation and design artefacts.
  • GitHub, GitLab or Bitbucket for version control.
  • Spark ecosystem tools, including Spark SQL, Spark Streaming and Delta Lake.
  • NiFi components such as NiFi Registry and clustered flow management.


Nice to Have

  • Experience with streaming and messaging technologies such as Kafka, Event Hubs or Pub/Sub.
  • Exposure to managed Spark platforms such as Databricks, EMR or cloud-native Spark services.
  • Experience with orchestration tools such as Airflow, dbt or similar scheduling frameworks.
  • Familiarity with IAM, RBAC and data security controls, including encryption.
  • Exposure to Go-based microservice patterns such as gRPC or message-based architectures.

Why Join Skillfield

  • Be part of a team focused on work that makes a meaningful difference
  • Join a culture that values clarity, collaboration and shared success
  • Enjoy flexibility, support and recognition based on outcomes


Ready to make a measurable impact?
Apply now and help us deliver impact that matters.


Required Experience:

Senior IC


