Consulting Data Engineer
DyFlex is an SAP Platinum Partner delivering high-quality SAP solutions across Australia. We're now expanding our Data and AI practice to match the strength and reputation of our established SAP capability.
As a Consulting Data Engineer, you'll design, build and deploy scalable data pipelines and machine learning solutions that deliver real business value. You'll use strong SQL and modern data stacks to create reliable, cost-effective pipelines and support ML workloads. Experience with at least one of Databricks, Snowflake, BigQuery or MS Fabric is required; more than one is a bonus. Experience with tools such as Spark, dbt or Dataform is a plus but not required. You will work on meaningful technical challenges with autonomy and communicate technical outcomes clearly to both technical and non-technical stakeholders.
We value engineers who think creatively, communicate effectively and engage confidently with stakeholders. We're looking for engineers who do more than write code: you'll listen to client challenges, dig into the core problem, help shape solutions and explain them clearly. If you want to build something from the ground up with a team that's already proven it can deliver meaningful outcomes, we'd like to hear from you.
Your tasks and responsibilities:
- Build and maintain scalable data pipelines for ingesting, transforming and delivering data
- Manage and optimise databases, warehouses and cloud storage solutions
- Implement data quality frameworks and testing processes to ensure reliable systems
- Design and deliver cloud-based solutions (AWS, Azure or GCP)
- Take technical ownership of project components and lead small development teams
- Engage directly with clients, translating business requirements into technical solutions
- Champion best practices including version control, CI/CD and infrastructure as code
Your qualifications and experience:
- Hands-on data engineering experience in production environments
- Strong proficiency in Python and SQL; experience with at least one additional language (e.g. Java, TypeScript/JavaScript)
- Experience with modern frameworks such as Apache Spark, Airflow, dbt, Kafka or Flink
- Background in building ML pipelines, MLOps practices or feature stores is highly valued
- Proven expertise in relational databases, data modelling and query optimisation
- Demonstrated ability to solve complex technical problems independently
- Excellent communication skills with the ability to engage clients and stakeholders
- Degree in Computer Science, Engineering, Data Science, Mathematics or a related field
What we offer:
- Work with SAP's latest cloud technologies such as S/4HANA, BTP and Joule, plus Databricks ML/AI tools and cloud platforms
- A flexible and supportive work environment, including work from home
- Competitive remuneration and benefits, including novated lease, birthday leave, salary packaging, a wellbeing programme, additional purchased leave and a company-provided laptop
- Comprehensive training budget and paid certifications (Databricks, SAP, cloud platforms, Snowflake, BigQuery)
- Structured career advancement pathways with mentoring from senior engineers
- Exposure to diverse industries and client environments
Join a renowned organisation delivering projects to some of Australia's leading enterprises.
DyFlex is committed to providing a safe, flexible and respectful environment for staff, free from all forms of discrimination, bullying and harassment. We are proud of our diverse and inclusive team; only together can we continually improve ourselves and achieve the best outcomes for our customers as the region's leading SAP Platinum Partner.
Please note that we can't offer sponsorship for this role, and we expect a clear statement of your legal right to work in Australia, especially if you are applying from overseas.
Required Experience:
IC