Description

Core Technical Responsibilities
- Developing Data Solutions: Implement and enhance data-driven solutions to integrate massive amounts of data in real-time and batch mode from various data-producing systems, using state-of-the-art tools such as OCI Data Flow, OCI Data Integration, Spark, and Kafka. Embrace modern data architecture philosophies, including data products, data contracts, and data mesh, to ensure a decentralised approach to data management. Apply expert-level Advanced SQL and PL/SQL for data analysis and for testing data gaps and data quality.
- Data Pipeline Development: Develop and optimise high-performance batch and real-time data pipelines, employing advanced streaming technologies such as Kafka, NoSQL, and Oracle ETL tools to address the challenges of large-scale data processing and analysis (see the sketch after this list).
- Cloud Data Management: Implement and oversee cloud-specific data services, including Autonomous Data Warehouse, OCI Object Storage with Parquet and Delta tables, and OCI Streams. Leverage cloud architectures to improve data sharing and interoperability across different business pillars.
- Security and Compliance: Ensure all data practices comply with security policies and regulations, embedding security by design in the data infrastructure. Incorporate the recommended tools and methodologies for data governance, security, and compliance, including lineage, cataloguing, data quality checks, PII protection, and privacy by design, to ensure robust protection and governance of data.
- Leads BI initiatives from requirements gathering to delivery, including planning, tracking, and stakeholder ownership, with accountability for the delivery of data analytics insights, data modelling, dashboarding, and reporting.
- Designs and develops data models, dashboards, reports, and visualisations using BI tools, turning analytical data into actionable insights and using compelling narratives and storytelling techniques to communicate insights broadly, including to leadership teams.
- Understands business processes and collaborates with stakeholders to identify data-driven, often nebulous problem statements and develop hypotheses and issue trees to determine the solution(s). Triages, assesses, and prioritises incoming requests to the team in collaboration with key business stakeholders.
- Produces clear, impactful reporting that translates complex data into meaningful business insight to shape business decisions, quantify new opportunities for improvement across business processes, and identify upcoming challenges.
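For illustration of the kind of real-time pipeline described in the bullets above, here is a minimal PySpark sketch, not a prescribed implementation: the Kafka topic, broker address, and OCI Object Storage paths are hypothetical, and the Spark Kafka connector is assumed to be available on the cluster.

```python
# Minimal sketch: stream events from a (hypothetical) Kafka topic and land them
# as Parquet in OCI Object Storage for downstream batch consumers.
# Assumes the spark-sql-kafka connector package is on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-stream-to-parquet").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")  # hypothetical broker
    .option("subscribe", "orders")                        # hypothetical topic
    .option("startingOffsets", "latest")
    .load()
    # Kafka delivers key/value as binary; cast to strings for downstream parsing.
    .selectExpr("CAST(key AS STRING) AS key",
                "CAST(value AS STRING) AS payload",
                "timestamp")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "oci://bucket@namespace/landing/orders")            # hypothetical bucket
    .option("checkpointLocation", "oci://bucket@namespace/chk/orders")  # enables fault-tolerant restarts
    .trigger(processingTime="1 minute")  # micro-batch cadence; tune for latency vs. file size
    .start()
)
query.awaitTermination()
```

The same source can be read with `spark.read` for a pure batch backfill over historical files, which keeps one code path for both the real-time and batch modes mentioned above.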
Ideal Skills and Experience
We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but experience with, or exposure to, some of these (or equivalents) will set you up for success in this team:
- 6 years of experience in Data Engineering / Analytics or related fields.
- Core Data Engineering Tools & Technologies: Proficiency in SQL and Spark, and familiarity with Big Data and Data Lakehouse platforms. Well-versed in technologies including ETL/ELT, NoSQL, and Advanced SQL/PL-SQL for working with unstructured JSON. Adept in modern data-architecture patterns.
- Data Storage Expertise: Knowledgeable in data warehousing technologies and proficient in managing data storage formats including Parquet, Delta, ORC, Avro, and JSON to optimise data storage and retrieval.
- Data Modelling Expertise: Proficient in data modelling, understanding the implications and trade-offs of the various methodologies and approaches.
- Infrastructure Configuration for Data Systems: Competent in setting up data system infrastructure, favouring infrastructure-as-code practices using tools such as Terraform.
- Programming Languages: Proficient in Python and Advanced SQL
- CI/CD Implementation: Knowledgeable about continuous integration and continuous deployment practices using tools like GitHub, enhancing software development and quality assurance.
- Agile Delivery and Project Management: Skilled in Agile and Kanban delivery methods, ensuring efficient and effective solution development.
- Communication Skills: Effective at engaging stakeholders and translating business requirements into practical data engineering solutions, working with cross-functional teams to ensure solutions meet performance, reliability, and operational excellence standards.
- Strong analytical and problem-solving skills with the ability to turn data into insights.
- Expertise in data modelling and semantic modelling for analytics using Oracle Analytics, with sound knowledge of OBIEE/OAC RPD design and development.
- Expertise in telling stories with data using OBIEE/OAC (preferred) or other visualisation tools such as Power BI, Looker, etc.
- Highly proficient in translating complex findings into compelling visualisations, with a proven track record of crafting analysis into well-designed business deliverables. Business acumen with the ability to link data to business outcomes.
- Highly proficient in data query and manipulation (advanced-level SQL), with strong experience in data analysis and an understanding of data architecture, data modelling, and data warehouse concepts and principles (e.g. dimensional modelling, star schema); see the sketch below.
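As a concrete example of the dimensional modelling mentioned in the last item, the following PySpark sketch queries a hypothetical star schema; the table and column names (fact_sales, dim_date, dim_product, and so on) are illustrative and not taken from this posting.

```python
# Minimal star-schema query sketch: join a sales fact table to its date and
# product dimensions on surrogate keys, then aggregate revenue by month and category.
# All table paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-example").getOrCreate()

fact_sales = spark.read.parquet("oci://bucket@namespace/warehouse/fact_sales")
dim_date = spark.read.parquet("oci://bucket@namespace/warehouse/dim_date")
dim_product = spark.read.parquet("oci://bucket@namespace/warehouse/dim_product")

monthly_revenue = (
    fact_sales
    .join(dim_date, "date_key")         # surrogate key to the date dimension
    .join(dim_product, "product_key")   # surrogate key to the product dimension
    .groupBy("calendar_month", "product_category")
    .agg(F.sum("net_amount").alias("revenue"))
    .orderBy("calendar_month", "product_category")
)
monthly_revenue.show()
```

Keeping facts at the lowest useful grain and pushing descriptive attributes into conformed dimensions is the trade-off a star schema makes in exchange for simpler, faster analytical queries.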
Responsibilities

As a member of the software engineering division, you will assist in defining and developing software for tasks associated with developing, debugging, or designing software applications or operating systems. Provide technical leadership to other software developers. Specify, design, and implement modest changes to existing software architecture to meet changing needs.
Qualifications

Career Level - IC3
Required Experience:
Senior IC