About our client:
Our client offers financial service solutions that help their clients achieve their dreams. With an emphasis on culture fit, they boast a dedicated team of over 600 employees, many with over a decade of tenure. They have built their culture on a feeling of togetherness, trust, and respect, and are always looking to support employees' continuous learning. Using Agile, they provide diverse services with a focus on research, innovation, and improvement.
What you will be doing:
- Design, implement, and maintain scalable data warehouse architectures (star/snowflake schemas) and optimise ETL/ELT pipelines for data quality.
- Write and tune complex SQL (T-SQL/ANSI) and use Python for data manipulation and advanced analytics.
- Own the full SDLC: requirements, design, testing, deployment, and solution estimation.
- Collaborate with stakeholders to translate business data needs into technical structures.
- Identify and resolve performance bottlenecks through normalisation, indexing, and query optimisation.
- Develop and maintain dashboards/reports using tools like Power BI or Yellowfin for data-led decision-making.
- Create and maintain high-quality technical documentation (data flow diagrams, specifications, metadata).
- Actively participate in the Agile/Scrum framework and sprint ceremonies.
- Research and evaluate emerging data technologies to continuously improve the data warehouse.
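For candidates wanting a concrete picture of the dimensional-modelling and query-tuning work described above, here is a minimal star-schema sketch in Python using SQLite. All table and column names are hypothetical illustrations, not the client's actual schema:

```python
import sqlite3

# A toy star schema: one fact table surrounded by dimension tables.
# All names here are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE dim_date     (date_id     INTEGER PRIMARY KEY, year   INTEGER);
CREATE TABLE fact_sales   (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
-- Indexing the fact table's foreign keys is a typical tuning step.
CREATE INDEX ix_sales_customer ON fact_sales(customer_id);

INSERT INTO dim_customer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO dim_date     VALUES (10, 2024), (11, 2025);
INSERT INTO fact_sales   VALUES (100, 1, 10, 250.0), (101, 2, 11, 400.0),
                                (102, 1, 11, 150.0);
""")

# A typical analytical query: join the fact table to a dimension
# and aggregate by one of the dimension's attributes.
rows = cur.execute("""
    SELECT c.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_id = f.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('APAC', 400.0), ('EMEA', 400.0)]
```

In a snowflake variant, the dimension tables themselves would be normalised further (e.g. a separate region table referenced by `dim_customer`); the day-to-day work in this role is designing, loading, and tuning structures like these at scale.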
What our client is looking for:
- A relevant tertiary qualification would be beneficial (Computer Science, IT, Data Science, etc.).
- Strong knowledge of analytical and dimensional data warehouse modelling, design, architecture, data structures, and fundamental design principles.
- Experience in developing and modifying ETL (Extract, Transform, Load) processes using various data sources, with an understanding of concepts like data normalisation and performance tuning.
- Proficiency in relational databases (SQL Server, PostgreSQL), SQL (T-SQL/ANSI), and programming languages like Python.
- Capability across the full Software Development Life Cycle, including solution estimation, requirements gathering, analysis (modifying data structures to meet client needs), technical design, unit testing, debugging, and documentation.
- Ability to consult with clients to gather requirements, analyse business needs, propose technical alternatives, and provide the necessary documentation (e.g. technical specifications, data flow diagrams).
- Understanding of big data visualisation tools and industry-specific BI visualisation tools (e.g. Yellowfin or Power BI).
- Essential skills include communication (verbal and written, internal and external), problem-solving, and being a team player.
Job ID:
Required Skills:
Data Warehouse, ETL, ELT, SQL, Python, Dimensional Modelling, Star Schema, Snowflake Schema, SDLC, Agile, Power BI, Yellowfin, Performance Tuning