Your skills and attributes for success:
- An excellent team player who can also work independently.
- Excellent client-facing skills with experience on client projects.
- A proactive self-starter.
- Excellent verbal and written communication and presentation skills.
- Ability to build internal and external relationships.
- Effective negotiating and influencing skills.
- Ability to think creatively and propose innovative solutions.
- Leadership skills.
To qualify for this role, you must have:
- Proven experience and knowledge of PySpark and Apache Spark, including the fundamentals of how they work.
- Core experience with AWS, alongside substantial, mature experience of the Azure platform.
- Experience with other cloud platforms (e.g. Azure, GCP) and with data lake architectures.
- Strong programming skills in languages such as Python and SQL, including the ability to write complex SQL queries.
- Use of GitHub and CI/CD practices.
- Support development of the Azure Databricks Lakehouse platform, shaping frameworks and solutions that other engineering teams will adopt in future data projects.
- Build, optimise and maintain data processing frameworks using Python, ensuring performance, scalability and maintainability.
- Support DBT integration and best practices for transformation pipelines within Databricks.
- Apply software engineering principles, including:
- Source control, automated testing and CI/CD
- Design patterns and reusable solutions
- Coding standards and patterns
- Collaborate with technical solution authorities, ensuring alignment with governance, design decisions and platform standards.
- Collaborate closely with the Cloud Architecture and Data Architecture teams to deliver approved solutions.
- Stakeholder management: take ownership of requirements, communicate effectively across teams and deliver high-quality solutions.
- Experience of DevOps and infrastructure deployments (Azure and Databricks).
- A proactive awareness of industry standards, regulations and developments.
- Multi-skilled experience in one or more of the following disciplines: Data Management, Data Engineering, Data Warehousing, Data Modelling, Data Quality, Data Integration, Data Analytics, Data Visualisation, Data Science and Business Intelligence.
- Proficiency in Infrastructure as Code tools, especially Terraform, for cloud resource provisioning across AWS, Azure and GCP.
- Project experience using one or more of the following technologies: Tableau, Power BI, cloud platforms (Azure, AWS, GCP) and Snowflake; experience of their integration with Databricks is advantageous.
#TalanUK #LI-HB1
Qualifications:
You must be:
- Willing to work on client sites, potentially for extended periods.
- Willing to travel for work and happy to stay away from home for extended periods.
- Eligible to work in the UK without restriction.
Additional Information:
What we offer:
- 25 days holiday plus bank holidays.
- 5 days holiday buy/sell option.
- Private medical insurance.
- Life cover.
- Cycle to work scheme.
- Eligibility for the company pension scheme (5% employer contribution, salary sacrifice option).
- Employee assistance programme.
- Bespoke online learning via Udemy for Business.
Remote Work:
No
Employment Type:
Full-time