Data Engineer
Job Location

Bracknell - UK

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Job Purpose

Evelyn Partners is recruiting a skilled and experienced Data Engineer to join a growing team delivering a large data modernisation programme.

As a Data Services Data Engineer you will be responsible for designing, developing and maintaining our MS SQL Server Data Warehouses and the associated data feeds into and out of the warehouses, and for developing on our new modern cloud data platform, which requires Snowflake, dbt and Azure Data Factory experience. Our data platforms support regulatory requirements, business intelligence & reporting needs, and numerous system integrations. This role requires strong technical proficiency and a deep understanding of data engineering and data warehousing principles and practices.

You will be critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure Blob Storage for data storage, and GitHub for version control and collaboration.
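As an illustration only (none of this code comes from the posting, and all names are hypothetical), the extract-load-transform flow described above can be sketched in Python, with sqlite3 standing in for Snowflake/SQL Server and an in-memory CSV standing in for a file landed in Azure Blob Storage:

```python
import csv
import io
import sqlite3

# Hypothetical sketch: land raw records in a staging table, then
# transform them inside the database, mirroring the ELT flow in which
# dbt models build reporting tables on top of staged data.
RAW_CSV = """trade_id,client,amount
1,Alpha,100.50
2,Beta,200.00
3,Alpha,49.50
"""

def run_elt(conn: sqlite3.Connection, raw: str) -> None:
    cur = conn.cursor()
    # Load: land the raw rows unchanged in a staging table.
    cur.execute(
        "CREATE TABLE stg_trades (trade_id INTEGER, client TEXT, amount REAL)"
    )
    rows = list(csv.DictReader(io.StringIO(raw)))
    cur.executemany(
        "INSERT INTO stg_trades VALUES (:trade_id, :client, :amount)", rows
    )
    # Transform: aggregate inside the warehouse, as a dbt model would.
    cur.execute(
        """CREATE TABLE rpt_client_totals AS
           SELECT client, SUM(amount) AS total_amount
           FROM stg_trades GROUP BY client"""
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
run_elt(conn, RAW_CSV)
totals = dict(conn.execute("SELECT client, total_amount FROM rpt_client_totals"))
print(totals)  # {'Alpha': 150.0, 'Beta': 200.0}
```

The staging/transform split is the point: raw data is kept verbatim, and all business logic lives in queries that can be version-controlled and re-run.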

The role involves working in an agile and collaborative environment within a growing and skilled team. An understanding of the wealth management industry, including its products & services and the associated data, is a plus.

Key Responsibilities

Design, develop and implement data warehouse solutions using Snowflake, Azure and MS SQL Server.

Develop data models and database schemas that support reporting and analytics needs.

Make extensive use of SQL and be fully conversant in it.

Apply experience working with programming languages such as C#, Python or Java, and with Spark.

Create and maintain ETL/ELT processes to extract, transform and load data from various sources into the data platforms.

Design, develop and deploy SSIS packages and ADF pipelines.

Manage and troubleshoot SSIS packages and ADF pipelines, ensuring data integrity and robust error handling.

Create and maintain complex SQL stored procedures.

Integrate data from various sources, including flat files, REST/SOAP APIs and third-party databases, into the data warehouse/platform.

Develop and maintain complex stored procedures, views and functions in SQL to support data integration and reporting needs.

Collaborate with business analysts, data scientists and other stakeholders to understand data requirements and deliver solutions.

Support the development of reports and dashboards using tools like SSRS, Power BI or similar.

Consider data warehouse/platform performance, including indexing, partitioning and query tuning.

Provide ad-hoc queries and data extracts to support business needs.

Work closely with other engineers, DBAs and Tech staff to ensure seamless data integration and system compatibility.

Document the data warehouse architecture, ETL processes and database configurations.

Maintain effective relationships with stakeholders and ensure communication is effective and timely.

Provide time estimates for work so stakeholder expectations and resources can be managed.

Collaborate with stakeholders to understand business requirements and translate them into technical specifications for data engineering solutions.

Continuously improve processes and codebases.

Design and deliver data warehousing solutions and the key data engineering workstreams required for each solution.

Stay up to date with the latest trends technologies and best practices in data engineering.
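To make the data-modelling responsibilities above concrete, here is a hedged sketch of a minimal star schema, the dimensional pattern the role centres on: one fact table keyed to a dimension table and queried the way a report or dashboard would query it. Table and column names are hypothetical, and sqlite3 stands in for the warehouse:

```python
import sqlite3

# Hypothetical minimal star schema: a fact table of measures keyed to
# a dimension table of descriptive attributes. All names are
# illustrative; sqlite3 stands in for Snowflake/SQL Server.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE dim_client (
        client_key INTEGER PRIMARY KEY,
        client_name TEXT,
        region TEXT
    );
    CREATE TABLE fact_holdings (
        client_key INTEGER REFERENCES dim_client(client_key),
        market_value REAL
    );
    INSERT INTO dim_client VALUES (1, 'Alpha', 'UK'), (2, 'Beta', 'EU');
    INSERT INTO fact_holdings VALUES (1, 1000.0), (1, 500.0), (2, 250.0);
    """
)
# A typical reporting query: aggregate the fact, slice by a dimension.
report = conn.execute(
    """SELECT d.region, SUM(f.market_value)
       FROM fact_holdings f
       JOIN dim_client d ON d.client_key = f.client_key
       GROUP BY d.region
       ORDER BY d.region"""
).fetchall()
print(report)  # [('EU', 250.0), ('UK', 1500.0)]
```

Keeping measures in the fact table and descriptive attributes in dimensions is what lets BI tools like Power BI or SSRS slice the same numbers by any attribute without reshaping the data.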


Qualifications :

Key Skills and Experience

Technical

Proven experience building data pipelines in Azure Data Factory and Snowflake.

Proficiency in MS SQL Server, including T-SQL programming, stored procedures and functions.

Extensive experience with SQL Server Integration Services (SSIS) and Visual Studio, and with Azure Data Factory (ADF) and GitHub, for ETL development.

Experience building and maintaining reports using SQL Server Reporting Services (SSRS).

Solid understanding of data warehousing concepts, including star/snowflake schemas, dimensional modelling and Data Vault 2.0, and of the principles of ETL technical design.

Experience with SQL Server Analysis Services (SSAS) is a plus.

Familiarity with data visualization tools such as Power BI or Tableau is desirable.

Experience with source control tools like Git and familiarity with DevOps practices is an advantage.

Ability to write efficient and optimized SQL queries for data retrieval and manipulation.

Solid understanding of data warehousing concepts and principles.

Snowflake/Cloud Engineering experience is required:

Proven experience working as a Data Engineer, preferably with expertise in Snowflake, ADF, Azure Storage, dbt and GitHub.

Extensive experience in designing, building, deploying and supporting cloud-based data products and pipelines.

Experience with Azure Storage services such as Blob Storage, Data Lake Storage or Azure Files.

Hands-on experience designing and developing data pipelines using Azure Data Factory.

Experience of data warehouse design and implementation using Snowflake and dbt modelling.

SnowPro and Microsoft Cloud certifications would be highly desirable.

Experience with Azure AI Services and/or Snowflake Cortex AI would be a plus.
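One skill the list above leans on is writing loads that survive pipeline retries. As a hedged sketch (all names hypothetical, sqlite3 standing in for the warehouse), the idempotent upsert below shows the pattern; Snowflake and SQL Server would express the same thing with a MERGE statement:

```python
import sqlite3

# Hypothetical sketch of an idempotent upsert: re-running a load with
# the same batch must not duplicate rows, which is what keeps SSIS/ADF
# pipeline retries safe. sqlite3 stands in for the warehouse; Snowflake
# and SQL Server would use MERGE rather than ON CONFLICT.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE dim_account (account_id INTEGER PRIMARY KEY, status TEXT)"
)

def load_batch(conn: sqlite3.Connection, batch: list[tuple[int, str]]) -> None:
    conn.executemany(
        """INSERT INTO dim_account (account_id, status) VALUES (?, ?)
           ON CONFLICT(account_id) DO UPDATE SET status = excluded.status""",
        batch,
    )
    conn.commit()

load_batch(conn, [(1, "open"), (2, "open")])
# A retry (or a later batch) updates in place instead of duplicating.
load_batch(conn, [(1, "open"), (2, "closed")])
rows = conn.execute(
    "SELECT account_id, status FROM dim_account ORDER BY account_id"
).fetchall()
print(rows)  # [(1, 'open'), (2, 'closed')]
```

The design choice is that the key constraint, not the pipeline's bookkeeping, enforces correctness, so a failed run can simply be re-executed.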

 

Non-Technical

Strong coaching and mentoring skills

Calm under pressure and keen to highlight others' successes.

Business Analysis skills to derive requirements from business stakeholders.

Strong ability to extract information through questioning, active listening and interviewing.

Ability to quickly assimilate and understand business processes and information flow.

Highly numerate with strong analytical and problem-solving skills.

Ability to work on own initiative within agreed boundaries.

Ability to work under pressure and manage conflicting priorities.

Be flexible and adaptable to meet business deliverables.

Willingness to travel to different office locations as required by the role.

Understanding of the principles of information security and data protection.

Ability to communicate with non-technical colleagues at all levels of the organisation with confidence and clarity.

 

Professional Qualifications and Education

SnowPro Core

SnowPro Advanced: Data Engineer

Azure Data Fundamentals

Azure Data Engineer Associate


Additional Information :

As a colleague here at Evelyn Partners you will have access to benefits that include:

  • Competitive salary
  • Private medical insurance
  • Life assurance
  • Pension contribution
  • Hybrid working model (role dependent)
  • Generous holiday package
  • Option to purchase additional holiday
  • Shared parental leave

We are proud to value the differences that a diverse workforce, representative of society and our clients, brings. At Evelyn Partners we have a wide range of highly active employee resource groups, and we're delivering multiple diversity, equity and inclusion initiatives across the organisation. We are committed to providing a workspace where all colleagues, regardless of identity, background or circumstance, feel respected as individuals, feel that they can achieve their full potential, and work in a safe, supportive and inclusive environment.

We are happy to make any reasonable adjustments to accommodate your needs throughout the application process. Please let your recruiter know.


Remote Work :

No


Employment Type :

Full-time
