We are seeking an experienced and strategic Data Engineer to design and develop scalable data solutions in a high-stakes financial environment. This role requires deep technical expertise in Databricks, SQL, and cloud-based data warehousing, along with a strong understanding of financial data domains, regulatory compliance, and data governance.
What you are going to do
Implement robust, scalable ETL/ELT pipelines using Databricks (Spark, Delta Lake) and SQL to support complex financial data workflows
Design data models and warehouse schemas optimized for performance, auditability, and regulatory compliance
Collaborate with data architects, analysts, and business stakeholders to translate financial requirements into technical solutions
Contribute to data quality, lineage, and governance initiatives across the data platform
Integrate data from core banking systems, trading platforms, market data providers, and regulatory sources
Optimize data processing for cost-efficiency, latency, and scalability in Azure
Implement and maintain CI/CD pipelines, automated testing, and monitoring for data workflows
Stay current with emerging technologies and evaluate their applicability to the financial data ecosystem
Help the team migrate to the new combined DAP platform
What we offer you
Our people are the driving force behind our organization. We value the knowledge and expertise you bring. We believe that your commitment can take our organization to a higher level. We offer you:
Salary between 3,908 and 5,583 based on 40 hours, depending on your knowledge and experience
A 13th month and holiday allowance, paid with your monthly salary
27 vacation days for a 5-day working week and one Diversity Day
A modern pension administered by BeFrank
Plenty of training and learning opportunities
An NS Business Card (2nd class), which gives you unlimited travel, also privately. Do you prefer to travel with your own transport? Then you can declare the kilometres travelled
Allowances for setting up your home office and for internet use
Who you are
We are looking for someone with:
Required Qualifications
A Bachelor's degree in Computer Science, Engineering, Finance, or a related field
2 years of experience in data engineering, with at least 2 years in a senior or lead role
Proven expertise in: Databricks (Spark, Delta Lake, Unity Catalog, MLflow); SQL (T-SQL, Spark SQL) for complex transformations and performance tuning; Python or Scala for data processing and automation; cloud platforms (Azure) and ETL orchestration tools (e.g. Azure Data Factory)
Strong understanding of financial data domains: trade data, market data, risk metrics, compliance reporting, and financial KPIs
Experience with data governance, metadata management, and security controls in regulated environments
Familiarity with DevOps practices, infrastructure as code, and CI/CD pipelines
Preferred qualifications
Exposure to machine learning pipelines in Databricks for financial modeling or fraud detection
Certifications in Azure Data Engineering, Databricks, or financial data management
Experience leading cross-functional data initiatives or platform migrations
Who you will work with
As a member of NN Bank, you will work within the mortgage data lake domain, where our mission is to collect and deliver high-quality data to our customers in the Mortgage and Finance & Risk domains for risk modelling and reporting.
In the Data Lake team, collaboration is key. We engage with other teams to achieve the best possible results, and everyone is very accessible and willing to help. Our team is composed of diverse, international individuals with varying backgrounds and areas of expertise.
Any questions?
If you have any questions about the job or the process, you can reach out by email to Bianca Schaareman (Talent Acquisition Specialist) via .
Required Experience:
IC