Big Data Engineer, Technology Innovation Group at a Global Finance Firm
Job Title
Big Data Engineer
Company Overview
A top global finance firm
This is an excellent opportunity for a strong, intelligent programmer to gain exposure to a front-office environment and work closely with the business. Additionally, given the relatively lean and fast-moving organization, the successful candidate will gain responsibility and a broad range of experience very quickly.
Your Role and Responsibilities
(Upon Employment)
Architect, develop, and enhance large-scale data processing systems using Scala and Apache Spark on Hadoop clusters.
Design and maintain efficient ETL workflows to ingest, clean, and transform financial data from multiple sources in compliance with Basel III, LCR/NSFR, PRA, and other regulatory requirements.
Manage and optimize object storage architectures using MinIO for scalable data lakes.
Utilize Dremio or Denodo for data virtualization and integration, enabling unified access for risk, regulatory, and business teams.
Partner with business users to deliver accurate, reliable, and timely data for regulatory and liquidity reporting.
Implement best practices for data quality, governance, security, and compliance (Basel III, LCR, NSFR, PRA, SOX, FINRA).
Monitor and tune cluster resources to achieve high performance, reliability, and cost efficiency.
Provide technical leadership, conduct code reviews, and mentor junior engineers.
Maintain detailed technical documentation of architectures, data flows, and processes, particularly for regulatory reporting.
Stay current on emerging data technologies, financial regulations, and industry trends in data management and compliance.
(Scope of any potential changes) Duties as defined by the company
Experience and Qualifications
Mandatory Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.
Minimum 6 years of experience in Big Data or Data Engineering roles, preferably in the financial industry.
Expert-level proficiency in Scala and advanced Apache Spark for large-scale data processing.
Deep knowledge and hands-on experience with Hadoop (HDFS) and related ecosystem tools.
Practical experience with MinIO for scalable object storage management.
Strong experience with Dremio and/or Denodo for data federation and integration.
Proficiency in SQL and familiarity with BI/reporting platforms such as Tableau.
Solid understanding of data security, governance, and compliance in financial data management.
Direct involvement in Basel III, LCR, NSFR, or other liquidity and regulatory reporting projects.
Proven ability to optimize data pipelines for reliability and performance.
Experience handling sensitive or regulated financial data.
Excellent communication, analytical, and problem-solving skills.
Preferred Qualifications:
Familiarity with data modeling and data warehouse concepts.
Experience supporting regulatory frameworks within financial organizations.
Exposure to cloud data platforms (AWS, Azure, or GCP).
Understanding of financial regulations (Basel III, GDPR, SOX, FINRA, etc.).
Experience working in Agile/Scrum environments.
Work Location
(Upon Employment) Tokyo
(Scope of change) Location as specified by the company
Details will be provided during the interview.
Required Experience:
Senior IC