Senior Data Engineer


Job Location:

Taguig - Philippines

Monthly Salary: Not Disclosed
Posted on: 2 hours ago
Vacancies: 1 Vacancy

Job Summary

Role Overview

This role is critical to ensuring the reliability, scalability, and compliance of data pipelines that support surveillance systems across communications and trading activities, covering both structured and unstructured data.

The role bridges engineering and operations, enabling robust data ingestion, transformation, and monitoring to meet regulatory and internal compliance requirements. The Data Ops Engineer will play a critical role in collaborating with upstream teams to ensure data completeness, accuracy, and timeliness are as expected, and that any data completeness or quality issues are visible.

The role will also work on other Surveillance data initiatives, such as persisting Surveillance Alerts in the firm's data lake for analytics purposes.
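As an illustration, persisting alerts for analytics might look like the following minimal sketch: writing each alert as a JSON record under a date-partitioned path. The function name, the `surveillance_alerts` prefix, and the alert schema are all hypothetical assumptions for illustration, not the firm's actual layout.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def persist_alert(lake_root: Path, alert: dict) -> Path:
    """Write one surveillance alert as JSON under a date-partitioned path,
    e.g. <lake_root>/surveillance_alerts/dt=2024-01-15/<alert_id>.json."""
    dt = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    partition = lake_root / "surveillance_alerts" / f"dt={dt}"
    partition.mkdir(parents=True, exist_ok=True)  # create the partition directory if absent
    out = partition / f"{alert['alert_id']}.json"
    out.write_text(json.dumps(alert, sort_keys=True))
    return out

if __name__ == "__main__":
    path = persist_alert(Path("/tmp/lake"), {"alert_id": "A-001", "type": "spoofing"})
    print(path)
```

Date partitioning keeps downstream analytics queries cheap, since a query engine can prune partitions by date rather than scanning every alert ever written.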

Role Responsibilities

  • Design, build, maintain, and optimise end-to-end data pipelines and workflows between source data points and target destinations, working with the wider Surveillance Technology team to put automation, scalability, and strategy at the heart of the design.

  • Implement automated data completeness and quality checks, validation rules, and reconciliation processes to ensure the accuracy, completeness, and timeliness of the data ingested, and to make visible any data that is not processed.

  • Identify Critical Data Elements and implement failover and recovery strategies for the respective data flows.

  • Build AWS infrastructure using Terraform or CDK.

  • Write unit, integration, and infrastructure tests.

  • Monitor, investigate, and resolve data anomalies through collaboration with Business Analysts, Developers, and Testers across functions and verticals.

  • Implement data management and governance frameworks to ensure data is ingested and loaded per the requirements of the consuming platform: Scila for Trade Surveillance and Global Relay for Communications Surveillance.

  • Partner with the Data Strategy and Data Infrastructure teams to ensure data lineage, auditability, and retention policies are enforced across all necessary pipelines.

  • Ensure that data consumed and processed complies with regulatory, legal, and security protocols.

  • Work closely with surveillance analysts, compliance officers, and engineering teams to translate business rules into technical specifications.

  • Partner closely with stakeholders and subject matter experts, such as the Cloud Infrastructure team, to optimise performance and costs.

  • Stay updated on industry trends and emerging tech to ensure continuous improvement.
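The completeness and reconciliation checks described above can be sketched in miniature: compare record identifiers between a source extract and the target destination, and surface anything that went missing. This is a minimal illustration under assumed names (`reconcile`, `ReconciliationResult`), not the team's actual tooling, which would typically also handle timeliness windows and late-arriving data.

```python
from dataclasses import dataclass, field

@dataclass
class ReconciliationResult:
    missing_in_target: set = field(default_factory=set)     # sent by the source, never landed
    unexpected_in_target: set = field(default_factory=set)  # landed with no source record

    @property
    def is_complete(self) -> bool:
        return not self.missing_in_target and not self.unexpected_in_target

def reconcile(source_ids, target_ids) -> ReconciliationResult:
    """Compare record identifiers between source and target, making any
    unprocessed (or spurious) data visible rather than silently dropped."""
    src, tgt = set(source_ids), set(target_ids)
    return ReconciliationResult(missing_in_target=src - tgt,
                                unexpected_in_target=tgt - src)

if __name__ == "__main__":
    # Three messages were extracted upstream, but only two arrived in the target.
    result = reconcile(["m1", "m2", "m3"], ["m1", "m3"])
    print(result.missing_in_target)  # the records that need reprocessing or escalation
```

In practice a check like this would run on a schedule per pipeline, with non-empty gaps raised as alerts to the upstream team rather than just printed.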

Experience / Competences

Essential:

  • Strong experience with ETL/ELT data pipeline builds, from design through implementation to maintenance, in relation to financial market messaging platforms and trade & order systems.

  • Solid understanding of CI/CD pipelines, ideally with a background in software engineering, product management, or data analytics.

  • Experience with some of: EKS, Lambda, EventBridge, Step Functions, S3, DynamoDB, AWS Glue, Snowflake, Terraform, and Transfer Family.

  • Strong proficiency in Python or Java, SQL, and data pipeline frameworks (e.g. Airflow, dbt, Spark), with solid experience of the AWS ecosystem.

  • Proven expertise with data governance frameworks and compliance regulations in financial services.

  • Knowledge of streaming technologies (Kafka, Kinesis) and API integrations, plus hands-on experience with monitoring tools (e.g. Grafana) and observability practices.

  • Excellent problem-solving skills and ability to work in a fast-paced environment.

  • Bachelor's degree in Computer Science, Data Science, Engineering, or a related field.

  • Previous experience in Data Ops and Data Engineering.

  • Strong communication and collaboration skills to engage with technical and non-technical stakeholders.

  • Strong experience with Agile software delivery.

Desired:

  • Experience with market data ingestion metadata extraction and event-driven architectures.

  • Proficient with Terraform or CDK (infrastructure-as-code).

  • Experience in Business Communications Technology, e.g. Bloomberg, ICE, Symphony, Teams Chat, etc.

  • Familiarity with security best practices, IAM, and VPN configuration.

  • Experience with regulatory compliance and data security in financial services.

  • Knowledge of financial markets and trading platforms.

  • Experience with GitLab, Qlik Sense & Alation.

  • Certifications in DataOps, cloud platforms, or related areas.


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


Recognised as AWS's Rising Star Partner of the Year for 2023 in EMEA and 2022 in the UK&I, we're expanding globally with new offices in South Africa and Dubai, a strong presence in the Philippines, and our HQ in the UK. If you're ready to join a high-growth AWS partner and take your career t ...
