Key Responsibilities:
- Design and implement data lakehouse solutions on AWS using the Medallion Architecture (Bronze/Silver/Gold layers).
- Build and optimize real-time and batch data pipelines leveraging Apache Spark, Kafka, and AWS Glue/EMR (see the illustrative streaming sketch after this list).
- Architect storage and processing layers using Parquet and Iceberg for schema evolution, partitioning, and performance optimization.
- Integrate AWS data services (S3, Redshift, Lake Formation, Kinesis, Lambda, DynamoDB) into enterprise solutions.
- Ensure data governance, lineage, cataloging, and security compliance in line with financial regulations (Basel III, MiFID II, Dodd-Frank).
- Partner with business stakeholders (trading, risk, compliance) to translate requirements into technical architecture.
- Provide technical leadership and guidance to engineering teams.
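To give a concrete sense of the day-to-day work, here is a minimal, illustrative PySpark Structured Streaming sketch of a Bronze-layer ingest (Kafka into an Iceberg table on S3). The broker address, topic, bucket path, and catalog/table names are hypothetical placeholders, and the sketch assumes the Iceberg runtime and a catalog named "lake" are already configured on the cluster.

```python
# Sketch: Kafka -> Bronze (Iceberg) ingest with PySpark Structured Streaming.
# Broker, topic, bucket, and catalog/table names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

spark = (
    SparkSession.builder
    .appName("bronze-trades-ingest")
    # Assumes an Iceberg catalog named "lake" is configured on the cluster
    # (e.g. spark.sql.catalog.lake settings pointing at S3/Glue).
    .getOrCreate()
)

# Read raw events from Kafka as an unbounded stream.
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "trades")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# The Bronze layer keeps payloads close to raw: decode and stamp ingest time.
bronze = (
    raw.select(
        col("key").cast("string").alias("event_key"),
        col("value").cast("string").alias("payload"),
        col("timestamp").alias("kafka_ts"),
    )
    .withColumn("ingested_at", current_timestamp())
)

# Append to an Iceberg table; the checkpoint makes the stream restartable.
query = (
    bronze.writeStream
    .format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/bronze_trades")
    .toTable("lake.bronze.trades")
)
query.awaitTermination()
```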
Required Skills & Experience:
- Core Technical Expertise
- Strong hands-on skills in AWS Data Services (S3, Redshift, Glue, EMR, Kinesis, Lake Formation, DynamoDB).
- Expertise in Apache Kafka (event streaming) and Apache Spark (batch and streaming).
- Proficiency in Python for data engineering and automation.
- Strong knowledge of Parquet, Iceberg, and the Medallion Architecture (see the Iceberg sketch after this list).
- Finance & Capital Markets Knowledge
- Experience with trading systems, market data feeds, risk analytics, and regulatory reporting.
- Familiarity with time-series data, reference/master data, and real-time analytics.
- Preferred
- Exposure to Delta Lake, dbt, Databricks, or Snowflake.
- AWS Certifications (Solutions Architect Professional, Data Analytics Specialty).
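To make the Iceberg expectations above concrete, here is a hedged Spark SQL sketch of hidden partitioning and in-place schema evolution. The "lake" catalog and table names are assumptions, and Iceberg's Spark SQL extensions are presumed to be enabled on the session.

```python
# Sketch: Iceberg partitioning and schema evolution via Spark SQL.
# The "lake" catalog and table names are hypothetical; assumes the Iceberg
# runtime and its SQL extensions are configured on the Spark session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("iceberg-ddl-sketch").getOrCreate()

# Create a Silver table partitioned by trade date; Iceberg's hidden
# partitioning means writers never manage partition columns directly.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.silver.trades (
        trade_id   STRING,
        symbol     STRING,
        price      DECIMAL(18, 6),
        quantity   BIGINT,
        trade_ts   TIMESTAMP
    )
    USING iceberg
    PARTITIONED BY (days(trade_ts))
""")

# Schema evolution is a metadata-only operation: no data files are rewritten.
spark.sql("ALTER TABLE lake.silver.trades ADD COLUMN venue STRING")

# The partition spec can also evolve in place without rewriting history.
spark.sql("ALTER TABLE lake.silver.trades ADD PARTITION FIELD bucket(16, symbol)")
```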
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10 years of IT experience, including 5 years in data architecture (3 years on AWS).
- Mandatory exposure to Finance and Capital Markets.