Our challenge
We are looking for a hands-on Data Service Engineer to lead the development and optimization of batch and real-time data pipelines and to design, develop, and maintain our Reference Data System using modern data technologies, including Spark, Kafka, Snowflake, and Python.
Responsibilities:
- Lead the development and optimization of batch and real-time data pipelines, ensuring scalability, reliability, and performance.
- Architect, design, and deploy data integration, streaming, and analytics solutions leveraging Spark, Kafka, and Snowflake.
- Proactively and voluntarily support team members and peers in delivering their tasks to ensure end-to-end delivery.
- Evaluate technical performance challenges and recommend tuning solutions.
- Serve as a hands-on Data Service Engineer to design, develop, and maintain our Reference Data System using modern data technologies, including Kafka, Snowflake, and Python.
Requirements:
- Proven experience building and maintaining data pipelines, especially with Kafka, Snowflake, and Python (see the illustrative sketch after this list).
- Strong expertise in distributed data processing and streaming architectures.
- Experience with the Snowflake data warehouse platform: data loading, performance tuning, and management.
- Proficiency in Python scripting and programming for data manipulation and automation.
- Familiarity with the Kafka ecosystem (Confluent, Kafka Connect, Kafka Streams) is a big plus.
- Knowledge of SQL, data modeling, and ETL/ELT processes.
- Understanding of cloud platforms (AWS, Azure, GCP) is a plus.
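For a flavor of the day-to-day work, below is a minimal sketch of the kind of pipeline this role owns: consuming reference-data events from Kafka in Python and landing them in Snowflake. It assumes the confluent-kafka and snowflake-connector-python packages, and every name in it (the ref-data-events topic, the REF_DATA_EVENTS table, the connection settings) is an illustrative placeholder, not part of the role description.

```python
import json

import snowflake.connector
from confluent_kafka import Consumer

# Consumer config: broker address and group id are placeholders.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "ref-data-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ref-data-events"])  # hypothetical topic name

# Snowflake connection: all credentials are placeholders.
conn = snowflake.connector.connect(
    user="USER", password="PASSWORD", account="ACCOUNT",
    warehouse="WH", database="DB", schema="PUBLIC",
)
cur = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to one second for a record
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Hypothetical target table: REF_DATA_EVENTS(id STRING, payload VARIANT).
        # INSERT ... SELECT lets PARSE_JSON populate the VARIANT column.
        cur.execute(
            "INSERT INTO REF_DATA_EVENTS (id, payload) "
            "SELECT %s, PARSE_JSON(%s)",
            (event["id"], json.dumps(event)),
        )
finally:
    consumer.close()
    cur.close()
    conn.close()
```

A production version would batch inserts (or stage files and use COPY INTO), handle schema evolution, and add monitoring, which is exactly where the tuning and reliability responsibilities above come in.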
Domain Knowledge in any of the below areas:
- Trade Processing, Settlement, Reconciliation, and related back/middle-office functions within financial markets (Equities, Fixed Income, Derivatives, FX, etc.).
- Strong understanding of trade lifecycle events, order types, allocation rules, and settlement processes.
- Funding Support, Planning & Analysis, Regulatory Reporting & Compliance. Knowledge of regulatory standards (such as Dodd-Frank, EMIR, MiFID II) related to trade reporting and lifecycle management.