Overview
This is a greenfield build expanding an existing sports trading platform within SIG Sports, a startup team that operates with the agility of a small company but has the financial and technical backing of Susquehanna International Group (SIG), one of the largest and most successful proprietary trading firms globally. The SIG Sports team is split between the United States and Dublin, Ireland, and is composed of high-performing engineers who are passionate about building innovative trading systems. The team is scaling quickly and looking for strong senior engineers who can hit the ground running, take ownership, and help shape the platform from the ground up.
This is a senior role responsible for designing, building, and maintaining the enterprise data ecosystem: everything that happens after raw data is captured. The focus is on ensuring long-term data persistence, building efficient pipelines, and enabling analytics at scale.
Key Focus Areas
- Data Architecture & Engineering: Own the full lifecycle of data after ingestion, from raw storage to curated, analytics-ready datasets.
- Pipelines & Processing: Build and manage data pipelines (batch and streaming) that transform event and trading data into structured, meaningful outputs.
- Storage & Persistence: Design scalable and cost-efficient data storage strategies (data lakes, warehouses).
- Technology Stack: Experience with Databricks, Spark, cloud data platforms (AWS/Azure/GCP), and modern data warehousing tools (e.g. Snowflake, Delta Lake).
- Scale & Performance: Proven experience operating in large-scale, high-throughput environments.
- Ecosystem Leadership: Build the ecosystem of governance, tooling, standards, and team practices around enterprise data management.
- Domain Understanding: Ideally familiar with financial markets or sports trading data (time-based, event-driven, high-volume).
The existing infra team handles platform deployment and infrastructure operations, so this role is centered on building the data systems and logic, not running the underlying platform.
What we're looking for
Technical Expertise:
- Programming: Python, SQL, and sometimes Scala or Java (for big data pipelines).
- Data Engineering Tools: Hadoop, Spark, Kafka, Airflow, and ETL frameworks.
- Cloud Platforms: AWS, Azure, or GCP (especially services like EMR, Databricks, BigQuery, Synapse).
- Databases: Both relational (PostgreSQL, Oracle) and NoSQL (Cassandra, MongoDB).
Data Management & Governance
- Data modelling, data warehousing (e.g. Snowflake, Redshift), metadata management, and a strong understanding of data quality, lineage, and compliance (critical in finance).
Analytics & Visualization
- Familiarity with tools like Power BI, Tableau, or Looker.
- Understanding of applied statistics, machine learning basics, and time series analysis (especially for market or risk data).
Domain Knowledge
- Financial markets, trading systems, risk management, or regulatory reporting (MiFID II, Basel, etc.).
Soft Skills
- Cross-functional collaboration with data scientists, engineers, and compliance teams.
- Ability to translate technical insights into business terms for finance stakeholders.
Required Experience:
Director