As a leading financial services and healthcare technology company based on revenue, SS&C is headquartered in Windsor, Connecticut, and has 27,000 employees in 35 countries. Some 20,000 financial services and healthcare organizations, from the world's largest companies to small and mid-market firms, rely on SS&C for expertise, scale, and technology.
Job Description
We are looking for a Principal Data Platform Engineer to lead the development of batch and real-time data pipelines on top of a modern, evolving data platform.
This is a hands-on technical leadership role where you will design and build scalable ingestion and transformation pipelines while mentoring a small team of engineers.
The core data platform foundation, including storage, compute engines, and shared services, has already been established by a dedicated platform engineering team. You will work closely with that team to build pipelines on the platform and help guide its evolution based on real-world data integration needs.
Our environment combines modern streaming and lakehouse technologies with complex legacy data sources, including DB2 replication, fixed-width files, CSV extracts, and APIs.
The team is currently small but expected to grow, providing opportunities for increased leadership responsibility and career advancement as the platform and organization expand.
Build data pipelines
Design and develop batch and real-time data pipelines
Implement CDC pipelines using Debezium and Kafka
Build streaming pipelines using Kafka and Apache Flink
Develop transformation workflows using Python, Spark/PySpark, and Airflow
Integrate complex data sources
Ingest data from DB2 replication streams
Process legacy fixed-width and CSV data feeds
Integrate API-based data sources
Work with modern data platforms
Store and manage data using Apache Iceberg and Parquet
Enable analytics through Trino and StarRocks
Lead and grow the team
Mentor and guide a small team of data engineers
Establish best practices for pipeline architecture, testing, and reliability
Help recruit and grow the team as the platform expands
Remain deeply hands-on in system design and development
Collaborate with platform engineering
Work closely with the team responsible for the underlying data platform
Provide input into the ongoing evolution of the platform
8 years building data platforms or large-scale data pipelines
Strong programming experience in Python
Experience with Spark/PySpark
Experience building pipelines with Apache Airflow
Experience with Kafka-based streaming architectures
Experience implementing CDC pipelines (Debezium or similar)
Experience with Apache Flink or other streaming frameworks
Experience with Parquet and modern table formats such as Apache Iceberg
Experience with distributed query engines such as Trino, Presto, or StarRocks
Experience integrating data from heterogeneous or legacy systems
Experience leading or mentoring engineers
Python
Apache Spark / PySpark
Apache Flink
Apache Airflow
Debezium
Kafka
Apache Iceberg
Parquet
Trino
StarRocks
Work on complex legacy-to-modern data integration problems
Build streaming and batch data pipelines at scale
Help shape the evolution of a modern, open data platform
Lead and grow a small, high-impact engineering team
Opportunities for increased leadership scope as the team expands
Stay hands-on with modern distributed data systems
Unless explicitly requested or approached by SS&C Technologies, Inc. or any of its affiliated companies, the company will not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services.
SS&C Technologies is an Equal Employment Opportunity employer and does not discriminate against any applicant for employment or employee on the basis of race, color, religious creed, gender, age, marital status, sexual orientation, national origin, disability, veteran status, or any other classification protected by applicable discrimination laws.
Required Experience:
Staff IC
A leading cloud-based provider of financial services technology solutions, SS&C Technologies owns and maintains the best financial technology in the industry.