As a leading financial services and healthcare technology company based on revenue, SS&C is headquartered in Windsor, Connecticut, and has 27,000 employees in 35 countries. Some 20,000 financial services and healthcare organizations, from the world's largest companies to small and mid-market firms, rely on SS&C for expertise, scale, and technology.
Job Description
Job Summary
We are seeking an experienced Senior Data Engineer to build, optimize, and maintain scalable data pipelines and infrastructure in a modern lakehouse environment. You will work closely with Data Architects to implement well-defined data products, schemas, and patterns, ensuring reliable data ingestion, transformation, quality, and distribution. This role requires strong hands-on expertise with both batch and streaming systems, as well as a deep focus on performance, reliability, and operational excellence.
Key Responsibilities
- Implement and maintain end-to-end data pipelines for data acquisition from diverse sources, including databases, APIs, files, and messaging systems such as Kafka.
- Build robust data validation, enrichment, and transformation workflows using Python and PySpark.
- Develop and optimize data storage and querying layers using technologies such as Apache Iceberg, Trino, StarRocks, and Snowflake.
- Implement and maintain dimensional data models, including Star and Snowflake schemas, as defined by data architecture standards.
- Integrate and manage streaming data flows using Kafka for both ingestion and real-time data distribution.
- Design and implement data quality checks, monitoring, and alerting to ensure high data reliability.
- Contribute to metadata management, data governance, and security practices, including access controls and data masking.
- Enable data distribution and consumption through files, APIs, Kafka, Snowflake data sharing, and analytics tools.
- Optimize pipeline performance, cost, and scalability while troubleshooting and resolving production issues.
- Collaborate closely with data architects, analysts, data scientists, and stakeholders to deliver high-quality data products.
- Mentor junior engineers and promote best practices in code quality, testing, and CI/CD for data pipelines.
Required Skills and Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5 years of hands-on experience in data engineering roles, including at least 2 years working with big data or lakehouse platforms.
- Strong proficiency in Python and PySpark for building scalable data processing pipelines.
- Hands-on experience with analytical and query platforms such as Trino, StarRocks, and Snowflake.
- Experience working with open table formats, particularly Apache Iceberg.
- Proven experience with streaming technologies, especially Apache Kafka.
- Solid understanding of dimensional modeling and data warehousing concepts.
- Familiarity with data quality frameworks, metadata management, governance tools, and security best practices.
- Experience with cloud platforms such as AWS, Azure, or GCP, and with infrastructure-as-code tools.
- Strong problem-solving skills, with experience debugging and tuning complex data pipelines.
- Excellent communication and collaboration skills.
Preferred Qualifications
- Experience building and operating large-scale real-time and batch data platforms.
- Familiarity with orchestration tools such as Airflow or Dagster.
- Experience with CI/CD practices for data engineering workflows.
- Familiarity with BI tools and analytic dashboard integrations.
- Relevant certifications (e.g., Databricks, Snowflake, Confluent) or contributions to open-source projects.
Unless explicitly requested or approached by SS&C Technologies, Inc., or any of its affiliated companies, the company will not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services.
SS&C Technologies is an Equal Employment Opportunity employer and does not discriminate against any applicant for employment or employee on the basis of race, color, religious creed, gender, age, marital status, sexual orientation, national origin, disability, veteran status, or any other classification protected by applicable discrimination laws.
Required Experience:
Senior IC