We are looking for a skilled PySpark Developer with hands-on experience in Reltio MDM to join our data engineering team. The ideal candidate will design and implement scalable data processing solutions using PySpark and integrate them with Reltio's cloud-native MDM platform.

Key Responsibilities:
- Develop and maintain data pipelines using PySpark in distributed computing environments (e.g., AWS EMR, Databricks).
- Integrate and synchronize data between enterprise systems and the Reltio MDM platform.
- Design and implement data transformation, cleansing, and enrichment processes.
- Collaborate with data architects, business analysts, and Reltio solution architects to ensure high-quality data.
- Work on API-based integration between Reltio and upstream/downstream systems.
- Optimize PySpark jobs for performance and scalability.
- Ensure data quality, integrity, and governance throughout the data lifecycle.
- Troubleshoot and resolve data and performance issues in existing pipelines.

Required Skills & Qualifications:
- 7 years of experience in PySpark development and distributed data processing.
- Strong understanding of Apache Spark, DataFrames, and Spark SQL.
- Experience with Reltio MDM, including entity modeling, survivorship rules, and match & merge.
- Experience working with REST APIs and JSON data formats.
- Experience with cloud platforms such as AWS and its data services (e.g., S3, Lambda, Step Functions).
- Good knowledge of data warehousing concepts, ETL workflows, and data modeling.
- Familiarity with CI/CD practices and version control tools (e.g., Git).
- Strong problem-solving and communication skills.
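For candidates unfamiliar with the MDM terms above, the survivorship and match & merge concepts can be sketched in plain Python. This is only an illustrative sketch of the idea, not Reltio's actual API; all record fields and the "most recent wins" rule here are hypothetical:

```python
from datetime import date

# Hypothetical source records for one matched entity. In an MDM match & merge
# step, records judged to describe the same real-world entity are merged, and
# survivorship rules decide which attribute value "wins" in the golden record.
records = [
    {"source": "CRM", "updated": date(2023, 5, 1),
     "first_name": "Jon", "email": "jon@example.com", "phone": None},
    {"source": "ERP", "updated": date(2024, 2, 10),
     "first_name": "John", "email": None, "phone": "555-0100"},
]

def survive(records, attribute):
    """'Most recent wins' survivorship: return the attribute value from the
    newest record that actually has a non-null value for it."""
    for rec in sorted(records, key=lambda r: r["updated"], reverse=True):
        if rec[attribute] is not None:
            return rec[attribute]
    return None

# Build the merged "golden record" attribute by attribute.
golden = {attr: survive(records, attr)
          for attr in ("first_name", "email", "phone")}
print(golden)
# {'first_name': 'John', 'email': 'jon@example.com', 'phone': '555-0100'}
```

In Reltio itself these rules are configured declaratively per attribute; the sketch just shows the kind of logic a survivorship strategy encodes.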