We are looking for a skilled PySpark Developer with hands-on experience in Reltio MDM to join our data engineering team. The ideal candidate will be responsible for designing and implementing scalable data processing solutions using PySpark and integrating with Reltio's cloud-native MDM platform.

Key Responsibilities:
- Develop and maintain data pipelines using PySpark in distributed computing environments (e.g., AWS EMR, Databricks).
- Integrate and synchronize data between enterprise systems and the Reltio MDM platform.
- Design and implement data transformation, cleansing, and enrichment processes.
- Collaborate with data architects, business analysts, and Reltio solution architects to ensure high-quality data.
- Work on API-based integration between Reltio and upstream/downstream systems.
- Optimize PySpark jobs for performance and scalability.
- Ensure data quality, integrity, and governance throughout the pipeline.
- Troubleshoot and resolve data and performance issues in existing pipelines.

Skills & Qualifications:
- 7 years of experience in PySpark development and distributed data processing.
- Strong understanding of Apache Spark, DataFrames, and Spark SQL.
- Experience with Reltio MDM, including entity modeling, survivorship rules, and match & merge rules.
- Proficiency in working with REST APIs and JSON data.
- Experience with cloud platforms like AWS and their data services (e.g., S3, Lambda, Step Functions).
- Good knowledge of data warehousing concepts, ETL workflows, and data modeling.
- Familiarity with CI/CD practices and version control tools like Git.
- Strong problem-solving and communication skills.
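For context on the transformation, cleansing, and enrichment responsibilities listed above, the following is a minimal PySpark sketch of the kind of job involved. The source path, column names, and schema are hypothetical placeholders, not taken from any actual pipeline.

# Minimal, illustrative PySpark cleansing/enrichment job (hypothetical data and paths).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_cleansing").getOrCreate()

# Read raw customer records (placeholder S3 location).
raw = spark.read.json("s3://example-bucket/raw/customers/")

# Basic cleansing: trim names, normalize email case, drop records without a key, deduplicate.
cleansed = (
    raw.withColumn("first_name", F.trim(F.col("first_name")))
       .withColumn("email", F.lower(F.trim(F.col("email"))))
       .filter(F.col("customer_id").isNotNull())
       .dropDuplicates(["customer_id"])
)

# Simple enrichment: derive a full_name column and stamp the load time.
enriched = (
    cleansed.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))
            .withColumn("load_ts", F.current_timestamp())
)

# Write curated output for downstream consumers (placeholder path).
enriched.write.mode("overwrite").parquet("s3://example-bucket/curated/customers/")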
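The API-based integration work mentioned above typically means exchanging JSON with Reltio's REST API. The sketch below shows a generic authenticated POST of a batch of records; the URL, tenant ID, token handling, and payload shape are placeholders, and the actual Reltio entity endpoints and OAuth flow are defined in Reltio's API documentation.

# Generic sketch of pushing JSON records to an MDM REST endpoint (placeholder URL and token).
import json
import requests

API_URL = "https://example.reltio.com/reltio/api/<tenant_id>/entities"  # placeholder endpoint
ACCESS_TOKEN = "<oauth_access_token>"  # obtained via the platform's OAuth flow

def push_entities(records: list[dict]) -> requests.Response:
    """POST a batch of entity records as JSON and raise on HTTP errors."""
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    response = requests.post(API_URL, headers=headers, data=json.dumps(records), timeout=60)
    response.raise_for_status()
    return response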