Our client is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models, accelerate innovation, and maximize growth by harnessing digital technologies. The company has 90,000 employees across the globe.
We are looking for a Data Engineer to design and optimize data solutions that power decision-making across global business units. You'll work hands-on with technologies like PySpark, Hadoop, and Hive SQL to process large-scale datasets, while also playing a key role in ensuring system performance, data quality, and production stability. This role blends technical implementation with cross-regional collaboration, operational support, and continuous improvement of data infrastructure.
Key takeaways:
Stack: Apache Spark, Hadoop Ecosystem, Python, Spark SQL
Salary: Contract of Employment (UoP): PLN gross/month
Working model: Hybrid (3 days weekly in the office)
Location: Warsaw
Recruitment process:
- Call with MOTIFE Recruiter
- Technical Interview
- Interview with the Client
Responsibilities:
- Implement and configure PySpark, Hadoop, and Hive SQL solutions in production environments, working with large-scale datasets.
- Engage with stakeholders across the EMEA, NAM, and APAC regions to address incidents, coordinate fixes, and ensure the timely resolution of production issues.
- Collaborate with BAU teams and the global production assurance team to maintain system stability, performance, and adherence to SLAs.
- Provide technical guidance and support to offshore teams, particularly in PySpark and Hadoop environments, including troubleshooting and issue resolution.
- Utilize Autosys for job scheduling, monitoring, and automation of workflows.
- Work closely with regional EMEA tech teams to ensure compliance with data protection regulations and best practices in data handling.
Requirements:
- Professional experience in Big Data technologies: PySpark, Hive, Hadoop, and PL/SQL.
- Good knowledge of AWS and Snowflake.
- Good understanding of CI/CD and system design.
- Databricks Certified Developer for Apache Spark 3.0 certification (mandatory for this position).
- Excellent written and oral communication skills in English.
- Ability to understand and work on various internal systems.
- Ability to work with multiple stakeholders.
- Experience with fund transfer technologies and AML knowledge will be an added advantage.
- Bachelor's or Master's degree in computer science, engineering, or a related field.
- Nice to have: experience with Starburst Presto.
Join the team and make a real difference. Apply now to take the next step in your career!