Responsibilities
- We have a fantastic opportunity for an experienced Data Engineer to join our global team in an Engineering Manager role. This role will play a major part in the delivery of our Group Data Strategy and Data Transformation Journey by delivering, enhancing and maintaining our IQEQ Data Platform, which will drive how our data is managed and used across a host of key areas to maximise business value and growth, delivering improvements for internal and external stakeholders and clients.
- This is a deeply technical, hands-on role requiring an experienced practitioner who will get directly involved in expanding and optimizing our data pipeline architecture. We need a proven data engineering professional with extensive, current experience in building and managing complex data infrastructure. The ideal candidate will have a track record of rolling up their sleeves to construct, troubleshoot and refine data pipelines through direct, practical work.
- You will actively support our software developers, database architects, data analysts and data scientists by directly implementing data initiatives, not just strategizing. Your day-to-day responsibilities demand practical technical expertise in data wrangling, with a proven ability to navigate and leverage significant data assets to build optimal, production-ready models.
- You will ensure that optimal data delivery architecture is applied consistently across ongoing projects, and you must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Tasks (what the role does on a day-to-day basis)
- Create and maintain optimal data pipeline architecture
- Assemble large complex data sets that meet functional / non-functional business requirements.
- Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using SQL and AWS/Azure big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs
- Implement data flows to connect operational systems, data for analytics, and business intelligence (BI) systems
- Document source-to-target mappings
- Re-engineer manual data flows to enable scaling and repeatable use
- Write ETL (extract, transform, load) scripts and code to ensure the ETL process performs optimally
- Develop business intelligence reports that can be reused
- Support the Lead Data Engineer by providing day-to-day supervision of the Data Science and Engineering team
- Provide in-country coordination for the team, ensuring the India team adheres to processes and policy at the local level
Key behaviours we expect to see
- In addition to demonstrating our Group Values (Authentic, Bold and Collaborative), the role holder will be expected to demonstrate the following:
- Communicates Effectively: adjusts communication style to fit the audience and message, provides timely information to help others across the organisation, and encourages the open expression of diverse ideas and opinions
- Action Oriented: readily takes action on challenges without unnecessary planning, and identifies new opportunities, taking ownership of them
- Interpersonal Savvy: relates comfortably with people across all levels, functions, cultures and geographies, and builds rapport in an open, friendly and accepting way
- An analytical mind, excellent problem-solving and diagnostic skills, and attention to detail
Qualifications:
Education / professional qualifications
- Bachelor's degree in computer science or another related field
- 8-10 years of experience in software engineering.
- Specific experience in Data Engineering and Analytics
- Background in the financial industry preferred.
Background & Technical experience
- Proficiency in Linux fundamentals and Bash scripting.
- Programming expertise in one or more languages, mainly Python, Go, Scala, C or Kotlin
- Expertise in Python libraries: Pandas, NumPy, PySpark, Dask.
- In-depth knowledge of algorithms and data structures
- Deep understanding of database systems, e.g. PostgreSQL/MySQL and Microsoft SQL Server
- Experience with at least one cloud platform, e.g. AWS, Azure or GCP
- Experience with one or more data lakes/data warehouses: Snowflake, Databricks, Redshift, etc.
- Experience in stream processing: Kafka, Kinesis, etc.
- Experienced in the implementation of Data warehousing solutions
- Experienced in the implementation of API solutions and tooling
- Other company, product and market knowledge
- Experience of working in a complex, multi-country professional services, financial services or BPO organisation with complex processing requirements
- Multi-country experience and a demonstrated ability to work in a multi-cultural, talented and demanding team environment.
- The skills and personality to operate effectively in a very fast-paced, complex global business, with in-depth knowledge of program management
- Excellent written and oral communication skills with staff members, customers, suppliers and the management team, and the ability to make decisions, act and get results
- Passion, dynamism and drive
- Personal presence, integrity and credibility
- Ability to solve problems either independently or by utilising other members of the team where necessary
- Experience of leading and mentoring teams.
- Strong analytical and troubleshooting skills.
- Ability to investigate and analyse information and to draw conclusions.
- Experience with, or exposure to, ISO 27001 and InfoSec compliance
Languages
Fully proficient in spoken and written English; additional European languages will be an asset
Remote Work:
No
Employment Type:
Full-time