Project: the aim you'll have
The project focuses on transitioning from a legacy data warehouse to a modern cloud-based data lake architecture for a major telecommunications provider. A key component of the initiative is the modernization and adaptation of an existing campaign management system responsible for delivering targeted marketing communications (e.g. email and SMS) to customers. The work involves redesigning data flows, integrating campaign logic with the new data lake ecosystem, and ensuring scalable real-time access to customer segmentation and analytics.
Position: how you'll contribute
Your role will focus on building a pure backend environment (no frontend work) within a distributed microservices-based architecture:
- Designing, developing, testing, and deploying backend services in Python
- Building and maintaining microservices-based systems
- Processing and integrating data from queueing/streaming systems (Kafka)
- Designing and managing data flows using Apache NiFi
- Developing ETL pipelines with AWS Glue, Azure Data Factory, or similar tools
- Implementing scalable data processing workflows
- Improving and optimizing existing backend services
- Translating business requirements into robust technical solutions
- Collaborating with architects, DevOps engineers, and data engineering teams
- Supporting infrastructure and platform integration initiatives
Qualifications:
Expectations: the experience you need
- Minimum 5 years of commercial backend experience with Python
- Practical experience with microservices architecture
- Experience working with data streaming or queueing systems (Kafka or ActiveMQ Artemis preferred)
- Knowledge of data processing and ETL workflows
- Testing experience (PyTest, Cucumber/Behave)
- Hands-on experience with containers
- Kubernetes and Helm familiarity
- CI/CD pipelines (e.g. GitLab)
Additional skills: the edge you have
- Experience with Apache NiFi
- Familiarity with cloud ETL platforms (AWS Glue, Azure Data Factory, or similar)
- Understanding of distributed data processing patterns
Additional Information:
Our offer: professional development and personal growth
- Flexible employment and remote work
- International projects with leading global clients
- International business trips
- Non-corporate atmosphere
- Language classes
- Internal & external training
- Private healthcare and insurance
- Multisport card
- Well-being initiatives
Remote Work: Yes
Employment Type: Full-time