Required Travel: Minimal
Managerial: No
Location: Greece - Athens (Amdocs Site)
Who are we?
Amdocs helps the world's leading communications and media companies deliver exceptional customer experiences through reliable, efficient and secure operations at scale. We provide software products and services that embed intelligence into how work runs across business, IT and network domains, delivering measurable outcomes in customer experience, network performance, cloud modernization and revenue growth. With our talented people and more than forty years of experience running mission-critical systems around the globe, Amdocs runs billions of transactions daily. Our technology is relied on every day, connecting people worldwide and advancing a more inclusive, connected world. Together we help those who shape the future to make it amazing. Amdocs is listed on the NASDAQ Global Select Market (NASDAQ: DOX) and reported revenue of $4.53 billion in fiscal 2025. For more information, visit our website.
At Amdocs, our mission is to empower our employees to Live Amazing, Do Amazing every day. We believe in creating a workplace where you not only excel professionally but also thrive personally. Through our culture of making a real impact, fostering growth, embracing flexibility and building connections, we enable you to live a meaningful life while making a difference in the world.
In one sentence
Responsible for designing, developing, debugging, deploying and maintaining scalable Python-based ETL pipelines, while ensuring data integrity, fostering cross-functional collaboration and maintaining compliance through comprehensive documentation.
What will your job look like?
You will design, develop and implement robust ETL pipelines using Python to efficiently extract, transform and load data from diverse sources, supporting both legacy and modern systems.
You will maintain and enhance existing ETL workflows, ensuring performance, scalability and reliability across various environments.
You will work with complex XML data formats, including vendor-specific schemas (e.g. Nokia, Ericsson, Huawei), to parse, transform and integrate configuration and performance data.
You will troubleshoot and optimize data pipelines, identifying bottlenecks and implementing improvements to enhance speed and resilience.
You will collaborate with cross-functional teams, including solution architects, technical consultants and business stakeholders, to align data solutions with operational and business requirements.
You will ensure data integrity and consistency by applying validation rules, monitoring mechanisms and best practices throughout the pipeline lifecycle.
You will document pipeline architecture, data flow logic and XML schema mappings to support maintainability, knowledge sharing and compliance.
You will be encouraged to actively pursue innovation, continuous improvement and efficiency in all assigned tasks.
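To give a flavor of the XML work described above, here is a minimal, hypothetical sketch using the standard-library ElementTree API (which lxml also implements). The element names and sample data are invented for illustration; real vendor performance-management schemas (Nokia, Ericsson, Huawei) are considerably more complex and vendor-specific.

```python
# Hypothetical sketch: flattening a simplified vendor PM (performance
# management) XML file into rows. Element and counter names are invented;
# actual vendor schemas differ per vendor and release.
import xml.etree.ElementTree as ET

SAMPLE = """\
<measData>
  <cell id="athens-001">
    <counter name="rrc_attempts">1204</counter>
    <counter name="rrc_failures">7</counter>
  </cell>
</measData>
"""

def parse_counters(xml_text):
    """Yield (cell_id, counter_name, value) tuples from the sample schema."""
    root = ET.fromstring(xml_text)
    for cell in root.iter("cell"):
        cell_id = cell.get("id")
        for counter in cell.iter("counter"):
            yield (cell_id, counter.get("name"), int(counter.text))

rows = list(parse_counters(SAMPLE))
```

In practice, rows like these would be loaded into pandas or a staging table for downstream transformation.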
All you need is...
Qualifications:
A Bachelor's degree in Engineering, Computer Science, Information Technology or a related field (or equivalent practical experience).
4 years of hands-on experience in data engineering, with a strong focus on Python-based development.
Proficiency in Python, including experience with libraries such as pandas, lxml and other data-manipulation tools.
Proven experience in designing, building and maintaining ETL pipelines, with a solid understanding of data ingestion techniques and best practices.
Deep familiarity with XML data formats, including the ability to parse and transform complex vendor-specific schemas.
Proficiency in SQL, including:
- Writing complex queries
- Performing joins and aggregations
- Data quality validation, including completeness, accuracy and reconciliation
High attention to detail, especially when working with complex data structures and transformation logic.
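The data-quality checks listed above (completeness and source-vs-target reconciliation) can be sketched with a small, self-contained example using an in-memory SQLite database; table and column names are invented for illustration.

```python
# Hypothetical sketch of two common ETL data-quality checks.
# Schema and data are invented; real checks run against the actual
# source and target stores.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_events (id INTEGER, amount REAL);
    CREATE TABLE target_events (id INTEGER, amount REAL);
    INSERT INTO source_events VALUES (1, 10.0), (2, 20.0), (3, NULL);
    INSERT INTO target_events VALUES (1, 10.0), (2, 20.0);
""")

# Completeness: count source rows whose required column is NULL.
incomplete = conn.execute(
    "SELECT COUNT(*) FROM source_events WHERE amount IS NULL"
).fetchone()[0]

# Reconciliation: count source rows missing from the target after the load.
missing = conn.execute("""
    SELECT COUNT(*) FROM source_events s
    LEFT JOIN target_events t ON s.id = t.id
    WHERE t.id IS NULL
""").fetchone()[0]
```

Here the incomplete and missing counts would feed a monitoring or alerting step rather than being inspected by hand.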
Preferred skills:
Experience using Apache Hop for ETL pipeline development, workflow orchestration and metadata-driven transformation logic.
Familiarity with cloud platforms (e.g. AWS, Azure, GCP) and cloud-native data services.
Exposure to telecommunications systems or experience working with vendor data formats (e.g. Ericsson, Nokia, Huawei).
Knowledge of CI/CD practices, version control (Git) and agile development methodologies.
Why you will love this job:
You will have the opportunity to make a difference by shaping the design and architecture of data pipelines, contributing innovative ideas and enhancements that influence how data is processed, integrated and delivered across systems.
You will have the opportunity to work with the industry's most advanced technologies.
You will have the opportunity to work in a growing organization, with ever-expanding opportunities for personal growth.
Amdocs is an equal opportunity employer. We welcome applicants from all backgrounds and are committed to fostering a diverse and inclusive workforce.
Required Experience:
IC (Individual Contributor)