Project Description
The team Data Driven Access continuously improves the fixed access network in Germany. It uses data engineering, data analytics, and data science to detect problems in the network and to solve them before customers even notice that there is a problem with their landline. The first step is always to understand the data structure and how the structures are connected. In the second step, data engineering, including data quality checks, is done. We then develop rule-based data analytics and machine learning algorithms to detect faults in the network, in close partnership with network experts. The last step is building a pipeline and putting the solution into production. Development is done on a big data platform called One Data Lake (ODL) with KNIME and Cloudera Data Science Workbench. A migration from ODL to Google Cloud Platform (GCP) is currently in progress.
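As an illustration of the rule-based checks mentioned above, here is a minimal PySpark sketch of a data quality rule; the input path, column name, and 1% threshold are illustrative assumptions, not details of the actual pipeline.

```python
# Minimal sketch of a rule-based data quality check (illustrative only).
# The input path, column name, and threshold are assumptions, not real assets.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_check").getOrCreate()

# Placeholder dataset of network line events.
events = spark.read.parquet("/data/line_events")

total = events.count()
missing = events.filter(F.col("line_id").isNull()).count()

# Rule: reject the batch if more than 1% of rows lack a line identifier.
if total > 0 and missing / total > 0.01:
    raise ValueError(f"DQ check failed: {missing}/{total} rows missing line_id")
```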
Activity description and concrete tasks
As a Data Engineer, you will be part of a team that is responsible for several data analytics projects in the fixed access network of Telekom Deutschland.
You will be part of an innovative and agile work environment where you will use Machine Learning Operations (MLOps) methods to ensure the best network quality for our customers.
Your key task will be migrating productive workflows from ODL to Google Cloud Platform (GCP). The workflows are built in KNIME or with a mixture of Python and SQL (Impala, Hive, Spark, PySpark) and need to be transformed to run with BigQuery or Vertex AI Workbench on GCP. You will put them into production with pipelines in Cloud Composer and use GitLab.
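For a rough idea of what such a migrated workflow can look like in production, here is a minimal Cloud Composer DAG (assuming Airflow 2.4+) that runs an aggregation as a BigQuery job; the DAG id, project, dataset, table, and query are hypothetical placeholders, not assets from the actual project.

```python
# Hypothetical Cloud Composer (Airflow 2.4+) DAG for a migrated workflow.
# DAG id, project, dataset, table, and query are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="odl_workflow_migrated",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Replaces a former Impala/Hive aggregation step with a BigQuery job.
    aggregate_faults = BigQueryInsertJobOperator(
        task_id="aggregate_faults",
        configuration={
            "query": {
                "query": """
                    SELECT line_id, COUNT(*) AS fault_count
                    FROM `my-project.network.line_events`
                    WHERE event_type = 'fault'
                    GROUP BY line_id
                """,
                "useLegacySql": False,
            }
        },
    )
```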
In addition, you will develop new use cases on GCP for a better customer experience.
Within our team, you will drive innovation by constantly growing and sharing your expertise and by giving constructive feedback throughout the development cycle.
Qualifications:
Tools & Platforms
GitLab
Google Cloud Platform (GCP)
KNIME
Skill requirements
You have a degree in computer science or a related field
You have multiple years of professional experience as a Data Engineer
You are proficient in SQL and Python, preferably with Impala, Hive, BigQuery, and PySpark
You have experience with low-code platforms, preferably KNIME
You have experience with Google Cloud Platform, preferably BigQuery, Vertex AI Pipelines, and Cloud Composer
You know common MLOps best practices such as data cleaning, clean code, modelling, operations, and Git, and apply them in your everyday work
You are analytical and solution-oriented
You are driven by a willingness to grow and constantly push yourself and your team to try innovative ideas
You thrive in an agile work environment
You are interested in telecommunication technologies
You are proficient in English; German skills are a big plus
Additional information:
What do we offer you?
- International, positive, dynamic, and motivated work environment.
- Hybrid work model (telecommuting/on-site).
- Flexible schedule.
- Continuous training: certification preparation, access to Coursera, weekly English and German classes...
- Flexible compensation plan: medical insurance, restaurant tickets, day care, transportation allowances...
- Life and accident insurance.
- More than 26 working days of vacation per year.
- Social fund.
- Free access to specialists (doctors, physiotherapists, nutritionists, psychologists, lawyers...).
- 100% of salary in case of medical leave.
And many more advantages of being part of T-Systems!
If you are looking for a new challenge, do not hesitate to send us your CV! Please send your CV in English. Join our team!
T-Systems Iberia will only process the CVs of candidates who meet the requirements specified for each offer.
Remote Work:
No
Employment Type:
Full-time