Project Description:
The Data Driven Access team continuously improves the fixed access network in Germany. It uses data engineering, data analytics, and data science to detect problems in the network and to solve them before customers even notice that something is wrong with their landline. The first step is always to understand the data structure and how it maps to the infrastructure. In the second step, data engineering, including data quality checks, is carried out. We then develop rule-based data analytics and machine learning algorithms to detect faults in the network, in close partnership with network experts. The last step is building a pipeline and putting the solution into production. Development is done on a big data platform called One Data Lake (ODL) with KNIME and Cloudera Data Science Workbench. A migration from ODL to Google Cloud Platform (GCP) is currently in progress.
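For illustration only, here is a minimal sketch in PySpark of the kind of rule-based fault detection described above. The table and column names (odl.line_measurements, line_id, snr_db, measured_at) and the threshold are hypothetical placeholders, not the team's actual schema or rules:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fault-detection-sketch").getOrCreate()

# Load line measurements from the data lake (hypothetical table name).
df = spark.table("odl.line_measurements")

# Rule-based check: flag lines whose signal-to-noise ratio never rises above
# a threshold over a whole day; such lines are candidates for proactive repair.
faults = (
    df.groupBy("line_id", F.to_date("measured_at").alias("day"))
      .agg(F.max("snr_db").alias("max_snr_db"))
      .filter(F.col("max_snr_db") < 10.0)
)

# Persist the flagged lines so downstream steps (or network experts) can act on them.
faults.write.mode("overwrite").saveAsTable("odl.flagged_lines")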
Activity description and concrete tasks:
As a Data Engineer, you are part of a team responsible for several data analytics projects in the fixed access network of Telekom Deutschland.
You will be part of an innovative and agile work environment, where you will use Machine Learning Operations (MLOps) methods to ensure the best network quality for our customers.
Your key task will be migrating productive workflows from ODL to Google Cloud Platform (GCP). The workflows are built in KNIME or with a mixture of Python and SQL (Impala, Hive, Spark, PySpark) and need to be transformed to run with BigQuery or Vertex AI Workbench on GCP. You will put them into production with pipelines in Cloud Composer and use GitLab.
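As one possible target shape for such a migration, below is a minimal sketch of a Cloud Composer (Airflow) DAG that schedules a daily BigQuery job. The DAG id, project, table, and query are hypothetical placeholders standing in for a migrated workflow:

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="migrated_fault_detection",  # hypothetical name for a migrated workflow
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Run the (placeholder) fault-detection query on BigQuery instead of Impala/Hive.
    detect_faults = BigQueryInsertJobOperator(
        task_id="detect_faults",
        configuration={
            "query": {
                "query": "SELECT line_id FROM `my-project.network.flagged_lines`",
                "useLegacySql": False,
            }
        },
    )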
In addition, you will develop new use cases on GCP for a better customer experience.
Within our team, you will drive innovation by constantly growing and sharing your expertise and by giving constructive feedback throughout the whole development cycle.
Qualifications :
Tools & Platforms:
GitLab
Google Cloud Platform (GCP)
KNIME
Skill requirements:
You have a degree in computer science or a related field
You have multiple years of professional experience as a Data Engineer
You are proficient in SQL and Python, preferably with Impala, Hive, BigQuery, and PySpark
You have experience with low-code platforms, preferably KNIME
You have experience with Google Cloud Platform, preferably BigQuery, Vertex AI, and pipelines in Cloud Composer
You know common MLOps best practices such as data cleaning, clean code, modelling, operations, and Git, and you apply them in your everyday work
You are analytical and solution-oriented
You are driven by a willingness to grow and constantly push yourself and your team to try innovative ideas
You thrive in an agile work environment
You are interested in telecommunication technologies
You are proficient in English; German is a big plus
Additional Information :
What do we offer you?
International, positive, dynamic, and motivated work environment.
Hybrid work model (telework/face-to-face).
Flexible schedule.
Continuous training.
Flexible Compensation Plan.
Life and accident insurance.
More than 25 working days of vacation per year.
And many more advantages of being part of T-Systems!
If you are looking for a new challenge, do not hesitate to send us your CV! Please send your CV in English. Join our team!
T-Systems Iberia will only process the CVs of candidates who meet the requirements specified for each offer.
Remote Work :
No
Employment Type :
Full-time