The salary range for this position (contract of employment): PLN in gross terms
A hybrid work model requires 1 day a week in the office
In the area of Delivery Experience, we are building technology that makes Allegro's deliveries easy, cost-effective, fast, and predictable. Our team takes care of critical services along the Allegro shopping journey: predicting delivery times using statistical algorithms and machine learning, selecting the best delivery methods tailored to customers, and integrating with carrier companies. Delivery Experience is also one of the fastest-growing areas, where we undertake new, complex projects to enhance logistics and warehousing processes.
We are looking for a Mid/Senior Data Engineer to focus on data processing and preparation, and on the deployment and maintenance of our data projects. Join our team to develop your skills in deploying data-based processes and DataOps approaches, and to share those skills within the team.
We are looking for people who:
Have at least 3 years of experience as a Data Engineer, working with large datasets
Have experience with cloud providers (GCP preferred)
Are highly proficient in SQL
Have a strong understanding of data modeling and cloud DWH architecture
Have experience in designing and maintaining ETL/ELT processes
Are capable of optimizing cost and efficiency of data processing
Are proficient in Python for working with large data sets (using PySpark or Airflow)
Use good practices (clean code, code review, CI/CD)
Have a high degree of autonomy and take responsibility for developed solutions
Have English proficiency on at least B2 level
Like to share knowledge with other team members
Nice to have:
Experience with Azure, cross-cloud data transfers, and multi-cloud architecture
What will your responsibilities be?
You will be actively responsible for developing and maintaining processes for handling large volumes of data
You will be streamlining and developing the data architecture that powers our analytical products, working alongside a team of experienced analysts
You will be monitoring and enhancing the quality and integrity of the data
You will manage and optimize costs related to our data infrastructure and data processing on GCP
What we offer
A hybrid work model that you will agree on with your leader and the team.
We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms)
A wide selection of fringe benefits in a cafeteria plan: you choose what you like (e.g. medical, sports, or lunch packages, insurance, purchase vouchers)
A 16" or 14" MacBook Pro with the M1 processor and 32GB RAM, or a corresponding Dell with Windows (if you don't like Macs), and other gadgets that you may need
Hackathons, team tourism, a training budget, and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work, and various technologies and subject-matter issues)
English classes that we pay for, related to the specific nature of your job
Why you would like to work with us:
Big Data is not an empty slogan for us but a reality: you will be working on really big datasets (petabytes of data)
You will have a real impact on the direction of product development and technology choices. We use the latest and best available technologies, selecting them according to our own needs
Our tech stack includes: GCP, BigQuery, (Py)Spark, and Airflow
We are a close-knit team where we work well together
You will have the opportunity to work within a team of experienced engineers and big data specialists who are eager to share their knowledge including publicly through allegro.tech
Apply to Allegro and see why it is #dobrzetuby (#goodtobehere)
Remote Work: No
Employment Type: Full-time