As a Data Engineer Intern, you will gain hands-on experience with modern cloud platforms, data engineering tools, and real-world projects that combine data integration, analytics, and visualization.
Your future tasks:
- Design and implement data structures and ETL processes
- Integrate multiple data sources
- Write queries for data analysis and exploration
- Implement data visualizations
- Create software test plans and project documentation
- Prepare and execute tests for delivered solutions
- Analyze project requirements
Technologies you will work with:
- Azure Cloud
- Databricks
- Python
- .NET
- SQL
- Terraform
- Azure DevOps
Qualifications:
- Knowledge of SQL
- Basic programming skills in Python
- Familiarity with Python libraries (PySpark, Pandas, NumPy)
- Understanding of relational databases (SQL)
- Basic knowledge of data engineering concepts: data warehouses, data lakes, ETL/ELT, Big Data
- Understanding of software development/versioning processes
- Knowledge of basic algorithms and data structures
- Teamwork and communication skills
- English level B1 (comfortable communication)
Nice to have:
- Experience with Big Data technologies
- Basic programming in PL/SQL or T-SQL
- Knowledge of reporting tools (Tableau, Power BI)
- Familiarity with non-relational databases
- Understanding of DevOps practices / CI/CD tools
- Knowledge of Agile methodology
Additional Information:
Hybrid work model: 2 days from the office, 3 days of home office.
Office locations: Katowice, Poznań, Warszawa, Lublin, Rzeszów
We require full-time availability (Monday to Friday, 8 hours per day).
Remote Work:
No
Employment Type:
Full-time