- Design, build, and optimize high-volume, high-performance ELT pipelines for centralized data warehousing
- Collaborate with Product Managers, ML Engineers, Data Scientists, and DevOps to define and enforce a reliable, scalable, and secure data platform architecture
- Ensure adherence to data warehousing standards, data quality best practices, and metadata management processes
- Take ownership of key data warehouse components and drive improvements in performance and reliability
- Conduct architecture reviews and participate in system design discussions to provide technical leadership
- Mentor and guide junior engineers, fostering a culture of quality and continuous learning
Qualifications:
- 5 years of professional experience in data engineering, building scalable data systems and pipelines
- Expert proficiency in SQL
- Strong programming skills in Python
- Proven experience in designing, building, and managing large-scale data lakes and warehouses
- Solid computer science fundamentals (data structures, algorithms, distributed systems)
- Deep understanding of distributed system architecture, with a focus on data availability, reliability, and performance
- Upper-Intermediate or higher level of English
WOULD BE A PLUS
- Experience with cloud platforms such as AWS, GCP, or Azure
- Knowledge of Spark, Kafka, Airflow, or dbt
- Familiarity with BI tools like Tableau or Power BI
- Understanding of CI/CD practices for data pipelines
- Exposure to machine learning model deployment and monitoring
Additional Information:
PERSONAL PROFILE
- Analytical thinker with strong problem-solving skills
- Detail-oriented and committed to delivering high-quality results
- Collaborative team player with excellent communication skills
- Adaptable to changing priorities and requirements
- Proactive and self-motivated with a leadership mindset
Remote Work:
Yes
Employment Type:
Full-time