Datasphere ETL Engineer
Job Description
Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.
Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.
How you will contribute
You will:
- Execute the business analytics agenda in conjunction with analytics team leaders
- Work with best-in-class external partners who leverage analytics tools and processes
- Use models/algorithms to uncover signals/patterns and trends to drive long-term business performance
- Execute the business analytics agenda using a methodical approach that conveys to stakeholders what business analytics will deliver
What you will bring
A desire to drive your future and accelerate your career, and the following experience and knowledge:
- Using data analysis to make recommendations to analytic leaders
- Understanding of best-in-class analytics practices
- Knowledge of Key Performance Indicators (KPIs) and scorecards
- Knowledge of BI tools like Tableau, Excel, Alteryx, R, Python, etc. is a plus
More about this role -
As a Data COE Datasphere ETL Engineer, you will have the opportunity to design and build scalable, secure and cost-effective cloud-based data solutions, with a significant focus on the SAP ecosystem and GCP. You will develop and maintain data pipelines to extract, transform and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. The primary focus of this role is to architect and implement robust pipelines using SAP Datasphere to ensure high-quality data availability for advanced analytics initiatives. You will collaborate closely with data teams, product owners and other stakeholders, and stay updated with the latest cloud technologies and best practices.
What extra ingredients you will bring:
Design and Build: Develop scalable, secure and cost-effective cloud-based data solutions.
Design and build scalable data models and data flows within SAP Datasphere, leveraging its Federation/Replication capabilities to bridge SAP S/4HANA (or ECC) with GCP.
Manage Data Pipelines/Flows: Maintain, monitor and optimize data pipelines/flows to extract data (from CDS views or direct database tables) and load it into data warehouses or data lakes.
Identify and resolve performance bottlenecks in data extraction and transformation, specifically optimizing for high-volume delta loads.
Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity.
Collaborate and Innovate: Work closely with data architects, data modelers and product owners, and stay updated with the latest cloud technologies and best practices.
Leadership: Lead a team of data engineers (external/internal) to ensure timely deliverables, fulfilling all data engineering and integration guidelines and standards.
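The pipeline duties above revolve around watermark-driven delta loads. Purely as an illustrative sketch — the `sales` table, the `changed_at` column, and the in-memory SQLite stand-ins for the SAP source and the GCP warehouse are assumptions for this example, not part of the role's actual stack — an incremental extract-and-upsert step might look like:

```python
import sqlite3

def delta_load(source: sqlite3.Connection,
               warehouse: sqlite3.Connection,
               last_watermark: str) -> str:
    """Extract rows changed since last_watermark, upsert them into the
    warehouse, and return the new watermark (illustrative sketch)."""
    # 1. Extract only rows changed after the last successful load
    rows = source.execute(
        "SELECT id, amount, changed_at FROM sales WHERE changed_at > ?",
        (last_watermark,),
    ).fetchall()
    # 2. Upsert on the primary key (INSERT OR REPLACE emulates a MERGE)
    warehouse.executemany(
        "INSERT OR REPLACE INTO sales_dw (id, amount, changed_at) "
        "VALUES (?, ?, ?)",
        rows,
    )
    warehouse.commit()
    # 3. Advance the watermark to the latest change timestamp seen
    return max((r[2] for r in rows), default=last_watermark)
```

In a real Datasphere-to-BigQuery pipeline the upsert would typically be a MERGE statement and the watermark would be persisted between runs, but the shape of the logic — extract past the watermark, upsert, advance the watermark — stays the same.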
Job-specific requirements:
Familiarity with the SAP Integration Suite or enterprise-grade ETL tools (e.g. Informatica, Talend, SSIS, Fivetran or similar) for SAP-to-GCP connectivity.
Solid understanding of data warehousing concepts, ETL/ELT principles and data modeling techniques.
Programming: Python
Database: SAP SQL, PL/SQL, PostgreSQL, BigQuery. Strong experience with SAP BW/4HANA, SAP Analytics Cloud or SAP Data Intelligence, with a willingness to learn Datasphere.
ETL & Integration: AecorSoft/Datasphere ETL tools, GCP, Databricks (optional). Knowledge of SAP HANA modeling and SAP Landscape Transformation (SLT) is highly preferred.
Data Warehousing: SCD types, facts and dimensions, star schema
Data Modeling: Erwin (Optional)
Medallion Architecture: Bronze, Silver, Gold, Platinum
GCP Cloud Services: BigQuery, GCS, Cloud Functions/Run, GKE
Supporting Technologies: Airflow (Composer), Automic
Visualization (Optional): Power BI, Tableau
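Among the data-warehousing concepts listed above, slowly changing dimension (SCD) handling is one of the most commonly tested in practice. As a hedged sketch only — the `dim_customer` table and its columns are invented for illustration and do not come from the posting — an SCD Type 2 change closes the current dimension row and opens a new versioned one:

```python
import sqlite3

def scd2_upsert(conn: sqlite3.Connection, customer_id: int,
                city: str, load_date: str) -> None:
    """Apply an SCD Type 2 change to dim_customer: if the tracked
    attribute changed, expire the current row and insert a new version."""
    current = conn.execute(
        "SELECT city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    ).fetchone()
    if current is not None and current[0] == city:
        return  # no change: keep the existing current row
    if current is not None:
        # Expire the existing current row as of the load date
        conn.execute(
            "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
            "WHERE customer_id = ? AND is_current = 1",
            (load_date, customer_id),
        )
    # Open the new current version with an open-ended validity window
    conn.execute(
        "INSERT INTO dim_customer "
        "(customer_id, city, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date),
    )
    conn.commit()
```

The same pattern carries over to BigQuery or Datasphere, where it is usually expressed as a single MERGE statement rather than separate UPDATE and INSERT steps.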
Soft Skills:
Problem-Solving: The ability to identify and solve complex data-related challenges.
Communication: Effective communication skills to collaborate with Product Owners, analysts and stakeholders.
Analytical Thinking: The capacity to analyze data and draw meaningful insights.
Attention to Detail: Meticulousness in data preparation and pipeline development.
Adaptability: The ability to stay updated with emerging technologies and trends in data engineering and cloud computing fields.
No relocation support available.
Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about.
We have a rich portfolio of strong brands globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; and Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy, and the second-top position in gum.
Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.
Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status or any other characteristic protected by law.
Job Type
Regular
Analytics & Modelling
Analytics & Data Science
Required Experience:
IC
About Company
Mondelēz International, Inc. empowers people to snack right in over 150 countries around the world. We're leading the future of snacking with iconic brands such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; and Sour Patch Kids candy and Trident gum.