About GlobalFoundries
GlobalFoundries is a leading full-service semiconductor foundry providing a unique combination of design, development, and fabrication services to some of the world's most inspired technology companies. With a global manufacturing footprint spanning three continents, GlobalFoundries makes possible the technologies and systems that transform industries and give customers the power to shape their markets. For more information, visit.
Introduction: The Data Engineering and Analytics (DEA) Group is responsible for integrating manufacturing and enterprise data from a wide variety of source systems used in the semiconductor engineering and production process. The data solutions are used in GlobalFoundries fabs in Dresden, in the US, and in Singapore, as well as in other worldwide locations. The DEA team is responsible for designing and building data products end-to-end that meet business team requirements. It does so by improving the data architecture and by building and delivering timely, high-quality solutions that address the analytical needs of engineers in a leading-edge semiconductor foundry.
Your Job:
Understand the business case and translate it into a holistic solution involving AWS cloud services, PySpark, EMR, Python, data ingestion, and cloud databases (Redshift/Postgres); see the ingestion sketch after this list for an illustration of this stack
PL/SQL development for high-volume data sets; logical and physical schema design
Proven experience with large, complex database or data lake projects in environments producing high-volume data
Demonstrated problem solving skills; familiarity with various root cause analysis methods; experience in documenting identified problems and driving resolutions.
Communication with stakeholders and business partners across all worldwide locations.
Understanding and aligning business partners across locations, performing requirements engineering and business case mapping.
Leading projects: Working on complex cross-functional projects, acting as a subject matter expert as well as in a project manager role
Designing data products: Creating trusted, reusable data products built on data collection, transformation, and curation.
Developing documentation: Creating functional and technical documentation that supports best practices
Designing data pipelines: Establishing design artifacts based on user requirements and engaging with ETL build teams (ETL framework design, data modeling, source-to-target mapping, architecture diagrams)
Interacting with data governance teams, aligning on data models, lineage, and relationships.
Data analysis for troubleshooting (e.g., data issues, performance issues) as well as for BI & analytics interaction, potentially on large data sets
Interfacing with the architecture team, cloud engineering teams, and vendors on designing state-of-the-art solutions.
Making recommendations regarding enhancements and/or improvements; providing consulting on operational aspects
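As an illustration of the stack named in the list above (PySpark on EMR ingesting into a cloud database such as Redshift or Postgres), a minimal sketch of such a pipeline might look like the following. All bucket, schema, table, and connection names are hypothetical and are not part of this posting.

# Minimal ingestion sketch, assuming hypothetical S3 paths, table names, and
# connection settings; a production EMR job would add partitioning, schema
# enforcement, and secrets management.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fab-measurement-ingest").getOrCreate()

# Read raw measurement files landed in S3 (path and schema are illustrative).
raw = spark.read.parquet("s3://example-fab-landing/measurements/")

# Light curation: normalize column names, parse timestamps, drop invalid rows.
curated = (
    raw.withColumnRenamed("LOT_ID", "lot_id")
       .withColumn("measured_at", F.to_timestamp("measured_at"))
       .filter(F.col("value").isNotNull())
)

# Write to a cloud database via JDBC (Redshift or Postgres); the appropriate
# JDBC driver jar must be available on the cluster, and credentials would
# normally come from a secrets manager rather than literals.
(curated.write
    .format("jdbc")
    .option("url", "jdbc:redshift://example-cluster:5439/analytics")
    .option("dbtable", "staging.fab_measurements")
    .option("user", "etl_user")
    .option("password", "***")
    .mode("append")
    .save())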
Other Responsibilities:
Customer/stakeholder focus: ability to build strong relationships with application teams, cross-functional IT, and global/local IT teams, as well as interfacing with vendors (e.g., AWS)
Advising junior engineers & build team: Providing guidance to junior engineers, the ETL build team, and BI/analytics teams
Required Qualifications:
Bachelor's or master's degree, preferably in information technology or electrical engineering; since the job focuses on business interaction, backgrounds in other fields are also welcome.
Proven experience (10 years minimum) with data engineering, analytics, design, and optimization.
Very good knowledge of data architecture approaches and trends, and a strong interest in applying and further developing that knowledge, including an understanding of OLAP/OLTP, ML, genAI, modelling, and statistics.
Problem-solving experience & analytics skills, e.g., with the L6S curriculum (green/black belt)
Good experience with AWS services, big data (PySpark, EMR, Python), and cloud databases (Redshift)
Proven experience with large, complex data projects in environments producing high-volume data
Proficiency in SQL and PL/SQL
Excellent conceptual abilities paired with very good technical documentation skills, e.g., the ability to understand and document complex data flows as part of business/production processes
Familiarity with SDLC concepts and processes
Ability to translate business requirements into technical specifications; this involves critical thinking, problem-solving, and the capacity to work with large datasets (see the illustrative query after this list).
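To make the SQL and large-dataset expectations above concrete, the following is an illustrative (not prescriptive) troubleshooting query run through Spark SQL. The table and column names are assumptions carried over from the ingestion sketch earlier in this posting.

# Illustrative troubleshooting query over high-volume measurement data,
# assuming the hypothetical table from the ingestion sketch is reachable
# through the Spark catalog (e.g., an external catalog or a registered view).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("measurement-troubleshooting").getOrCreate()

# Flag lots from the last 7 days with suspiciously few readings or with
# variance that is unusually high relative to their mean.
outlier_lots = spark.sql("""
    SELECT lot_id,
           COUNT(*)      AS n_readings,
           AVG(value)    AS mean_value,
           STDDEV(value) AS std_value
    FROM   staging.fab_measurements
    WHERE  measured_at >= DATE_SUB(CURRENT_DATE, 7)
    GROUP  BY lot_id
    HAVING COUNT(*) < 10 OR STDDEV(value) > 3 * AVG(value)
""")

outlier_lots.show(20, truncate=False)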
Additional Skills:
Experience using and developing on AWS services; AWS certification (e.g., AWS Solutions Architect)
Proficiency in programming languages such as Python, PySpark, SQL, Java, or C is highly recommended.
Understanding of machine learning and AI technologies is becoming increasingly important for data architects
Experience in semiconductor industry
Knowledge of semi-structured datasets
Experience with analytics & reporting solutions and business intelligence tools
Experience in collecting, structuring, and summarizing requirements in a data warehouse environment
Knowledge of statistical data analysis and data mining
Experience in test management, test case definition, and test processes
GlobalFoundries is an equal opportunity employer cultivating a diverse and inclusive workforce. We believe having a multicultural workplace enhances productivity, efficiency, and innovation, whilst our employees feel truly respected, valued, and heard. As an affirmative employer, all qualified applicants are considered for employment regardless of age, ethnicity, marital status, citizenship, race, religion, political affiliation, gender, sexual orientation, and medical and/or physical abilities. All offers of employment with GlobalFoundries are conditioned upon the successful completion of background checks and medical screenings as applicable, and are subject to the respective local laws and regulations.
Information about our benefits can be found here: https://gf/aboutus/careers/opportunitiesasia
Required Experience:
Senior IC
Full-Time