SKF has been around for more than a century, and today we are one of the world's largest global suppliers of bearings and supporting solutions for rotating equipment. Our products can be found literally everywhere in society. This means that we are an important part of the everyday lives of people and companies around the world.
In September 2024, SKF announced the separation of its Automotive business with the objective of building two world-leading businesses. The role you are applying for will be part of the Automotive business, which means you will have the opportunity to help shape a new company aimed at meeting the needs of the transforming global automotive market.
Would you like to join us in shaping the future of motion? We are now looking for a
Data Engineer, India Automotive Business
Design, build, and maintain the data infrastructure and systems that support SKF VA's data needs. By leveraging skills in data modeling, data integration, data processing, data storage, data retrieval, and performance optimization, this role helps VA manage and utilize its data more effectively.
Key responsibilities (what you can expect in the role)
- Build a VA data warehouse that is scalable, secure, and compliant using Snowflake technologies. This includes designing and developing Snowflake data models.
- Work with central data warehouses such as SDW, MDW, and OIDW to extract data and enrich it with VA-specific details (customer grouping, program details, etc.).
- Data integration: Responsible for integrating data from ERPs, BPC, and other systems into Snowflake and SKF standard DWs, ensuring that data is accurate, complete, and consistent.
- Performance optimization: Responsible for optimizing the performance of Snowflake queries and data loading processes. This involves tuning SQL queries, defining clustering keys, and optimizing data loading processes.
- Security and access management: Responsible for managing the security and access controls of the Snowflake environment. This includes configuring user roles and permissions, managing encryption keys, and monitoring access logs.
- Maintain existing database and warehouse solutions, addressing support needs, enhancements, troubleshooting, etc.
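As an illustration only (not part of the role description), the enrichment step described above, pulling central-warehouse records and adding VA-specific attributes such as customer grouping, might be sketched in plain Python as follows. All field names and the grouping table are hypothetical examples, not actual SKF schemas:

```python
# Hypothetical sketch of the enrichment step: join rows extracted from a
# central warehouse with a VA-specific customer-grouping lookup.
# Field names ("customer_id", "net_sales") and group labels are made up.

def enrich_with_customer_grouping(rows, grouping):
    """Attach a VA-specific 'customer_group' to each extracted row.

    rows     : list of dicts extracted from a central warehouse
    grouping : dict mapping customer_id -> customer group label
    """
    enriched = []
    for row in rows:
        out = dict(row)  # copy so the source record stays unchanged
        out["customer_group"] = grouping.get(row["customer_id"], "UNMAPPED")
        enriched.append(out)
    return enriched

# Example usage with made-up data
rows = [
    {"customer_id": "C001", "net_sales": 120.0},
    {"customer_id": "C999", "net_sales": 45.5},
]
grouping = {"C001": "OEM"}
result = enrich_with_customer_grouping(rows, grouping)
```

In practice this kind of transformation would typically run inside Snowflake (SQL or an ELT tool) rather than in application code; the sketch only shows the shape of the lookup-and-enrich logic, including an explicit fallback label so unmapped customers surface as a data-quality signal instead of failing silently.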
Metrics
- Technical metrics: data quality across the whole VA BU, data processing time, data storage capacity, and systems availability
- Business metrics: data-driven decision making, data security and compliance, cross-functional collaboration
Competencies
- Data modeling: Should have a good understanding of data modeling concepts and be familiar with Snowflake's data modeling tools and techniques.
- SQL: Should be an expert in SQL, able to write complex SQL queries and optimize SQL performance in Snowflake.
- Pipeline management & ETL: Should be able to design and manage data pipelines on Snowflake and Azure using ETL/ELT tools (e.g., dbt, Alteryx, Talend, Informatica).
- Should have a good understanding of cloud computing concepts and be familiar with the cloud infrastructure on which Snowflake operates.
- Good understanding of data warehousing concepts and familiarity with Snowflake's data warehousing tools and techniques
- Familiar with data governance and security concepts
- Able to identify and troubleshoot issues with Snowflake and SKF's data infrastructure
- Experience with Agile solution development
- Good to have: knowledge of SKF ERP systems (XA, SAP, PIM, etc.) and related sales, supply chain, and manufacturing data.
Candidate Profile:
- Bachelor's degree in Computer Science, Information Technology, or a related field
SKF is committed to creating a diverse environment, and we firmly believe that a diverse workforce is essential for our continued success. Therefore, we focus only on your experience, skills, and potential. Come as you are, just be yourself. #weareSKF
Some additional information
This position will be located in Bangalore.
For questions regarding the recruitment process, please contact Anuradha Seereddy, Recruitment Specialist, by email.