Our client is a rapidly growing, automation-led service provider specializing in IT business process outsourcing (BPO) and consulting services. With a strong focus on digital transformation, cloud solutions, and AI-driven automation, they help businesses optimize operations and enhance customer experiences. Backed by a global workforce of over 32,000 employees, our client fosters a culture of innovation, collaboration, and continuous learning, making it an exciting environment for professionals looking to advance their careers.
Committed to excellence, our client serves 31 Fortune 500 companies across industries such as financial services, healthcare, and manufacturing. Their approach is driven by an "Automate Everything, Cloudify Everything, and Transform Customer Experiences" strategy, ensuring they stay ahead in an evolving digital landscape.
As a company that values growth and professional development, our client offers global career opportunities, a dynamic work environment, and exposure to high-impact projects. With 54 offices worldwide and 39 delivery centers across 28 countries, employees benefit from an international network of expertise and innovation. Their commitment to a "customer success, first and always" philosophy ensures a rewarding and forward-thinking workplace for driven professionals.
We are currently searching for a Big Data Lead-Snowflake:
Responsibilities:
- Lead and execute end-to-end Data Warehouse, ETL, and BI projects using Snowflake and AWS.
- Implement complex stored procedures and apply standard DWH and ETL concepts.
- Design and manage Snowflake architecture, ensuring high performance and scalability.
- Develop and maintain data pipelines using Python and PySpark.
- Perform performance tuning and troubleshooting for Oracle databases and complex PL/SQL environments.
- Automate processes using Unix shell scripting and manage code via GitHub and Jenkins.
Requirements:
- 7 years of experience in Data Warehouse, ETL, and BI projects.
- 5 years of hands-on experience with Snowflake, with a strong command of its architecture.
- 3 years of experience and strong proficiency in Python or PySpark.
- Proficiency in Oracle database management and PL/SQL.
- Experience with Unix shell scripting and CI/CD tools (GitHub/Jenkins).
- Excellent communication and analytical skills.
Desired:
- Snowflake Certification (Core or Pro).
- Experience with AWS services and creating DevOps templates for cloud infrastructure.
Languages:
- Advanced oral English.
- Native Spanish.
Note:
If you meet these qualifications and are pursuing new challenges, start your application on our website to join an award-winning employer. Explore all our job openings on the Sequoia Careers Page:
Keywords: Snowflake, Big Data Lead, ETL, Data Warehouse, Python, PySpark, Oracle, PL/SQL, Unix Shell Scripting, AWS, DevOps, Jenkins, Performance Tuning.