Remote Big Data Engineer (Snowflake)
Job Location: Mexico City - Mexico

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

Our client is a rapidly growing automation-led service provider specializing in IT, business process outsourcing (BPO), and consulting services. With a strong focus on digital transformation, cloud solutions, and AI-driven automation, they help businesses optimize operations and enhance customer experiences. Backed by a global workforce of over 32,000 employees, our client fosters a culture of innovation, collaboration, and continuous learning, making it an exciting environment for professionals looking to advance their careers.

Committed to excellence, our client serves 31 Fortune 500 companies across industries such as financial services, healthcare, and manufacturing. Their approach is driven by an "Automate Everything, Cloudify Everything, and Transform Customer Experiences" strategy, ensuring they stay ahead in an evolving digital landscape.

As a company that values growth and professional development, our client offers global career opportunities, a dynamic work environment, and exposure to high-impact projects. With 54 offices worldwide and a presence in 39 delivery centers across 28 countries, employees benefit from an international network of expertise and innovation. Their commitment to a "customer success, first and always" philosophy ensures a rewarding and forward-thinking workplace for driven professionals.

We are currently searching for a Remote Big Data Engineer (Snowflake):

Responsibilities:

  • Design, build, and optimize data pipelines for ETL/ELT processes in data warehousing and BI projects.
  • Develop and maintain complex stored procedures, DWH schemas, and SQL/PL-SQL scripts.
  • Implement PySpark-based solutions for large-scale data processing and transformation (see the sketch after this list).
  • Collaborate on Snowflake database architecture, performance tuning, and troubleshooting.
  • Integrate data workflows with AWS services (S3, Lambda) and orchestration tools (Jenkins, GitHub).
  • Manage JIRA workflows for task tracking and Agile project delivery.
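
As a rough illustration of the PySpark responsibility above, here is a minimal sketch of that kind of pipeline: it reads raw Parquet files from S3, de-duplicates and aggregates them, and loads the result into Snowflake through the Spark-Snowflake connector. Every bucket, table, account, and credential value is a hypothetical placeholder, not something specified by the role, and a real deployment would also need the connector JARs on the Spark classpath.

    # Minimal PySpark ETL sketch: extract from S3, transform, load to Snowflake.
    # All names and credentials below are illustrative placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-orders-etl").getOrCreate()

    # Extract: raw order events landed in S3 as Parquet (hypothetical path).
    orders = spark.read.parquet("s3a://example-raw-bucket/orders/2024-01-01/")

    # Transform: de-duplicate on order_id, then aggregate revenue per customer.
    daily_revenue = (
        orders.dropDuplicates(["order_id"])
              .groupBy("customer_id")
              .agg(F.sum("amount").alias("daily_revenue"),
                   F.count("order_id").alias("order_count"))
    )

    # Load: write the aggregate into Snowflake via the Spark-Snowflake connector.
    # Real jobs would pull these secrets from a vault, never from literals.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "etl_user",
        "sfPassword": "***",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "PUBLIC",
        "sfWarehouse": "ETL_WH",
    }
    (daily_revenue.write
        .format("net.snowflake.spark.snowflake")
        .options(**sf_options)
        .option("dbtable", "DAILY_CUSTOMER_REVENUE")
        .mode("overwrite")
        .save())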

Requirements:

  • 5 years of experience in data engineering, ETL, and data warehousing.
  • Expertise in Python/PySpark for big data processing.
  • Advanced SQL/PL-SQL skills (complex queries, stored procedures, performance tuning).
  • Hands-on experience with Snowflake, Oracle Database, and Unix shell scripting.
  • 3 years of experience with Snowflake.
  • Familiarity with AWS cloud services (S3, Lambda); see the Lambda sketch after this list.
  • Proficiency in CI/CD tools (GitHub, Jenkins).
  • Strong analytical skills and the ability to troubleshoot data pipeline issues.
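
To illustrate the S3/Lambda item above, here is a minimal sketch of a Python Lambda handler that reacts to S3 object-created events and hands each new file to a downstream intake queue. The queue URL, bucket layout, and event wiring are assumptions for illustration only, not details from the posting.

    # Hypothetical AWS Lambda handler: forward newly landed S3 objects to an
    # SQS queue that feeds the ETL pipeline. Queue URL is a placeholder.
    import json
    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/etl-intake"

    def handler(event, context):
        # Each record describes one object created in the raw bucket.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            # Hand the file location off to the pipeline for processing.
            sqs.send_message(
                QueueUrl=QUEUE_URL,
                MessageBody=json.dumps({"bucket": bucket, "key": key}),
            )
        return {"statusCode": 200, "processed": len(event["Records"])}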

Desired:

  • Experience with Kafka for real-time data streaming (see the streaming sketch after this list).
  • Knowledge of Netezza DB, Informatica, or Talend.
  • Basic understanding of data governance and workflow automation.
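
For the Kafka item above, a minimal Spark Structured Streaming sketch: it subscribes to a Kafka topic, parses the JSON payload, and streams the parsed rows to the console. The broker address, topic name, and event schema are placeholders, and a production job would write to a durable sink rather than the console.

    # Hypothetical streaming sketch: consume JSON order events from Kafka
    # with Spark Structured Streaming. Broker and topic are placeholders.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StringType, DoubleType

    spark = SparkSession.builder.appName("orders-stream").getOrCreate()

    # Assumed payload schema for the illustrative "orders" topic.
    schema = (StructType()
              .add("order_id", StringType())
              .add("amount", DoubleType()))

    stream = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "broker.example.com:9092")
              .option("subscribe", "orders")
              .load())

    # Kafka delivers bytes; decode the value column and parse the JSON payload.
    orders = (stream.selectExpr("CAST(value AS STRING) AS json")
              .select(F.from_json("json", schema).alias("o"))
              .select("o.*"))

    # Emit each micro-batch to the console (a real job would target Snowflake).
    query = orders.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()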


Languages

  • Advanced Oral English.
  • Native Spanish.

Note:

  • Fully remote


If you meet these qualifications and are pursuing new challenges, start your application on our website to join an award-winning employer. Explore all our job openings on the Sequoia Careers Page.

Keywords: Snowflake, PySpark, ETL, AWS S3/Lambda, Oracle PL/SQL, Unix Shell Scripting

Employment Type: Remote
