At Sequoia Connect, we are a Talent-First Technology Ecosystem that redefines how elite professionals interact with the global digital landscape. We move beyond traditional models to act as a catalyst for the top 1% of global talent, connecting human potential with complex industrial execution. By joining our inner circle, you are not simply taking a position; you are aligning with a strategic partner dedicated to updating your Human OS and accelerating your growth through world-class, high-impact projects.
We are currently partnering with a global IT powerhouse that represents the connected world through innovative, customer-centric experiences. As a USD 6 billion organization and one of the top 7 IT service providers globally, our client empowers over 1,200 global customers, including several Fortune 500 companies, to Rise. With a network of 163,000 professionals across 90 countries, they are at the forefront of digital transformation, leveraging next-generation technologies such as 5G, AI, Blockchain, and Quantum Computing.
This is your chance to thrive in a workplace recognized as one of the most sustainable corporations in the world. You will join an environment that values innovation and societal impact, working on end-to-end digital transformation projects for global leaders. If you are a driven professional looking for global career opportunities and exposure to high-impact projects within an international network of expertise, this is where you belong.
We are currently searching for a Senior Solution Data Engineer:
Responsibilities:
- Fully manage the Databricks environment, including configuration and workspace administration.
- Implement data governance using Unity Catalog and manage structured storage with Delta Lake tables.
- Write and optimize data processing logic using Python, SQL, and PySpark.
- Manage data architecture and performance within Snowflake.
- Handle Adobe Data Feeds and Adobe Analytics, ensuring correct ingestion and transformation of marketing data.
- Oversee data warehousing operations in Amazon Redshift.
- Schedule and monitor complex data workflows using Airflow.
- Own end-to-end maintenance of data pipelines, ensuring reliability from ingestion to delivery across multiple cloud platforms.
- Manage the code lifecycle and collaborative development using Git.
Requirements:
- Advanced proficiency in the Databricks ecosystem (workspace administration, Delta Lake, Unity Catalog).
- Expertise in data warehousing and integration, specifically with Snowflake.
- Strong development skills in Python, SQL, and PySpark.
- Hands-on experience with Adobe data ingestion (Data Feeds/Analytics).
- Solid experience in version control using Git.
Desired:
- Experience with Amazon Redshift for data warehousing operations.
- Proficiency in orchestration tools, specifically Airflow.
- Familiarity with Google Cloud Platform (BigQuery) operations.
- Knowledge of AWS Data Services such as Athena and AWS Glue.
Languages:
- Advanced Oral English.
- Advanced Spanish.
Note:
If you meet these qualifications and are pursuing new challenges, start your application on our website to join an award-winning employer. Explore all our job openings on the Sequoia Careers Page.
Keywords: Data Engineer, Databricks, Unity Catalog, Delta Lake, PySpark, Snowflake, Adobe Analytics, Airflow, Python, SQL, AWS Glue, Athena, BigQuery, Redshift, Data Governance, ETL.