Key Highlights / Must-Haves
- Hands-on experience with PySpark
- Experience with Snowflake Cortex AI
- Strong knowledge of modern Data Warehousing practices
- Recent, relevant experience highly advantageous
Role Summary
The client is seeking a Senior Python and Snowflake Data Engineer to design, build, and optimize scalable data pipelines, implement Snowflake data models, and deliver end-to-end data engineering solutions. The role involves collaborating with cross-functional teams to improve data quality and performance and to ensure secure delivery of data products in complex enterprise environments.
Note: The client has an urgent need and is seeking candidates who can start immediately.
Key Responsibilities
- Design, build, and deploy high-performance data pipelines and backend services
- Implement and optimize Snowflake data models and data marts
- Deliver end-to-end data engineering solutions, including ingestion, processing, and delivery
- Optimize the performance, scalability, and cost of data workflows
- Collaborate with cross-functional teams to support complex business requirements
- Support front-end applications where applicable (Preferred)
Required Skills & Expertise
- 5 years in data engineering or backend development
- Strong hands-on experience with Python, PySpark, and Snowflake
- Proven expertise in Snowflake data modeling, performance tuning, and optimization
- Experience with Snowflake Cortex AI or similar AI-enabled data platform capabilities
- Solid understanding of modern data warehousing concepts (dimensional modeling, ELT/ETL, optimization strategies)
- Advanced SQL skills for designing scalable data models
- Experience delivering cloud-based end-to-end data engineering solutions (AWS, Azure, GCP)
- Working knowledge of technologies for supporting data-driven front-end applications
- Strong understanding of distributed data processing frameworks
- Knowledge of data security, governance, and compliance best practices
- Proven ability to collaborate effectively in complex enterprise environments