Who we are
Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.
Job Description:
Job Title : Senior Snowflake Data Engineer
Job Type : W2/C2C
Experience : 8-15 Years
Location : Dallas, Texas (On-Site)
Requirements:
- 10+ years of data engineering experience, including 5+ years on the Snowflake Data Cloud.
- Expertise in SQL optimization and Snowflake performance tuning.
- Hands-on experience with Snowpipe, Streams & Tasks, Snowpark, Zero-Copy Cloning, and Secure Data Sharing.
- Proficiency in Python, Scala, or Java for Snowpark development.
- Experience integrating with cloud platforms such as AWS.
- Exposure to ETL/ELT tools (Informatica, Matillion, Fivetran).
- Familiarity with CI/CD, Git, and DevOps practices for data operations.
- Preferred certification: SnowPro Core.
Responsibilities:
- Design and implement Snowflake schemas (star, snowflake, data vault) optimized with micro-partitioning, clustering keys, materialized views, and the search optimization service.
- Build real-time and batch ingestion pipelines into Snowflake using Snowpipe, Kafka Connect, Fivetran, Matillion, Informatica, or dbt.
- Automate incremental data processing with Streams & Tasks to support Change Data Capture (CDC).
- Use Zero-Copy Cloning for environment management, testing, and sandboxing.
- Apply Time Travel and Fail-safe features for data recovery and auditing.
- Develop data transformation logic in Snowpark (Python/SQL/Scala) to push compute directly into Snowflake.
- Design integrations with cloud storage (S3, Azure ADLS, GCS) for staging and external tables.
- Implement data sharing and marketplace solutions via Snowflake Secure Data Sharing and the Snowflake Marketplace.
- Enable semi-structured data handling (JSON, Avro, Parquet, ORC, XML) using VARIANT columns and lateral flattening.
- Integrate Snowflake with BI tools (Power BI, Tableau) via live connections and semantic layers.
- Implement Role-Based Access Control (RBAC), Row Access Policies, and Dynamic Data Masking for data security.
- Integrate with data catalog and governance platforms (Collibra, Alation, Informatica CDGC) using Snowflake metadata and APIs.
- Support CI/CD automation for Snowflake code deployment using GitHub Actions, Azure DevOps, or dbt Cloud.
Preferred Key Skills:
- Snowflake-native feature design and implementation (Snowpark, Streams, Time Travel, Secure Data Sharing)
- Data ingestion (Snowpipe, CDC, Kafka, Fivetran)
- Semi-structured data handling (VARIANT, JSON, Avro, Parquet)
- Advanced SQL and performance tuning
- Data governance (RBAC, masking, lineage, catalogs)
- Cloud data platform integrations (AWS S3, Azure ADLS, GCP GCS)
- BI and analytics tool integration
- Cost optimization and warehouse orchestration