Snowflake Databricks Architect

Cloud BC Labs


Job Location: Chicago, IL - USA
Monthly Salary: Not Disclosed
Posted on: 4 hours ago
Vacancies: 1 Vacancy

Job Summary

Position : Snowflake Databricks Architect
Location : Peoria IL (Onsite)
Term : C2C/W2 role
Duration : 12 months
Job Description:
We are looking for an experienced Snowflake & Databricks Architect to design, develop, and optimize scalable cloud-based data platforms. The ideal candidate should have strong expertise in Snowflake, Databricks, cloud ecosystems, data architecture, ETL/ELT pipelines, and modern data engineering practices. The candidate will work closely with business stakeholders, data engineers, and analytics teams to build high-performance enterprise data solutions.
Key Responsibilities
  • Design and implement scalable data architecture solutions using Snowflake and Databricks.
  • Build and optimize modern data lakehouse architectures.
  • Develop ETL/ELT pipelines for structured and unstructured data processing.
  • Implement data ingestion frameworks from multiple sources including APIs, databases, flat files, and streaming platforms.
  • Optimize Snowflake performance using clustering, partitioning, caching, and query tuning techniques.
  • Design Databricks solutions using PySpark, Spark SQL, Delta Lake, and notebooks.
  • Work with cloud platforms such as AWS, Azure, or GCP for end-to-end data solutions.
  • Implement data governance, security, masking, and access control strategies.
  • Collaborate with BI, Analytics, and Data Science teams for reporting and ML use cases.
  • Define CI/CD and DevOps practices for data engineering deployments.
  • Lead architecture discussions and provide technical guidance to engineering teams.
  • Ensure high availability, scalability, reliability, and cost optimization of data platforms.
  • Create technical documentation, architecture diagrams, and best practice guidelines.
Required Skills
Must Have Skills
  • Strong experience in Snowflake Architecture & Administration
  • Hands-on expertise with Databricks and Apache Spark
  • Strong SQL and PySpark programming skills
  • Experience with Delta Lake and Lakehouse architecture
  • Cloud experience in AWS / Azure / GCP
  • ETL/ELT pipeline development experience
  • Data Modeling (Star Schema, Snowflake Schema, Dimensional Modeling)
  • Experience with data integration tools and orchestration frameworks
  • Performance tuning and optimization experience
  • CI/CD and DevOps knowledge for data platforms
Good to Have Skills
  • Kafka or Streaming technologies
  • Airflow, ADF, or Informatica experience
  • Terraform or Infrastructure as Code (IaC)
  • ML/AI integration knowledge
  • Experience with Power BI Tableau or Looker
  • Knowledge of data governance and security frameworks
Qualifications
  • Bachelor's or Master's degree in Computer Science, IT, or a related field.
Roles & Responsibilities
  • Architect enterprise-scale data platforms
  • Lead migration from traditional DW to Snowflake/Databricks
  • Mentor data engineering teams
  • Drive best practices for cloud data modernization
  • Coordinate with stakeholders and delivery teams

Cloud BC Labs Inc is a digital transformation organization aimed at creating seamless solutions for clients to effectively manage their business operations. The company specializes in Business and Management Consulting, AI/ML, Data Analytics & Visualization, Cloud Data Warehouse Migration, Snowflake Implementation, Informatica Implementation & Upgrade, Staffing Services, and Data Management Solutions.
