
Snowflake Data Engineer

Job Location

Maryland Heights, MO - USA

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Sr. Snowflake Data Engineer

Requirements:
  • Python
  • SQL
  • ETL
  • Stored procedures
  • Snowflake (can be basic; upskilling / acquisition of badges ahead of joining)
  • Job orchestration, e.g. Airflow (Airflow itself is nice to have)
  • Version and source control experience

Nice to Have: Airflow, Gitlab, Flyway

Responsibilities:

Design, develop, and optimize scalable data pipelines and ETL processes using Snowflake and related cloud technologies.

Develop and implement efficient data ingestion methods, integrating various structured and semi-structured data sources.

Create and maintain data warehouse schemas, tables, views, and materialized views in Snowflake (a brief sketch follows this list of responsibilities).

Ensure data quality, data integrity, and data governance standards are maintained throughout ETL processes.

Collaborate closely with data analysts, data architects, and business stakeholders to understand and fulfill data requirements.

Perform regular performance tuning and optimization of Snowflake queries and warehouse operations.

Develop and enforce best practices for data modeling, ETL architecture, data warehousing, and data security.

Proactively identify, troubleshoot, and resolve data issues and pipeline failures.

Document technical processes, system architecture, and data lineage clearly and comprehensively.
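
As a hedged illustration of the schema, table, view, and materialized-view maintenance mentioned above, the sketch below creates those objects through the Snowflake Python connector. Every identifier, connection value, and business rule here is a hypothetical placeholder, not a detail from this posting.

```python
# Hedged sketch: maintaining Snowflake schemas, tables, views, and materialized views
# through the Snowflake Python connector. Every identifier below is a made-up placeholder.
import snowflake.connector

DDL_STATEMENTS = [
    "CREATE SCHEMA IF NOT EXISTS ANALYTICS",
    """
    CREATE TABLE IF NOT EXISTS ANALYTICS.WEB_SESSIONS (
        session_id STRING,
        visit_ts   TIMESTAMP_NTZ,
        page_views NUMBER,
        source     STRING
    )
    """,
    # Plain view applying a simple, invented business rule.
    """
    CREATE OR REPLACE VIEW ANALYTICS.VW_VALID_SESSIONS AS
    SELECT session_id, visit_ts, page_views, source
    FROM ANALYTICS.WEB_SESSIONS
    WHERE page_views > 0
    """,
    # Materialized view for a frequently queried aggregate (an Enterprise Edition feature).
    """
    CREATE MATERIALIZED VIEW IF NOT EXISTS ANALYTICS.MV_PAGE_VIEWS_BY_SOURCE AS
    SELECT source, SUM(page_views) AS total_page_views
    FROM ANALYTICS.WEB_SESSIONS
    GROUP BY source
    """,
]


def apply_ddl() -> None:
    """Run each DDL statement in order; connection settings are placeholders."""
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="CUSTOMER_DW",
    )
    try:
        cur = conn.cursor()
        for stmt in DDL_STATEMENTS:
            cur.execute(stmt)
    finally:
        conn.close()


if __name__ == "__main__":
    apply_ddl()
```

In practice such objects would more likely be versioned as Flyway migration scripts in Gitlab than created ad hoc from Python; the connector calls here just keep the sketch self-contained.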

Data Engineer:

Overview: The project is related to customer intelligence, with data coming in from different vendors (Google Analytics, HootSuite, Known) in different file formats (API, CSV), transformed and ingested into S3 buckets and from there into Snowflake. There are different layers in Snowflake: staging, cleansed, and reporting; data needs to be moved and curated with business logic in the cleansing layer.

  • Leveraging different technologies: Python; Snowflake; Airflow orchestration and DAG creation; Gitlab source control; and Flyway for version control (expertise in Flyway is not really needed, but developers should have experience with source/version control and CI/CD).
  • Python to ingest data and build out Airflow-related code, performing transformations by passing parameters.
  • Creating tasks within DAGs; DAGs trigger Gitlab Flyway runs, and version control from Gitlab implements schema changes (see the DAG sketch after this list).
  • Using Snowflake procedures (SQL stored procedures) for smaller volumes of data; large volumes are still being solutioned and may end up using a Python/Spark approach (a stored-procedure sketch also follows this list).
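
Below is a minimal, hedged sketch of how such a workflow might be wired together in Airflow: land a vendor CSV in S3, apply Flyway schema migrations triggered from the DAG, and COPY the file into the Snowflake staging layer. Every name (bucket, stage, tables, connection settings, Flyway config path) is an assumed placeholder, not a detail from this posting.

```python
# Hedged Airflow DAG sketch: vendor file -> S3 -> Flyway migrations -> Snowflake staging.
# All identifiers, paths, and credentials below are illustrative placeholders.
from datetime import datetime

import boto3
import snowflake.connector
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def land_vendor_file(**context):
    """Land a vendor CSV extract in the raw S3 bucket (paths are placeholders)."""
    s3 = boto3.client("s3")
    s3.upload_file("/tmp/ga_export.csv", "customer-intel-raw", "google_analytics/ga_export.csv")


def copy_into_staging(**context):
    """COPY the landed file from an external stage into the Snowflake staging table."""
    # Assumes an external stage (RAW_S3_STAGE) already points at the raw bucket.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="CUSTOMER_INTEL", schema="STAGING",
    )
    try:
        conn.cursor().execute(
            "COPY INTO STAGING.GA_SESSIONS "
            "FROM @RAW_S3_STAGE/google_analytics/ "
            "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


with DAG(
    dag_id="customer_intelligence_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    land_raw = PythonOperator(task_id="land_vendor_file", python_callable=land_vendor_file)
    # Schema changes are applied by Flyway, triggered from the DAG (Flyway CLI assumed on the worker).
    flyway_migrate = BashOperator(
        task_id="flyway_migrate",
        bash_command="flyway -configFiles=/opt/flyway/conf/snowflake.conf migrate",
    )
    load_staging = PythonOperator(task_id="copy_into_staging", python_callable=copy_into_staging)

    land_raw >> flyway_migrate >> load_staging
```

For the smaller-volume cleansing path in the last bullet, a hedged sketch of a Snowflake SQL stored procedure (Snowflake Scripting) that merges staging rows into a cleansed table, created and called through the Python connector; object and column names are again invented for illustration.

```python
# Hedged sketch: a SQL stored procedure for smaller-volume cleansing, created and
# invoked from Python. All object and column names are illustrative placeholders.
import snowflake.connector

CREATE_PROC = """
CREATE OR REPLACE PROCEDURE CLEANSED.MERGE_GA_SESSIONS()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    MERGE INTO CLEANSED.GA_SESSIONS AS tgt
    USING STAGING.GA_SESSIONS AS src
        ON tgt.session_id = src.session_id
    WHEN MATCHED THEN UPDATE SET tgt.page_views = src.page_views
    WHEN NOT MATCHED THEN INSERT (session_id, page_views)
        VALUES (src.session_id, src.page_views);
    RETURN 'merged';
END;
$$
"""


def run_cleansing() -> None:
    """Create (or replace) the procedure, then call it; connection values are placeholders."""
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="***",
        warehouse="ETL_WH", database="CUSTOMER_INTEL",
    )
    try:
        cur = conn.cursor()
        cur.execute(CREATE_PROC)
        cur.execute("CALL CLEANSED.MERGE_GA_SESSIONS()")
    finally:
        conn.close()


if __name__ == "__main__":
    run_cleansing()
```

In a real deployment, the Airflow Snowflake provider's operators and a secrets backend would likely replace the inline connector calls and hard-coded credentials; the sketches keep dependencies minimal for readability.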

Required Skills: Python, SQL, ETL, stored procedures, Snowflake, job orchestration (e.g. Airflow), and version/source control experience. Nice to have: Airflow, Gitlab, Flyway.

Employment Type

Full Time
