
Sr Data Consultant/Architect - Snowflake

Employer Active

1 Vacancy

Jobs by Experience


12-13 years

Job Location


Tokyo - Japan

Monthly Salary

Salary Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID : 2725078
We are seeking a Sr Data Consultant/Architect to join our Japan team in Tokyo. As a Sr Data Consultant, you will be accountable for developing data products within a decentralized data architecture. You will apply your data warehousing experience while learning modern, efficient ways of delivering data to business partners. You will be part of a global team working with multiple cutting-edge technologies to develop solutions for Data Mesh.

Responsibilities:
  • You will develop data products in Snowflake deployed on the Azure cloud platform
  • You will use DataOps.live to develop, manage, and deploy data vaults, and develop transformation workflows with dbt
  • You will ensure data product characteristics are met through QA
  • You will release new versions of data products, including writing release notes, following the change request process, and automating it
  • You will work on tasks in Jira, following agile processes of estimation, sprint planning, and sprint execution
  • You will document the results of your work in Confluence
  • Engage in data governance, data management, business intelligence, big data analytics, and related areas
  • Analyse data (structured and unstructured) using SQL and Python to find actionable insights for business improvement
  • Create and maintain BI dashboards (optional, good to have)
  • Educate internal and external stakeholders on the importance of data
  • Collaborate effectively with team and cross-team members
  • Debug and resolve technical problems
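
As a flavour of the "analyse data using SQL and Python" responsibility above, here is a minimal, hypothetical sketch. It uses Python's built-in sqlite3 in place of a Snowflake connection, and the `orders` table, its columns, and all values are illustrative assumptions, not anything from this posting.

```python
import sqlite3

# Hypothetical example: a small in-memory "orders" table standing in for a
# Snowflake source; the table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 210.0)],
)

# Aggregate revenue per region in SQL, then rank the result in Python.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region"
).fetchall()
top_region = max(rows, key=lambda r: r[1])
print(top_region)  # ('EMEA', 210.0)
```

In day-to-day work the same split applies: push aggregation into the warehouse's SQL engine, then do ranking, reshaping, or visualisation in Python.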


Requirements

  • Bachelor's degree in Computer Science or a related field
  • Hands-on experience with the Snowflake Cloud Data Warehouse, including at least a couple of Snowflake implementations
  • Experience in Snowflake or similar technologies (building data warehouses using MS SQL Server, Oracle, AWS Redshift, etc.) and knowledge of SQL
  • Demonstrated proficiency with Snowflake query logging, system logging, and other system management tools
  • Experience with DataOps.live and dbt, or similar technologies (SSIS, Informatica, Apache Airflow, etc.)
  • Knowledge of data modelling methodologies: data vault, data products, and data mesh
  • Understanding of data warehouse (DWH) systems and of migration from DWH to data lakes/Snowflake
  • Understanding of data models and of transforming data into models
  • Strong understanding of data analytics architecture components, including cloud data warehouses, metadata-driven processes, DevOps, and data as an asset
  • Good knowledge of ETL (DataStage) concepts, data pipelines, and workflow management
  • Experience writing, troubleshooting, and optimizing complex SQL
  • Experience with Python or Scala programming
  • Basic knowledge of Git and modern ways of working with code
  • Advanced working SQL knowledge, experience with relational databases and query authoring, and working familiarity with a variety of databases
  • Working knowledge of Unix/shell scripting is good to have
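
The data vault methodology named in the requirements above can be sketched in miniature. This is a hypothetical illustration only, again using sqlite3 in place of Snowflake: a hub holds the business key under a hash key, and a satellite holds descriptive attributes over time; all table names, keys, and values are invented for the example.

```python
import hashlib
import sqlite3

# Minimal data-vault sketch (illustrative only): a hub stores the business
# key with a hash key; a satellite stores descriptive attributes over time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hub_customer (hk TEXT PRIMARY KEY, customer_id TEXT)")
conn.execute("CREATE TABLE sat_customer (hk TEXT, load_ts TEXT, name TEXT)")

def hash_key(business_key: str) -> str:
    # Data-vault hash keys are commonly a digest of the business key.
    return hashlib.md5(business_key.encode()).hexdigest()

hk = hash_key("C-1001")
conn.execute("INSERT INTO hub_customer VALUES (?, ?)", (hk, "C-1001"))
conn.execute("INSERT INTO sat_customer VALUES (?, ?, ?)", (hk, "2024-01-01", "Acme KK"))

# Join hub and satellite back into a point-in-time view.
row = conn.execute(
    "SELECT h.customer_id, s.name FROM hub_customer h "
    "JOIN sat_customer s ON s.hk = h.hk"
).fetchone()
print(row)  # ('C-1001', 'Acme KK')
```

Splitting keys (hub) from attributes (satellite) is what lets a data vault absorb schema changes and track history without rewriting existing tables.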



Employment Type

Full Time

Company Industry

About Company
