Business Intelligence 7221-2112

Employer Active

1 Vacancy

Job Location

Toronto - Canada

Monthly Salary

Salary Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2641244

HM Note: This hybrid role is three (3) days in the office, and candidates must include three references with their resume.


Description:

General Responsibilities

  • Design, develop, and implement the ingestion framework from the Oracle source to Azure Data Lake (initial load and incremental ETL; a minimal watermark sketch follows this list). Tools used are:
  • Azure Data Factory (expert knowledge) to maintain the pipeline from Oracle to Azure Data Lake
  • Azure Synapse to build stored procedures and read data from the data lake
  • Review the requirements, database tables, and database relationships. Identify gaps and inefficiencies in the current production reporting environment and provide recommendations to address them in the new platform.
  • Continue to evolve and design the ingestion framework and CDC (change data capture)
  • Prepare design artifacts
  • Analyze the physical data model and the mapping from data source to reporting destination.
  • Understand the requirements. Recommend changes to the physical model.
  • Develop the physical model scripts and create the database.
  • Access Oracle DB environments and use SSIS, SQL Server, and other development tools for developing the solution.
  • Proactively communicate with the business on any changes required to conceptual, logical, and physical models; communicate and review dependencies and risks.
  • Develop the ETL strategy and solution based on different sets of modules
  • Understand the tables and relationships.
  • Create low-level design documents and unit test cases.
  • Create the package design workflows
  • Develop and test data with incremental and full loads.
  • Develop high-quality ETL mappings/scripts/jobs
  • ETL data from applications to the Data Warehouse
  • ETL data from the Data Warehouse to the Data Mart
  • Perform unit tests.
  • Perform performance reviews and data consistency checks
  • Troubleshoot performance and ETL load issues; log activity for each individual package and transformation.
  • Review overall ETL performance.
  • Perform end-to-end integrated testing for full and incremental loads
  • Plan for go-live production deployment.
  • Create production deployment steps.
  • Configure parameters and scripts for go-live. Test and review the instructions.
  • Create release documents and help build and deploy code across servers.
  • Provide go-live support and post-go-live review.
  • Review existing ETL processes and tools and provide recommendations on improving performance and reducing ETL timelines.
  • Review infrastructure and any pain points for overall process improvement
  • Knowledge transfer to Ministry staff; development of documentation on the work completed.
  • Document, share, and work on end-to-end ETL working knowledge, troubleshooting steps, configuration, and script reviews.
  • Transfer documents and scripts, and review the documents.
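
The initial-load and incremental-ETL work described above commonly follows a watermark (high-water-mark) pattern. The T-SQL below is a minimal illustrative sketch only, not part of the job description; the table and column names (etl.WatermarkLog, stg.Orders, dbo.Orders, LastModifiedDate) are hypothetical placeholders.

    -- Watermark-based incremental load: a minimal sketch, assuming hypothetical
    -- etl.WatermarkLog, stg.Orders (staged Oracle extract) and dbo.Orders tables.
    DECLARE @LastWatermark datetime2 =
        ISNULL((SELECT MAX(WatermarkValue)
                FROM etl.WatermarkLog
                WHERE TableName = 'Orders'),
               '1900-01-01');                 -- falls back to a full initial load

    DECLARE @NewWatermark datetime2 = SYSUTCDATETIME();

    -- Pull only rows changed since the previous run (the delta). Insert-only for
    -- brevity; updated rows would normally be handled with a MERGE or upsert.
    INSERT INTO dbo.Orders (OrderId, CustomerId, Amount, LastModifiedDate)
    SELECT s.OrderId, s.CustomerId, s.Amount, s.LastModifiedDate
    FROM stg.Orders AS s
    WHERE s.LastModifiedDate >  @LastWatermark
      AND s.LastModifiedDate <= @NewWatermark;

    -- Record the new high-water mark for the next incremental run.
    INSERT INTO etl.WatermarkLog (TableName, WatermarkValue, LoadedAtUtc)
    VALUES ('Orders', @NewWatermark, SYSUTCDATETIME());

In Azure Data Factory the same pattern is commonly implemented with a Lookup activity that reads the stored watermark and a Copy activity whose source query filters on the modified-date column, writing the delta to the data lake.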


Skills

Experience and Skill Set Requirements



Experience:

  • 7 years of experience working with SQL Server, ADF, and T-SQL development (Must Have)
  • Experience building databases, data warehouses, and data marts, and working with delta/incremental and full loads (Must Have)
  • Experience with ETL tools such as SQL Server, ADF, and cloud tools (Must Have)
  • Experience working with MS SQL Server on premises and within the Azure environment (Must Have)
  • Experience with data modeling and tools, e.g. SAP PowerDesigner
  • Experience with snowflake and star schema models. Experience in designing data warehouse solutions using slowly changing dimensions (a minimal Type 2 sketch follows this list).
  • Experience working with SQL Server, SSIS, and other ETL tools; solid knowledge of and experience with SQL and other RDBMSs
  • Understanding of data warehouse architecture with data vault, dimensional data, and fact models.
  • Analyze, design, develop, test, and document ETL programs from detailed and high-level specifications, and assist in troubleshooting.
  • Utilize SQL to perform tasks other than data transformation (DDL, complex queries)
  • Good knowledge of database performance optimization techniques
  • Ability to assist in requirements analysis and subsequent development
  • Ability to conduct unit tests and assist in test preparation to ensure data integrity
  • Work closely with Designers, Business Analysts, and other Developers
  • Liaise with Project Managers, Quality Assurance Analysts, and Business Intelligence Consultants
  • Design and implement technical enhancements of the Data Warehouse as required.
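
As context for the slowly-changing-dimension requirement above, the following is a minimal Type 2 SCD sketch; it is illustrative only, and dim.Customer, stg.Customer, and their columns are hypothetical placeholders rather than anything specified in this posting.

    -- Type 2 slowly changing dimension: a minimal sketch with hypothetical tables.
    -- Step 1: expire the current dimension rows whose tracked attributes changed.
    UPDATE d
    SET    d.IsCurrent = 0,
           d.ValidTo   = SYSUTCDATETIME()
    FROM   dim.Customer AS d
    JOIN   stg.Customer AS s
           ON s.CustomerId = d.CustomerId
    WHERE  d.IsCurrent = 1
      AND (s.CustomerName <> d.CustomerName OR s.City <> d.City);

    -- Step 2: insert a fresh current row for every customer that now has no
    -- current row (new customers, plus the ones just expired in step 1).
    INSERT INTO dim.Customer (CustomerId, CustomerName, City, ValidFrom, ValidTo, IsCurrent)
    SELECT s.CustomerId, s.CustomerName, s.City, SYSUTCDATETIME(), NULL, 1
    FROM   stg.Customer AS s
    LEFT JOIN dim.Customer AS d
           ON d.CustomerId = s.CustomerId AND d.IsCurrent = 1
    WHERE  d.CustomerId IS NULL;

The same logic is often written as a single MERGE statement or built with the SSIS Slowly Changing Dimension transformation; the two-step form simply makes the expire-then-insert sequence explicit.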


Skills:

  • 7 years with ETL tools such as Microsoft SSIS and stored procedures (Must Have)
  • 2 years with Azure Data Lake and Data Warehouse and building Azure Data Factory pipelines (Must Have)
  • 2 years of Python (nice to have)
  • Databricks
  • Synapse (nice to have; see the sketch after this list)
  • SQL Server
  • Oracle
  • Ability to present technical requirements to the business
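
Tying the Azure Data Lake, Synapse, and SQL Server items together, the sketch below shows one way a Synapse serverless SQL stored procedure might read Parquet files from the data lake with OPENROWSET. It is an illustration only: the procedure name, storage URL, path, and columns are hypothetical, and authentication setup (for example a workspace managed identity or database-scoped credential) is omitted.

    -- Minimal Synapse serverless sketch: read curated Parquet files from the lake.
    -- The storage account, container, path, and column names are hypothetical.
    CREATE OR ALTER PROCEDURE dbo.usp_ReadOrdersFromLake
    AS
    BEGIN
        SELECT OrderId, CustomerId, Amount, LastModifiedDate
        FROM OPENROWSET(
                 BULK 'https://examplelake.dfs.core.windows.net/curated/orders/*.parquet',
                 FORMAT = 'PARQUET'
             ) AS rows;
    END;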


Evaluation Criteria


Design Documentation and Analysis Skills (45 points)

  • Demonstrated experience in creating both Functional Design Documents (FDD) and Detailed Design Documents (DDD).
  • Experience in fit-gap analysis, system use case reviews, requirements reviews, coding exercises, and reviews.
  • Experience in developing and maintaining a plan to address contract deliverables through the identification of significant milestones and expected results, with weekly status reporting.
  • Work with the Client and the assigned Developer(s) to refine/confirm business requirements
  • Participate in defect fixing, testing support, and development activities for the ETL tool. Assist with defect fixing and testing support for Power BI reports.
  • Analyze and document solution complexity and interdependencies by function, including providing support for data validation.


Development, Database, and ETL Experience (45 points)

  • Demonstrated experience in Microsoft-specific software development, with a number of years of practical experience (minimum 7 years)
  • Proven experience in developing in Azure DevOps
  • Experience in application mapping to populate data vault and dimensional data mart schemas (a minimal hub-load sketch follows this list)
  • Demonstrated experience in Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) software development, with a number of years of practical experience (minimum 7 years)
  • Experience in providing ongoing support for Azure pipeline/configuration and SSIS development
  • Experience building data ingestion and change data capture using GoldenGate (an asset but not mandatory)
  • Assist in the development of predefined and ad hoc reports, meeting the coding and accessibility requirements.
  • Demonstrated experience with Oracle and Microsoft interfaces
  • Proficient in SQL and Azure DevOps
  • Implementing logical and physical data models
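
For the data vault mapping item above, the following hub-load sketch is illustrative only; hub.Customer, stg.Customer, the MD5-based hash key, and the record-source value are hypothetical choices, not requirements from this posting.

    -- Data vault hub load: insert only business keys not yet present in the hub.
    -- Assumes CustomerId is a character business key staged from the source.
    INSERT INTO hub.Customer (CustomerHashKey, CustomerBusinessKey, LoadDate, RecordSource)
    SELECT DISTINCT
           CONVERT(char(32), HASHBYTES('MD5', UPPER(LTRIM(RTRIM(s.CustomerId)))), 2),
           s.CustomerId,
           SYSUTCDATETIME(),
           'ORACLE_SRC'
    FROM   stg.Customer AS s
    WHERE  NOT EXISTS (SELECT 1
                       FROM hub.Customer AS h
                       WHERE h.CustomerBusinessKey = s.CustomerId);

Satellites and links would follow the same insert-only pattern, keyed on the hub hash key plus a load date.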


Knowledge Transfer (10 points)

  • The Developer must have previous work experience in conducting knowledge transfer and training sessions, ensuring the resources will receive the required knowledge to support the system. The resource must develop learning activities using a review-watch-do methodology and demonstrate the ability to prepare and present.
  • Development of documentation and materials as part of a review and knowledge transfer to other members
  • Development and facilitation of classroom-based or virtual instructor-led demo sessions for Developers
  • Monitor identified milestones and the submission of status reports to ensure knowledge transfer is fully completed

Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.