RQ08853 - Business Intelligence Specialist - Senior

Jobs by Experience

10 years

Job Location

Toronto - Canada

Monthly Salary

60 - 60

Vacancy

1 Vacancy

Job Description

General Responsibilities

  • Develop and maintain on-premises ETL pipelines using SQL Server Integration Services (SSIS) and T-SQL stored procedures
  • Develop and maintain Oracle Materialized Views as a Change Data Capture (CDC) mechanism for incremental updates
  • Participate in the requirements, design, and implementation of an Oracle GoldenGate solution as the new CDC tool
  • Design, develop, and implement an ingestion framework from Oracle sources to Azure Data Lake, covering both initial load and incremental ETL. Tools used are:
    • Azure Data Factory (expert knowledge) to maintain pipelines from Oracle to Azure Data Lake
    • Azure Databricks to build stored procedures and read data from the data lake
  • Review requirements, database tables, and database relationships; identify gaps and inefficiencies in the current production reporting environment and recommend how to address them in the new platform.
  • Prepare and maintain design artifacts such as ETL requirements, source-to-target mappings (STM), ETL workflows, data model diagrams (ERD), data lineage diagrams, and an error-handling and logging strategy, using specialized tools
  • Work in an Agile environment using Azure DevOps tools for code repository and defect tracking
  • Data analysis and design of the physical model, including mapping from data sources to the reporting destination.
    • Understand the requirements; recommend changes to the physical model.
    • Develop the physical model scripts and create the database.
    • Access Oracle DB environments; use SSIS, SQL Server, and other development tools to build the solution.
    • Proactively communicate with business analysts on any changes required to conceptual, logical, and physical models; communicate and review dependencies and risks.
  • Development of an ETL strategy and solutions for different subject areas
    • Understand the tables and relationships.
    • Create low-level design documents and unit test cases.
    • Create the package-design workflows.
  • Development and testing of data with incremental and full loads.
    • Develop high-quality ETL mappings/scripts/jobs
    • ETL data from applications to the Data Warehouse
    • ETL data from the Data Warehouse to the Data Mart
    • Perform unit tests.
  • Performance and data-consistency checks
    • Troubleshoot performance and ETL load issues; log activity for each individual package and transformation.
    • Review overall ETL performance.
  • End-to-end integration testing for full and incremental loads
  • Plan for go-live production deployment.
    • Create production deployment steps.
    • Configure parameters and scripts for go-live; test and review the instructions.
    • Create release documents and help build and deploy code across servers.
  • Go-live support and post-go-live review.
    • Review the existing ETL process and tools and recommend ways to improve performance and reduce ETL time.
    • Review the infrastructure and any pain points for overall process improvement
  • Knowledge transfer to Ministry staff and development of documentation on the work completed.
    • Document, share, and work through end-to-end ETL working knowledge, troubleshooting steps, configuration, and script reviews.
    • Hand over documents and scripts and review the documents with staff.
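
The delta/incremental load pattern the responsibilities above describe (extract only rows changed since the last run, then merge them into the target) can be sketched in plain Python. This is a minimal illustration, not the actual SSIS/Data Factory implementation; all table and column names (`SOURCE`, `last_modified`, `id`) are hypothetical:

```python
from datetime import datetime

# Hypothetical source rows; last_modified drives the incremental (delta) extract.
SOURCE = [
    {"id": 1, "name": "alpha", "last_modified": datetime(2024, 1, 1)},
    {"id": 2, "name": "beta",  "last_modified": datetime(2024, 2, 1)},
    {"id": 3, "name": "gamma", "last_modified": datetime(2024, 3, 1)},
]

def incremental_extract(source, watermark):
    """Return rows changed since the stored watermark, plus the new watermark."""
    delta = [r for r in source if r["last_modified"] > watermark]
    new_watermark = max((r["last_modified"] for r in delta), default=watermark)
    return delta, new_watermark

def merge(target, delta):
    """Upsert delta rows into the target, keyed by id (overwrite on match)."""
    for row in delta:
        target[row["id"]] = row
    return target

# Initial (full) load: the watermark starts at the minimum datetime,
# so every source row qualifies.
target, wm = {}, datetime.min
delta, wm = incremental_extract(SOURCE, wm)
merge(target, delta)

# Later run: only rows modified after the saved watermark are extracted.
SOURCE.append({"id": 2, "name": "beta-v2", "last_modified": datetime(2024, 4, 1)})
delta, wm = incremental_extract(SOURCE, wm)
merge(target, delta)  # updates row 2 only
```

In the posting's actual stack, the watermark would live in a control table and the merge would be a T-SQL `MERGE` or a Data Factory copy activity; the control flow is the same.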


Requirements

Experience and Skill Set Requirements


Experience:

7 years of experience working with SQL Server, SSIS, and T-SQL development (Must Have)

Experience building databases, data warehouses, and data marts, and working with delta/incremental and full loads (Must Have)

Data Warehouse concepts, including Kimball and Inmon design methodologies (Must Have)

Experience with Azure ETL tools such as Azure Data Factory and Azure Databricks (Must Have)

Experience working with MS SQL Server and Oracle database tools (Must Have)

Experience configuring and using Oracle GoldenGate (Nice to have)

Knowledge of dimensional data modeling and tools, e.g. PowerDesigner

Experience with snowflake and star schema design.
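
As a toy illustration of the star schema named above (a snowflake variant would further normalize an attribute such as category into its own table), here is a hedged sketch with hypothetical tables, using plain Python in place of SQL:

```python
# Hypothetical star schema: one fact table keyed to denormalized dimensions.
dim_date    = {1: {"date": "2024-01-01", "quarter": "Q1"}}
dim_product = {10: {"name": "Widget", "category": "Hardware"}}  # category kept inline: star, not snowflake
fact_sales  = [
    {"date_key": 1, "product_key": 10, "amount": 250.0},
    {"date_key": 1, "product_key": 10, "amount": 150.0},
]

def sales_by_category(fact, products):
    """Typical star-schema query: join fact rows to a dimension, aggregate by an attribute."""
    totals = {}
    for row in fact:
        cat = products[row["product_key"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals
```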

Experience designing and implementing data warehouse solutions using slowly changing dimensions (SCD) Types 1, 2, and 3.
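
The SCD Type 2 pattern mentioned here (preserve history by expiring the current row and inserting a new version) can be sketched as follows. This is a minimal, assumption-laden illustration in plain Python, not production dimension-load code; all field names are hypothetical:

```python
from datetime import date

def scd2_upsert(dim, key, attrs, as_of):
    """SCD Type 2: on an attribute change, expire the current row and
    append a new versioned row; unchanged rows are left alone."""
    current = next((r for r in dim if r["key"] == key and r["is_current"]), None)
    if current and current["attrs"] == attrs:
        return dim                       # no change: nothing to do
    if current:                          # expire the old version
        current["is_current"] = False
        current["end_date"] = as_of
    dim.append({"key": key, "attrs": attrs, "start_date": as_of,
                "end_date": None, "is_current": True})
    return dim

dim_customer = []
scd2_upsert(dim_customer, 42, {"city": "Toronto"}, date(2024, 1, 1))
scd2_upsert(dim_customer, 42, {"city": "Ottawa"}, date(2024, 6, 1))
# History preserved: two rows for key 42, only the latest flagged current.
```

Type 1 would simply overwrite `attrs` in place; Type 3 would keep the prior value in a dedicated "previous" column instead of a new row.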

Ability to analyze, design, develop, test, and document ETL programs from detailed and high-level specifications, and assist in troubleshooting.

Ability to utilize SQL for tasks beyond data transformation (DDL, complex queries)

Good knowledge of database performance optimization techniques

Ability to assist in requirements analysis and subsequent development

Ability to conduct unit tests and assist in test preparations to ensure data integrity

Work closely with Data Analysts Business Analysts and Developers

Liaise with Project Managers Quality Assurance Analysts and Business Intelligence Consultants

Design and implement technical enhancements of Data Warehouse as required.

2 years of experience with Data Warehouse concepts and principles

Knowledge of Dimensional modeling (nice to have)

Databricks

SQL Server

Oracle

Ability to present technical requirements to business users



Mandatory Skills:

7 years with ETL tools such as Microsoft SSIS and stored procedures (Must Have)

2 years with Azure Data Lake and Data Warehouse, including building Azure Data Factory pipelines (Must Have)

2 years Python/PySpark (nice to have)




Employment Type

Full Time

