RQ09450 - Software Developer - ETL - Senior

Experience

8+ years

Job Location

Toronto - Canada

Monthly Salary

85 - 85

Vacancy

1 Vacancy

Job Description

Scope:

The Office of the Public Guardian and Trustee (OPGT) requires a Senior Software Developer - ETL to interface the new Dynamics 365 solution and data warehouse with internal (OPS) and external systems, as a member of an integrations team made up of internal and vendor staff.


Assignment Deliverables:

As a member of the integrations team, the ETL Developer will be responsible for integrating the new Dynamics 365 solution, the data warehouse, and various internal (OPS) and external systems. The team will comprise both internal staff and vendor resources.

A high-level list of deliverables for the ETL Developer includes:

  • ETL Solution Design and Development:
    • Design, develop, and implement robust ETL (Extract, Transform, Load) processes for data migration and ongoing integrations between diverse source systems (e.g., internal legacy systems, external vendor platforms) and Microsoft Dynamics 365 Customer Engagement (CE) and Finance & Operations (F&O).
    • Develop and optimize data transformation logic to ensure data quality, consistency, and adherence to business rules and D365 data models.
    • Utilize and recommend appropriate ETL tools and technologies (e.g., Azure Data Factory, SSIS, other cloud-based ETL services) to build efficient and scalable data pipelines.
    • Implement data cleansing, validation, and error-handling mechanisms within ETL processes.
  • Data Migration Planning and Execution:
    • Lead and execute all phases of data migration from legacy systems to D365 CE and F&O, including data profiling, mapping, cleansing, transformation, and loading.
    • Develop and maintain data migration strategies, cutover plans, and rollback procedures.
    • Collaborate with data owners and business users to ensure data accuracy and completeness during migration.
  • Testing and Quality Assurance:
    • Design, develop, and execute comprehensive test plans, cases, scripts, and test data (e.g., manufactured, obfuscated) based on functional and technical specifications to validate ETL solutions and data integrity.
    • Create and maintain a full test plan, testing procedures, and an associated library of reusable test cases and scripts, ensuring full traceability from requirements to test outcomes.
    • Perform both manual and automated testing to validate system and integration functionality, data accuracy, performance, and scalability. This includes unit, integration, system, and performance testing for ETL processes.
  • Collaboration and CI/CD Integration:
    • Actively collaborate with stakeholders across business units, development teams, and external vendors to understand integration requirements and ensure proper data flow.
    • Ensure ETL processes and tests are properly integrated into the continuous integration/continuous delivery (CI/CD) pipeline to support automated deployments and efficient release cycles.
  • Support and Documentation:
    • Provide analytical, development, and testing support for ETL processes and data integrations throughout the project lifecycle.
    • Develop and maintain detailed technical documentation for all ETL processes, data mappings, data dictionaries, and integration architectures.
    • Assist in troubleshooting and resolving data integration issues, providing timely support and solutions.
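The cleansing, validation, and error-handling responsibilities above can be sketched in a few lines of Python. This is a minimal illustration only; the row shape, field names, and rules are hypothetical, and a real pipeline would typically be built in Azure Data Factory or SSIS rather than hand-rolled code:

```python
from datetime import datetime

# Hypothetical source rows; the field names and formats are illustrative only.
SOURCE_ROWS = [
    {"client_id": "1001", "opened": "2023-05-14", "balance": " 2500.00 "},
    {"client_id": "", "opened": "2023-06-01", "balance": "100.00"},     # missing key
    {"client_id": "1003", "opened": "not-a-date", "balance": "50.00"},  # bad date
]

def transform(row):
    """Cleanse and validate one source row; raise ValueError on bad data."""
    if not row["client_id"]:
        raise ValueError("client_id is required")
    return {
        "client_id": int(row["client_id"]),
        "opened": datetime.strptime(row["opened"], "%Y-%m-%d").date(),
        "balance": round(float(row["balance"].strip()), 2),
    }

def run_etl(rows):
    """Route each row to the load set or a dead-letter list for review."""
    loaded, rejected = [], []
    for row in rows:
        try:
            loaded.append(transform(row))
        except (ValueError, KeyError) as exc:
            rejected.append({"row": row, "error": str(exc)})
    return loaded, rejected

loaded, rejected = run_etl(SOURCE_ROWS)
print(f"loaded={len(loaded)} rejected={len(rejected)}")  # loaded=1 rejected=2
```

The dead-letter pattern shown here is one common way to satisfy the "error handling and reconciliation" requirement: bad rows are quarantined with their error message instead of silently dropped.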


Requirements

Experience and Skill Set Requirements:

Must Haves:

At least 8 years of hands-on experience in enterprise-level data integration and ETL (Extract, Transform, Load) development, with a significant focus on integrating with Microsoft Dynamics 365 (Customer Engagement and/or Finance & Operations) and related Azure data services.


Skill Set Requirements:

Desired Skills and Experience

The ideal candidate for this ETL Developer role will possess a strong blend of technical expertise in data integration, a deep understanding of Microsoft's data ecosystem, and excellent collaborative abilities.

  • ETL Tool Proficiency:
    • Mandatory: Proven hands-on experience with Microsoft's primary ETL tools for enterprise data integration, specifically Azure Data Factory (ADF). This includes designing and implementing pipelines, data flows, activities, datasets, linked services, and integration runtimes.
    • Highly Desirable: Experience with SQL Server Integration Services (SSIS) for existing on-premises integrations or migration scenarios.
    • Familiarity with other relevant data integration tools and concepts (e.g., Change Data Capture (CDC), data streaming) is a plus.
  • Database and Data Warehousing Expertise:
    • Strong SQL proficiency: ability to write complex SQL queries, stored procedures, functions, and views for data extraction, transformation, and loading across various database platforms (e.g., SQL Server, Azure SQL Database).
    • Solid understanding of data warehousing concepts (e.g., dimensional modeling, star/snowflake schemas, data marts) and experience designing and implementing data warehouse solutions.
    • Experience with Azure data services relevant to data warehousing and analytics (e.g., Azure Synapse Analytics, Azure Data Lake Storage).
  • Microsoft Dynamics 365 Data Acumen:
    • Fundamental understanding of Dynamics 365 data models for both Customer Engagement (CRM) and Finance & Operations (ERP), including knowledge of key entities, relationships, and common data patterns within D365.
    • Ability to extract data from D365 APIs and OData feeds, and to load data effectively into D365 (e.g., using the Data Management Framework (DMF), KingswaySoft, or custom integrations).
  • Data Quality and Governance:
    • Experience implementing data cleansing, validation, error handling, and reconciliation processes within ETL pipelines to ensure high data quality.
    • Understanding of data governance principles and best practices for managing data integrity and consistency.
  • Programming/Scripting (Desirable):
    • Proficiency in scripting languages such as Python, PowerShell, or C# for custom data transformations, automation of ETL tasks, and interacting with APIs.
  • Version Control and CI/CD:
    • Experience with version control systems (e.g., Git, Azure DevOps Repos) for managing ETL code and configurations.
    • Familiarity with Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated deployment of ETL solutions.
  • Analytical and Problem-Solving Skills:
    • Excellent analytical and problem-solving skills, with a keen eye for detail to identify data discrepancies, troubleshoot complex integration issues, and optimize ETL performance.
    • Ability to translate business requirements into technical data integration solutions.
  • Communication and Collaboration:
    • Strong verbal and written communication skills to articulate technical concepts clearly to both technical and non-technical stakeholders.
    • Ability to collaborate effectively within a multidisciplinary team (internal and vendor staff), including data architects, D365 functional consultants, and business users.
    • Demonstrated ability to document technical designs, data mappings, and ETL processes thoroughly.
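As a rough illustration of the OData-based extraction mentioned in the skills above, the following Python sketch follows the `@odata.nextLink` paging contract that the OData v4 standard (and hence the D365 Web API) uses. The URL, entity, and field names are invented, and the HTTP client is stubbed with canned pages so the paging logic stands alone; real calls would additionally need OAuth bearer tokens and error handling:

```python
def fetch_all(url, http_get):
    """Collect all pages from an OData feed by following @odata.nextLink.

    `http_get` is any callable returning the parsed JSON body for a URL,
    so a real HTTP client (or a test stub, as below) can be injected.
    """
    records = []
    while url:
        body = http_get(url)
        records.extend(body.get("value", []))
        url = body.get("@odata.nextLink")  # absent on the last page
    return records

# Stand-in for a real HTTP client: two canned pages keyed by URL.
PAGES = {
    "https://example.crm.dynamics.com/api/data/v9.2/accounts": {
        "value": [{"name": "Acme"}],
        "@odata.nextLink": "https://example.crm.dynamics.com/page2",
    },
    "https://example.crm.dynamics.com/page2": {"value": [{"name": "Globex"}]},
}

accounts = fetch_all(
    "https://example.crm.dynamics.com/api/data/v9.2/accounts", PAGES.get
)
print([a["name"] for a in accounts])  # ['Acme', 'Globex']
```

Injecting the fetch function keeps the paging loop trivially unit-testable, which matters for the testing and CI/CD expectations listed elsewhere in this posting.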


Technical Skills:

  • Hands-on experience with Microsoft Dynamics 365 cloud environments (both Customer Engagement and Finance & Operations), specifically concerning data extraction, loading, and integration points.
  • Extensive experience (8+ years) working across various data platforms, database technologies, and integration patterns, including relational databases (SQL Server, Azure SQL), data lakes (Azure Data Lake Storage), and cloud data warehouses (Azure Synapse Analytics).
  • Proven experience with middleware, integration platforms, and APIs, particularly those used for connecting diverse systems to Dynamics 365 (e.g., Azure Data Factory, Logic Apps, API Management, or other enterprise integration tools).
  • Deep understanding and practical application of performance optimization techniques for ETL processes, large-scale data migrations, and data synchronization in cloud environments.
  • Demonstrated experience with structured methodologies for the design, development, and implementation of data integration solutions, including requirements gathering, data mapping, and detailed technical design documentation.
  • Strong background in data analysis and system design within large-scale, complex enterprise environments, focusing on data flow, data quality, and system interoperability.


Broader Technical Acumen & Methodological Proficiency:

  • Demonstrated experience integrating diverse enterprise systems beyond D365, leveraging various integration patterns, middleware technologies (e.g., Azure Integration Services, Logic Apps), and communication protocols (e.g., REST, SOAP, SFTP).
  • Proven experience in managing and optimizing the performance of large-scale data migrations and continuous data synchronization processes across heterogeneous systems.
  • Extensive experience with structured methodologies for the entire data integration lifecycle, from detailed requirements gathering and data mapping to solution design, development, testing, and deployment.
  • Strong background in data analysis, data quality management, and troubleshooting complex data discrepancies within large, integrated system landscapes.
  • Familiarity with modern software development practices, including version control (e.g., Git, Azure DevOps Repos) and supporting Continuous Integration/Continuous Delivery (CI/CD) pipelines for automated ETL deployments.


Interpersonal Skills:

Exceptional Communication and Collaboration:

  • Articulate and concise communication skills, both verbal and written, capable of conveying complex technical information about data integration, ETL processes, and data quality issues to diverse audiences, including technical teams, D365 functional consultants, and non-technical business stakeholders.
  • Proven ability to actively participate in and lead technical discussions, offering informed solution recommendations, explaining design choices, and effectively documenting work for clarity and future reference.
  • Strong negotiation and influencing skills to align stakeholders on data integration strategies, resolve data mapping discrepancies, and gain buy-in for proposed ETL solutions, ensuring project objectives are met.
  • Demonstrated ability to work effectively within a multidisciplinary team environment (comprising internal staff, vendors, and cross-functional departments), fostering a collaborative atmosphere and successfully integrating individual contributions into a cohesive project outcome.


Knowledge Transfer:

The following artefacts must be transferred:

ETL Solution Design Documentation:

  • Detailed ETL Process Flows/Pipelines (including end-to-end data flow from source to D365/data warehouse).
  • Comprehensive Data Mapping Specifications (source-to-target mappings, transformations, data types, cleansing logic).
  • Integration Architecture Diagrams (showing connections between D365, the data warehouse, and other systems).
  • Documentation of Data Governance & Quality Rules (validation, error handling, reconciliation).

ETL Development Artefacts:

  • Fully commented and version-controlled ETL code and scripts (Azure Data Factory pipelines, SSIS packages, SQL scripts, custom code).
  • Deployment and Configuration Guides (step-by-step instructions for each environment, including environment-specific settings).
  • Performance Optimization and Monitoring Artefacts (tuning strategies, monitoring queries/dashboards).

Testing and Validation Assets:

  • ETL Test Plans and Test Cases (strategy; unit, integration, and data validation test cases).
  • Sample Test Data and Data Validation Scripts (examples of test data; scripts/queries for integrity validation).
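A data validation script of the kind listed above often reduces to a count-and-checksum reconciliation between source and target. The sketch below uses an in-memory SQLite database with invented table and column names purely to show the pattern; against SQL Server or Azure SQL the same queries would run through a different driver:

```python
import sqlite3

# In-memory stand-ins for a source extract and a target load (illustrative).
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE source_clients (client_id INTEGER, balance REAL);
    CREATE TABLE target_clients (client_id INTEGER, balance REAL);
    INSERT INTO source_clients VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO target_clients VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def reconcile(con, source, target):
    """Compare row counts and a simple SUM checksum between two tables."""
    checks = {}
    for name, table in (("source", source), ("target", target)):
        count, total = con.execute(
            f"SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM {table}"
        ).fetchone()
        checks[name] = (count, round(total, 2))
    return checks["source"] == checks["target"], checks

ok, checks = reconcile(con, "source_clients", "target_clients")
print(ok, checks)  # True {'source': (3, 425.75), 'target': (3, 425.75)}
```

Count-plus-checksum catches dropped and mutated rows cheaply; a production script would typically add per-key comparisons for rows the aggregate check flags.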

Operational Runbooks & Troubleshooting Guides:

  • Daily Operations Runbook (procedures for routine monitoring, scheduling, and common tasks).
  • Troubleshooting Guides (identifying/resolving job failures, discrepancies, and performance bottlenecks; escalation paths).

Project-Specific Documentation:

  • Key Data Integration Decision Logs (rationale for significant design choices).
  • Technical Presentations & Walkthroughs (materials used for deep-dives, potentially recorded sessions).
  • Relevant technical aspects of Status and Progress Reports (detailing technical progress, challenges, and resolutions).



Employment Type

Full Time
