General Responsibilities
- Works in partnership with clients, advising them on information technology in order to meet their business objectives or overcome problems, and works to improve the structure and efficiency of an organization's I&IT systems.
- The I&IT Consultant may be used to provide strategic guidance to organizations with regard to Information Management and IT technology, IT infrastructures, and the enablement of major business processes through enhancements to IT.
- Provides subject matter expertise in their field and highly expert technical assistance.

General Skills
- Acts as the technical advisor/expert on all aspects of a specific deliverable
- Provides quality assurance/quality control of specific deliverables
- Anticipates and resolves problems to ensure that deliverables are completed within budget and to the highest quality, meeting or exceeding expectations
- Develops processes and procedures for implementing deliverables
- Prepares reports and presentations, including options, recommendations, implementation plans, etc.
- Works with clients to define the scope of a project and to determine requirements
- Defines software, hardware, and network requirements
- Analyzes I&IT requirements, giving independent and objective advice on the use of I&IT
- Designs, tests, installs, and monitors new systems; develops solutions and implementations for new systems
- Familiar with change management principles and methodology
- Knowledge and understanding of Information Management principles, concepts, policies, and practices
Additional Responsibilities
This role will focus on data architecture, data warehousing, data lakes, and analytics. The individual will design, develop, maintain, and optimize ETL (Extract, Transform, Load) processes in Databricks for data warehousing, data lakes, and analytics. The individual will work closely with data architects and business teams to ensure the efficient transformation and movement of data to meet business needs, including handling Change Data Capture (CDC) and streaming data.
- Review business requirements; familiarize with and understand business rules and the transactional data model.
- Define the conceptual, logical, and physical models and the mapping from data source to curated model and data mart.
- Analyze requirements and recommend changes to the physical model.
- Develop scripts for the physical model; create the database and/or Delta Lake file structure.
- Access Oracle DB environments and set up the necessary tools for developing the solution.
- Implement data design methodologies, historical and dimensional models.
- Perform data profiling, assess data accuracy, and design and document data quality and master data management rules (a minimal profiling sketch follows this list).
- Functionality review, data load review, performance review, and data consistency checks.
- Help troubleshoot data mart design issues.
- Review performance of ETL with developers and suggest improvements
- Participate in end-to-end integrated testing for Full Load and Incremental Load and advise on issues.
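For illustration only, a minimal data-profiling pass of the kind described above, assuming a Databricks notebook where `spark` is provided by the runtime; the table name and columns are placeholders rather than actual solution objects:

```python
from pyspark.sql import functions as F

# Placeholder staging table; substitute the actual source/staged table name.
df = spark.table("staging.encounters")

total_rows = df.count()

# Per-column null counts and distinct counts as a first-pass accuracy assessment.
profile = df.select(
    [F.count(F.when(F.col(c).isNull(), c)).alias(f"{c}_nulls") for c in df.columns]
    + [F.countDistinct(c).alias(f"{c}_distinct") for c in df.columns]
)

print(f"Total rows: {total_rows}")
profile.show(truncate=False)
```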
Tools used are:
- Azure Databricks, Delta Lake, Delta Live Tables, and Spark to process structured and unstructured data.
- Azure Databricks/PySpark (good Python/PySpark knowledge required) to build transformations of raw data into the curated zone in the data lake (see the sketch after this list).
- Azure Databricks/PySpark/SQL (good SQL knowledge required) to develop and/or troubleshoot transformations of curated data into FHIR.
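As an illustrative sketch of a raw-to-curated transformation of this kind, the table names, business key, and cleansing rules below are assumptions rather than the actual source-to-target mappings:

```python
from pyspark.sql import functions as F

# Placeholder raw and curated table names; the real mappings come from the
# source-to-target documentation.
raw = spark.read.table("raw.encounters")

curated = (
    raw
    .dropDuplicates(["encounter_id"])                    # de-duplicate on the business key
    .withColumn("admit_date", F.to_date("admit_date"))   # standardize data types
    .withColumn("load_ts", F.current_timestamp())        # add an audit column
    .filter(F.col("encounter_id").isNotNull())           # drop rows missing the key
)

(curated.write
    .format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("curated.encounters"))
```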
Requirements
Data design
- Understand the requirements. Recommend changes to models to support ETL design.
- Define primary keys, indexing strategies, and relationships that enhance data integrity and performance across layers.
- Define the initial schemas for each data layer.
- Assist with data modelling and updates of source-to-target mapping documentation.
- Document and implement schema validation rules to ensure incoming data conforms to expected formats and standards.
- Design data quality checks within the pipeline to catch inconsistencies, missing values, or errors early in the process (a minimal validation sketch follows this list).
- Proactively communicate with business and IT experts on any changes required to conceptual, logical, and physical models; communicate and review timelines, dependencies, and risks.
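A minimal sketch of schema validation and rule-based quality checks in PySpark, assuming a CSV landing zone; the schema, path, rules, and table names are illustrative assumptions, not the actual design:

```python
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DateType

# Hypothetical expected schema; the real schemas come from the physical model.
expected_schema = StructType([
    StructField("patient_id", StringType(), nullable=False),
    StructField("encounter_id", StringType(), nullable=False),
    StructField("admit_date", DateType(), nullable=True),
])

# Enforce the schema on read; PERMISSIVE mode keeps malformed rows so they can be
# caught by the rule checks below (FAILFAST would abort the load instead).
incoming = (spark.read
    .schema(expected_schema)
    .option("mode", "PERMISSIVE")
    .csv("/mnt/landing/encounters/"))   # illustrative landing path

# Rule-based quality checks: required keys present, admit_date not in the future.
checks = (
    F.col("patient_id").isNotNull()
    & F.col("encounter_id").isNotNull()
    & (F.col("admit_date").isNull() | (F.col("admit_date") <= F.current_date()))
)

valid = incoming.filter(checks)
rejected = incoming.filter(~checks)

valid.write.format("delta").mode("append").saveAsTable("curated.encounters")
rejected.write.format("delta").mode("append").saveAsTable("quality.encounters_rejected")
```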
Development of ETL strategy and solution for different sets of data modules
- Understand the Tables and Relationships in the data model.
- Create low-level design documents and test cases for ETL development.
- Implement error-catching, logging, retry mechanisms, and handling of data anomalies.
- Create the workflow and pipeline design.
- Develop and test data pipelines with Incremental and Full Load (see the sketch after this list).
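For illustration, a sketch of the two load patterns using the Delta Lake MERGE API; the table names and business key are assumptions:

```python
from delta.tables import DeltaTable

# Placeholder target and staging tables; keys come from the data model.
target = DeltaTable.forName(spark, "curated.encounters")

def incremental_load():
    # Incremental load: CDC-style upsert of changed rows on the business key.
    updates = spark.table("staging.encounters_changes")
    (target.alias("t")
        .merge(updates.alias("s"), "t.encounter_id = s.encounter_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

def full_load():
    # Full load: rebuild the table from a complete source extract.
    (spark.table("staging.encounters_full")
        .write.format("delta")
        .mode("overwrite")
        .saveAsTable("curated.encounters"))
```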
Develop high-quality ETL mappings/scripts/notebooks
- Develop and maintain the pipeline from the Oracle data source to Azure Delta Lake and FHIR.
- Perform unit testing (a unit-test sketch follows this list).
- Ensure performance monitoring and improvement.
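As one way unit testing might look for a transformation pulled out of a notebook into a testable function (pytest-based, with a local Spark session; the function and data are hypothetical):

```python
import pytest
from pyspark.sql import SparkSession, functions as F

# Hypothetical transformation under test, extracted so it can be exercised
# against a small in-memory DataFrame.
def standardize_encounters(df):
    return (df.dropDuplicates(["encounter_id"])
              .filter(F.col("encounter_id").isNotNull()))

@pytest.fixture(scope="module")
def spark():
    return SparkSession.builder.master("local[1]").appName("etl-unit-tests").getOrCreate()

def test_standardize_removes_duplicates_and_nulls(spark):
    data = [("E1", "P1"), ("E1", "P1"), (None, "P2")]
    df = spark.createDataFrame(data, ["encounter_id", "patient_id"])
    result = standardize_encounters(df)
    assert result.count() == 1
    assert result.first()["encounter_id"] == "E1"
```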
Performance review and data consistency checks
- Troubleshoot performance issues and ETL issues; log activity for each pipeline and transformation (a minimal logging sketch follows this list).
- Review and optimize overall ETL performance.
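A rough sketch of per-step activity logging to a Delta audit table plus routine Delta compaction; the audit table, pipeline names, and OPTIMIZE target are assumptions:

```python
import time
from pyspark.sql import Row, functions as F

# Minimal activity-logging helper; the audit table name is illustrative.
def log_step(pipeline, step, started, status):
    spark.createDataFrame(
        [Row(pipeline=pipeline, step=step, started=started,
             duration_s=time.time() - started, status=status)]
    ).withColumn("logged_at", F.current_timestamp()) \
     .write.format("delta").mode("append").saveAsTable("audit.etl_activity")

started = time.time()
try:
    spark.table("raw.encounters").count()          # placeholder for a real transformation
    log_step("encounters", "curate", started, "succeeded")
except Exception:
    log_step("encounters", "curate", started, "failed")
    raise

# Periodic file compaction and data-skipping optimization on large Delta tables.
spark.sql("OPTIMIZE curated.encounters ZORDER BY (encounter_id)")
```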
End-to-end integrated testing for Full Load and Incremental Load
- Plan for Go Live Production Deployment.
- Create production deployment steps.
- Configure parameters and scripts for go-live; test and review the instructions (see the parameterization sketch after this list).
- Create release documents and help build and deploy code across servers.
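One possible way to keep go-live configuration parameter-driven in a Databricks notebook is via widgets (available only in the Databricks runtime); the parameter names, defaults, and paths below are illustrative assumptions:

```python
# Hypothetical go-live parameters exposed as notebook widgets so the same notebook
# can be promoted across environments without editing code.
dbutils.widgets.text("environment", "prod")
dbutils.widgets.text("source_jdbc_url", "")
dbutils.widgets.text("target_catalog", "curated")

environment = dbutils.widgets.get("environment")
target_catalog = dbutils.widgets.get("target_catalog")

# Environment-specific behaviour is driven by parameters, not hard-coded values.
checkpoint_root = f"/mnt/checkpoints/{environment}"
target_table = f"{target_catalog}.encounters"
print(f"Deploying to {environment}: writing to {target_table}, checkpoints at {checkpoint_root}")
```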
Go Live Support and Review after Go Live.
- Review the existing ETL process and tools and provide recommendations on improving performance and reducing ETL timelines.
- Review infrastructure and remediate issues for overall process improvement.
Knowledge Transfer to Ministry staff and development of documentation on the work completed.
- Document the work and share the end-to-end ETL design, troubleshooting steps, configuration, and scripts for review.
- Transfer documents and scripts to the Ministry and review the documents with Ministry staff.
Must Have Skills
- 7 years using ETL tools such as Microsoft SSIS, stored procedures, and T-SQL
- 2 years with Delta Lake, Databricks, and Azure Databricks pipelines
- Strong knowledge of Delta Lake for data management and optimization.
- Familiarity with Databricks Workflows for scheduling and orchestrating tasks.
- 2 years Python and PySpark
- Solid understanding of the Medallion Architecture (Bronze, Silver, Gold) and experience implementing it in production environments (a layered sketch follows this list).
- Hands-on experience with CDC tools (e.g., GoldenGate) for managing real-time data.
- SQL Server, Oracle
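For orientation only, a compressed sketch of a Bronze → Silver → Gold flow on Delta Lake; the paths, table names, and columns are placeholders, not the actual design:

```python
from pyspark.sql import functions as F

# Bronze: land source data as-is, with audit columns only (illustrative path).
bronze = (spark.read.format("csv").option("header", "true")
          .load("/mnt/landing/encounters/")
          .withColumn("ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").saveAsTable("bronze.encounters")

# Silver: cleansed, de-duplicated, conformed records.
silver = (spark.table("bronze.encounters")
          .dropDuplicates(["encounter_id"])
          .filter(F.col("encounter_id").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.encounters")

# Gold: business-level aggregates for reporting and analytics.
gold = (spark.table("silver.encounters")
        .groupBy("facility_id")
        .agg(F.count("*").alias("encounter_count")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.encounter_counts")
```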