Data Engineer AIML


Job Location:

Toronto - Canada

Monthly Salary: CAD 10 - 10
Experience Required: 5 years
Posted on: 8 hours ago
Vacancies: 1 Vacancy

Job Summary

Data Engineer
Toronto - Hybrid (3-4 days in office)
Primary Skill
Digital: Python, AI & Gen AI - Products & Tools

Role Summary
The Data Engineer, Regulatory Reporting, is responsible for designing, building, and maintaining data pipelines and AI-powered solutions that support regulatory reporting requirements. This role combines strong data engineering foundations with emerging GenAI and ML technologies to ensure accurate, timely, and compliant reporting across the enterprise. The position also includes training on, and work with, the Axiom regulatory reporting tool, supporting automation and data quality efforts, and collaborating with technical and business stakeholders.

Key Responsibilities
1) Design & Development
Design, develop, and maintain large-scale data pipelines and data architectures using Python.
Integrate GenAI models (e.g., ChatGPT) to enhance data processing and reporting automation.
Build scalable, reusable, and secure data solutions aligned with regulatory reporting needs.
2) Data Analysis
Perform data analytics to extract insights from large datasets.
Support regulatory reporting teams with data investigations, validation, and root cause analysis.
3) Model Development
Develop and deploy AI/ML models using GenAI technologies, with a focus on NLP and machine learning.
Apply models to streamline and enhance regulatory reporting workflows.
4) Axiom Vendor Tool (Training Provided)
Receive training on and work with the Axiom regulatory reporting tool.
Integrate Axiom with existing data pipelines and support ongoing regulatory reporting requirements.
Strong SQL knowledge required to effectively learn and use Axiom.
5) Additional Technical Skills
Cohere Model Experience (Nice to Have): Ability to leverage Cohere models for NLP use cases.
Unix Experience (Nice to Have): Familiarity with Unix systems for automation and Axiom-related tasks.
6) Problem Solving & Optimization
Troubleshoot data pipeline failures and data quality issues.
Optimize data processing performance and ensure end-to-end data accuracy for reporting.
7) Documentation
Maintain clear and up-to-date documentation covering data pipelines, architectures, integration logic, and ML models.
Support knowledge sharing across engineering and compliance teams.
8) Continuous Improvement
Stay current with trends in GenAI, AI/ML, Python engineering, regulatory reporting, and data tooling.
Recommend and implement improvements to data quality automation and regulatory workflows.

Requirements
Experience
4-6 years of experience in data engineering with strong exposure to Python and GenAI technologies.
Hands-on experience using ChatGPT or other GenAI models.
Strong SQL expertise and experience working with large datasets.
Technical Skills
Python, SQL, and data pipeline engineering.
Understanding of AI/ML fundamentals and NLP models.
Experience with Unix (preferred).
Familiarity with Cohere models (nice to have).
Ability to analyze logs, troubleshoot issues, and support production pipelines.


Required Skills:

Experience (Years): 8-10


Company Industry

IT Services and IT Consulting

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala