
Data Architect Data Ingestion, ETL, Cloud

Jobs by Experience: 5 years

Job Location: Toronto - Canada

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

Role Description:
We are looking for a seasoned Technical Architect with deep expertise in ETL design, data ingestion architecture, and end-to-end solutioning.
The ideal candidate will be responsible for designing scalable, efficient data solutions, driving best practices across the data lifecycle, and contributing to strategic initiatives.
Experience with Order and Trade data is a plus.

Key Responsibilities:
Lead the design and architecture of data ingestion and ETL pipelines across multiple platforms.
Define and implement solutioning strategies for complex data integration and transformation use cases.
Collaborate with cross-functional teams to gather requirements and translate them into technical designs.
Establish and promote best practices in data engineering, including data quality, governance, and performance optimization.
Architect solutions for real-time and batch data ingestion using modern data platforms and tools.
Ensure alignment with enterprise architecture standards and data security policies.
Provide technical leadership and guidance to development teams throughout the project lifecycle.
Document architecture decisions, data flow diagrams, and integration patterns.

Required Skills & Experience:
8 years of experience in data architecture, ETL development, and solution design.
Strong hands-on experience with ETL tools (e.g., Informatica, Talend, Apache NiFi, Azure Data Factory).
Proficiency in cloud data platforms (Azure, AWS, GCP).
Solid understanding of data ingestion frameworks, data lakes, and data warehousing.
Expertise in SQL, Python, or Scala for data processing and transformation.
Familiarity with streaming technologies (e.g., Kafka, Spark Streaming).
Strong communication and stakeholder management skills.

Nice to Have:
Experience working with Order and Trade data in financial services or capital markets.
Knowledge of industry best practices for handling financial transaction data.
Understanding of regulatory compliance and data privacy in financial domains.

Competencies: Amazon Web Services (AWS), Cloud Computing, Python for Data Science, Informatica CDI (Cloud Data Integration)
Experience (Years): 4-6



Employment Type

Full Time
