The data engineer is responsible for designing, implementing, and managing data pipelines and establishing data models to deliver reliable information to the company's various business lines.
Joining the data engineering team means growing in an environment where autonomy and initiative are valued. Here you have the freedom to propose ideas, demonstrate resourcefulness, and embrace a results-oriented approach.
Responsibilities:
Design, develop, and maintain cloud-based data processing and storage solutions.
Design and implement robust data pipelines (ETL/ELT) for ingesting, transforming, enriching, and distributing data from various sources (structured, semi-structured, unstructured).
Collaborate with stakeholders (business teams, analysts, data scientists, architects) to understand business needs and translate them into appropriate technical solutions.
Participate in defining the cloud data strategy: develop roadmaps, migration plans, gradual adoption of native cloud services, and modernization of existing systems.
Ensure data quality, consistency, security, and integrity through controls, automated testing, and continuous monitoring of data flows.
Stay up to date on cloud solutions architecture, Ops practices (DataOps/MLOps), and AWS technologies, as well as their interdependencies.
Implement observability and logging practices (logging, monitoring, alerting) for data pipelines and infrastructure.
Identify and adopt new tools and techniques to improve performance, automation, and scalability.
Document solutions, data flows, and technical decisions to ensure the sustainability and maintainability of implemented solutions.
Qualifications:
Studies in Computer Science or a related field, or equivalent professional experience
At least three (3) years of experience in data engineering, including designing data pipelines, large-scale data processing, and integrating diverse data sources
Experience building and implementing AWS infrastructures
Proficiency with pipeline orchestration tools (e.g., Airflow, AWS Step Functions, dbt) and data processing languages (Python, SQL, Spark, etc.)
Experience configuring and tuning virtual private clouds
Knowledge of best practices in data security, monitoring, and quality (e.g., encryption, restricted access, data lineage)
Experience with infrastructure as code (Terraform, CloudFormation, or equivalent) and DevOps practices (CI/CD, Git, automation)
Strong analytical and problem-solving skills
Self-starter with the ability to work independently or as part of a project team
Technical curiosity, openness to learning new technologies, and a drive for continuous improvement
Ability to work in an agile environment, collaborate with multidisciplinary teams, and adapt to changing priorities
Bilingual in English and French (spoken and written)
THE TECHNOLOGY STACK THAT AWAITS YOU:
AWS
Databricks
Python, SQL
Terraform
Additional Information:
EQUAL OPPORTUNITY EMPLOYER
At VOSKER, we value the uniqueness of every individual and celebrate the diversity that helps us redefine what's possible. We foster collaboration in a healthy, inclusive work environment where all voices are heard.
If you have specific needs to make the recruitment process more accessible, don't hesitate to reach out.
Now it's your turn: tell us about yourself and apply today!
Remote Work:
Yes
Employment Type:
Full-time
VOSKER is a North American leader in remote-area surveillance. Every day, we pride ourselves on helping our customers keep an eye on what really matters to them by developing solar-powered, cellular-connected cameras on our proprietary platform.