Addepto is a leading AI consulting and data engineering company that builds scalable, ROI-focused AI solutions for some of the world's largest enterprises and pioneering startups, including Rolls-Royce, Continental, Porsche, ABB, and WGU. With an exclusive focus on Artificial Intelligence and Big Data, Addepto helps organizations unlock the full potential of their data through systems designed for measurable business impact and long-term growth.
The company's work extends beyond client engagements. Drawing on real-world challenges and insights, Addepto has developed its own product, ContextClue, and actively contributes open-source solutions to the AI community. This commitment to transforming practical experience into scalable innovation has earned Addepto recognition by Forbes as one of the top 10 AI consulting companies worldwide.
As part of KMS Technology, a US-based global technology group, Addepto combines deep AI specialization with enterprise-scale delivery capabilities, enabling the partnership to move clients from AI experimentation to production impact, securely and at scale.
As a Data Engineer, you will have the opportunity to support and further develop an Azure-based data integration solution built primarily around Azure Data Factory (ADF). The current environment includes Azure Functions and ingestion components, but daily delivery is strongly centered on ADF pipeline design, orchestration, monitoring, and continuous improvement.
The project focuses on expanding and stabilizing the existing data platform, including the warehouse layer (Azure SQL Database or Synapse), optional dbt-based transformations, and occasional Power BI reporting support. You will work closely with the client-side Product Owner and Architect, proactively aligning business needs with technical implementation decisions and ensuring high-quality, scalable solutions. Additionally, you will contribute to Azure DevOps CI/CD pipelines and release processes to maintain reliable deployments across environments.
Your main responsibilities:
Design, develop, and maintain Azure Data Factory pipelines, including orchestration, parameterization, and trigger management.
Configure and manage linked services and datasets within ADF.
Monitor, troubleshoot, and optimize ADF pipelines to ensure performance and reliability.
Develop and maintain ETL/ELT processes and support the evolution of the data warehouse layer (Azure SQL Database or Synapse).
Translate business requirements into technical solutions in close collaboration with the Product Owner and Architect.
Develop and maintain Python-based components (e.g., Azure Functions, API integrations, automation scripts).
Contribute to CI/CD processes in Azure DevOps, including pipelines, releases, and environment promotion.
Support occasional Power BI reporting and dashboarding needs.
Ensure proactive communication, stakeholder alignment, and visibility of risks and impacts.
Take ownership of assigned tasks and actively contribute to continuous platform improvement.
What you'll need to succeed in this role:
Strong hands-on experience with Azure Data Factory (must-have), including: pipeline and orchestration design; linked services, datasets, triggers, and parameterization; operational monitoring, troubleshooting, and performance tuning; and deployment-aware ADF development in enterprise environments.
Excellent knowledge of Python (Azure Functions, REST APIs, automation).
Strong SQL skills and solid understanding of data modeling for ETL/ELT and warehouse workloads.
Experience with CI/CD processes in Azure DevOps (pipelines, releases, multi-environment deployments).
Solid understanding of Azure services and cloud-based data solutions (e.g., Azure SQL Database, Azure Key Vault).
Experience with Power BI for occasional dashboarding and reporting.
Experience working with modern development practices and tools.
Consulting mindset with proactive communication and strong stakeholder alignment skills.
Ability to collaborate effectively with the Product Owner and Architect during planning and delivery.
Independent and responsible approach to delivering high-quality solutions.
Excellent command of English (at least C1 level).
Nice to have:
Knowledge of dbt.
Experience implementing data warehouses on Azure.
Discover our perks & benefits:
Work in a supportive team of passionate enthusiasts of AI & Big Data.
Engage with top-tier global enterprises and cutting-edge startups on international projects.
Enjoy flexible work arrangements, allowing you to work remotely or from modern offices and coworking spaces.
Accelerate your professional growth through career paths, knowledge-sharing initiatives, language classes, and sponsored training or conferences, including a partnership with Databricks, which offers industry-leading training materials and certifications.
Choose your preferred form of cooperation, B2B or a contract of mandate, and make use of 20 fully paid days off.
Participate in team-building events and utilize the integration budget.
Celebrate work anniversaries birthdays and milestones.
Access medical and sports packages eye care and well-being support services including psychotherapy and coaching.
Get full work equipment for optimal productivity including a laptop and other necessary devices.
With our backing, you can boost your personal brand by speaking at conferences, writing for our blog, or participating in meetups.
Experience a smooth onboarding with a dedicated buddy and start your journey in our friendly, supportive, and autonomous culture.
Required Experience:
IC
We specialize in delivering custom-made AI solutions and Machine Learning services tailored to even the most niche industries.