About Alphanext
Alphanext is a global talent solutions company with offices in London, Pune, and Indore. We connect top-tier technical talent with forward-thinking organizations to drive innovation and transformation through technology.
Role Overview
We are looking for a Senior Data Integration Engineer to lead the design, build, and governance of scalable, high-performance data pipelines across enterprise systems. The ideal candidate brings deep experience in data engineering and integration, especially within manufacturing, retail, and supply chain ecosystems. This role is instrumental in ensuring near-real-time data flows, robust data quality, and seamless integration between ERP, WMS, commerce, and finance platforms, enabling AI and analytics capabilities across the enterprise.
Key Responsibilities
Design and maintain ELT/ETL pipelines integrating systems such as BlueCherry ERP, Manhattan WMS, and Shopify Plus
Build event-driven architectures using Azure Service Bus, Kafka, or Event Hubs for real-time data streaming
Define and publish data contracts and schemas (JSON/Avro) in the enterprise Data Catalog to ensure lineage and governance (a minimal publishing sketch follows this list)
Automate reconciliation processes with workflows that detect mismatches, raise alerts, and track data-quality SLAs
Lead code reviews, establish integration playbooks, and mentor onshore/offshore engineering teams
Collaborate with the Cybersecurity team to implement encryption, PII masking, and audit-compliant data flows
Enable AI and analytics pipelines, including feeds for feature stores and streaming ingestion, to support demand forecasting and GenAI use cases
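To make the event-driven and data-contract responsibilities concrete, here is a minimal sketch of publishing a contract-versioned event, assuming the confluent-kafka Python client; the broker address, topic name, and payload fields are hypothetical and not an actual Alphanext contract.

```python
# Minimal sketch, assuming a local Kafka broker and a hypothetical
# "bluecherry.inventory.v1" topic; payload fields are illustrative.
import json
from datetime import datetime, timezone

from confluent_kafka import Producer  # pip install confluent-kafka

producer = Producer({"bootstrap.servers": "localhost:9092"})

event = {
    "schema_version": "1.0",  # versioned data contract for the catalog
    "event_type": "inventory.adjusted",
    "occurred_at": datetime.now(timezone.utc).isoformat(),
    "payload": {"sku": "TSHIRT-001", "warehouse": "IND-01", "qty_delta": -12},
}

def on_delivery(err, msg):
    # Surface delivery failures so downstream reconciliation can flag gaps.
    if err is not None:
        print(f"delivery failed: {err}")

producer.produce(
    "bluecherry.inventory.v1",
    key=event["payload"]["sku"].encode(),
    value=json.dumps(event).encode(),
    callback=on_delivery,
)
producer.flush()
```

The same pattern maps onto Azure Service Bus or Event Hubs by swapping the client; the explicit schema_version field is what lets consumers and the Data Catalog track contract evolution.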
Year-One Deliverables
Replace the existing nightly CSV-based exchange between BlueCherry and the WMS with a near-real-time event-bus integration
Launch a unified product master API feeding PLM, OMS, and e-commerce within 6 months
Automate three-way reconciliation of PO, packing list, and warehouse receipt to support traceability audits (e.g., BCI cotton); a reconciliation sketch follows this list
Deploy a data-quality dashboard with rule-based alerts and SLA-tracking metrics
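A minimal sketch of the three-way match described above, assuming pandas DataFrames keyed by PO number and SKU; column names and the alerting hook are illustrative.

```python
# Three-way match: PO vs. packing list vs. warehouse receipt.
import pandas as pd

po = pd.DataFrame({"po": ["P1"], "sku": ["TSHIRT-001"], "qty_ordered": [100]})
packing = pd.DataFrame({"po": ["P1"], "sku": ["TSHIRT-001"], "qty_shipped": [100]})
receipt = pd.DataFrame({"po": ["P1"], "sku": ["TSHIRT-001"], "qty_received": [96]})

# Outer joins keep rows present in only one source, which are
# themselves mismatches worth alerting on.
merged = (
    po.merge(packing, on=["po", "sku"], how="outer")
      .merge(receipt, on=["po", "sku"], how="outer")
)

mismatches = merged[
    (merged["qty_ordered"] != merged["qty_shipped"])
    | (merged["qty_shipped"] != merged["qty_received"])
]

for row in mismatches.itertuples(index=False):
    # In production this would raise an alert and feed the SLA dashboard.
    print(f"MISMATCH {row.po}/{row.sku}: ordered={row.qty_ordered} "
          f"shipped={row.qty_shipped} received={row.qty_received}")
```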
Must-Have Technical Skills
5 years' experience in data engineering or integration-focused roles
Experience with at least two of the following: Azure Data Factory, Databricks, Kafka/Event Hubs, DBT, SQL Server, Logic Apps, Python
Strong SQL proficiency and experience with a compiled or scripting language (Python, C#, or Java)
Proven experience integrating ERP, WMS, PLM, or similar retail/manufacturing systems
Expertise in data modeling, schema design (JSON/Avro), and schema versioning (an Avro versioning sketch follows this list)
Working knowledge of CI/CD pipelines and infrastructure-as-code using tools like GitHub Actions and Azure DevOps
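For the schema-versioning skill above, a minimal sketch of backward-compatible Avro evolution, assuming the fastavro library; the record name and fields are hypothetical.

```python
from fastavro import parse_schema  # pip install fastavro

# v1 of a hypothetical product-master contract.
product_v1 = {
    "type": "record",
    "name": "ProductMaster",
    "fields": [
        {"name": "sku", "type": "string"},
        {"name": "description", "type": "string"},
    ],
}

# v2 adds a field WITH a default, so consumers reading v1 data against
# the v2 schema still resolve -- the standard Avro evolution rule.
product_v2 = {
    "type": "record",
    "name": "ProductMaster",
    "fields": [
        {"name": "sku", "type": "string"},
        {"name": "description", "type": "string"},
        {"name": "fabric_origin", "type": ["null", "string"], "default": None},
    ],
}

# parse_schema validates both definitions before they are published
# to a schema registry or the enterprise Data Catalog.
parse_schema(product_v1)
parse_schema(product_v2)
```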
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field (preferred)
Excellent problem-solving skills, an analytical mindset, and attention to data governance
Strong communication and leadership skills with a track record of mentoring and team collaboration
Full Time