Data and Integration Engineer

Job Location: India

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

  • Assist in configuring and maintaining integration workflows using middleware tools to support data exchange between Tango's API architecture and third-party systems.
  • Support ETL processes, including data extraction, transformation, and loading, following Tango's proprietary data migration methodologies, incorporating data analysis of input, internal, and output message data.
  • Participate in testing and validating integrations, ensuring data quality, asynchronous processing, and polling-based synchronization meet client requirements; plan and lead integration tests with external systems.
  • Collaborate on low-touch implementations by leveraging standard API endpoints and flat-file transfers (e.g., SFTP-based), deploying standard integrations and providing daily operational support.
  • Provide Level 3 production support for integration-related issues, including root-cause analysis and remediation within defined SLAs; serve as the primary coordinator for internal engineering resources.
  • Contribute to documentation updates for integration playbooks, Swagger files, user guides, test procedures, performance specifications, and product manuals to enhance self-service capabilities.
  • Assist in estimating Level of Effort (LOE) for custom integrations during SOW/CO development and client engagements; prepare data templates based on Source-to-Target Mapping (STM) documents.
  • Perform data transformations, including merging, ordering, aggregation, and resolving data cleansing/quality issues, using various data processing tools; add/remove columns in ETL scripts (e.g., Kettle) using STM/data templates (a sketch of such a step follows this list).
  • Run lookups and queries, update lookups, execute data-quality scripts, format and validate data-quality reports, and run scripts on data in staging databases while updating data load checklists.
  • Conduct internal smoke testing of loaded data in the Tango application and prepare/add/remove columns in sample business data validation trackers using STM.
  • Integrate information from multiple data sources, solving common transformation problems, and communicate effectively with managers and project teams regarding data sets and reporting needs.
  • Engage in agile iterations to refine transformation routines and business rules, prioritizing critical-path data elements while understanding business-domain context to clean and transform data for analysis.
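The transformation duties above (merging, aggregation, cleansing, STM-driven column changes) can be illustrated with a minimal Python/pandas sketch. All file names, column names, and the mapping itself are hypothetical placeholders, not Tango's actual schema or tooling, which per this posting may instead be Kettle/Pentaho:

    # Sketch of one merge/aggregate/cleanse step driven by a hypothetical STM.
    import pandas as pd

    # Extract: read two source extracts (file and column names are illustrative).
    leases = pd.read_csv("leases.csv")   # lease_id, site_id, monthly_rent
    sites = pd.read_csv("sites.csv")     # site_id, region

    # Transform: merge the sources, then aggregate rent by region.
    merged = leases.merge(sites, on="site_id", how="left")
    summary = (
        merged.groupby("region", as_index=False)
              .agg(total_rent=("monthly_rent", "sum"),
                   lease_count=("lease_id", "count"))
    )

    # Quality check: rows whose site lookup failed go to an exceptions file.
    unmatched = merged[merged["region"].isna()]
    if not unmatched.empty:
        unmatched.to_csv("unmatched_sites.csv", index=False)

    # Load: write the target file named in the (hypothetical) STM.
    summary.to_csv("rent_by_region.csv", index=False)

In a Kettle-based pipeline the same step would be expressed as transformation steps rather than code, but the STM still determines which columns are added, renamed, or dropped.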

Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, Data Science, Mathematics, Statistics, or a related quantitative field (or equivalent experience).
  • 3-6 years of professional experience in software development, integrations, or data wrangling, with at least 3 years of SQL proficiency.
  • Proficiency in JavaScript, with a solid understanding of scripting for data manipulation and automation; 2 years of experience with Kettle (Pentaho) or similar ETL/data processing tools.
  • Hands-on experience with any ETL/reporting tool and with manual data analysis using Excel or other spreadsheet tools.
  • Basic knowledge of RESTful APIs, asynchronous processing, and data formats (e.g., JSON, CSV), as illustrated in the sketch after this list; understanding of both waterfall and agile project management methodologies.
  • Proficient in Microsoft Office applications such as Word, Excel, and PowerPoint.
  • Excellent analytical and problem-solving skills, with the ability to perform root-cause analysis; strong attention to detail and prioritization skills to handle multiple tasks in a fast-paced environment.
  • Strong written and verbal communication skills; aptitude for learning new technologies.
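To ground the REST/asynchronous-processing item above, here is a minimal sketch of the polling-based synchronization this role describes. The endpoint URL, query parameter, and JSON field names are hypothetical placeholders, not a documented Tango API:

    # Sketch of polling-based synchronization against a REST API.
    import time
    import requests

    API_URL = "https://api.example.com/v1/records"  # placeholder endpoint
    cursor = None                                   # last-seen change marker

    def poll_once(cursor):
        # Fetch records changed since `cursor`; return (records, new cursor).
        params = {"changed_since": cursor} if cursor else {}
        resp = requests.get(API_URL, params=params, timeout=30)
        resp.raise_for_status()
        body = resp.json()
        return body.get("records", []), body.get("next_cursor", cursor)

    while True:
        records, cursor = poll_once(cursor)
        for rec in records:
            print("would process:", rec)  # hand off to the ETL/transform step
        time.sleep(60)                    # poll interval per integration spec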

Preferred Qualifications:

  • Experience building scalable backend services and handling data pipelines; 3 years of scripting experience for data analysis in languages such as SQL, Python, or R.
  • Familiarity with Node-RED or similar low-code/no-code, flow-based tools for middleware and integration workflows.
  • Demonstrated experience in data wrangling, including extracting and transforming different types of data files (e.g., CSV, XLSX) into analyzable structures using reproducible coding and scripting techniques.
  • Exposure to ETL tools, data migration processes, or middleware architectures (e.g., Node-RED in a production environment).
  • Understanding of security protocols such as JWT authentication, PGP encryption, or SAML 2.0 for SSO (a brief JWT sketch follows this list).
  • Prior experience in SaaS environments, particularly IWMS or enterprise software integrations.
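As a concrete reference for the JWT item above, the following is a minimal sketch of calling a JWT-protected REST endpoint using PyJWT and requests. The signing secret, claims, and endpoint are hypothetical placeholders; a real integration would follow the counterparty's documented auth scheme:

    # Sketch of an HTTP call authenticated with a short-lived JWT.
    import time
    import jwt        # PyJWT
    import requests

    SECRET = "shared-signing-secret"  # placeholder; load from a vault in practice
    token = jwt.encode(
        {"iss": "integration-client", "exp": int(time.time()) + 300},
        SECRET,
        algorithm="HS256",
    )
    resp = requests.get(
        "https://api.example.com/v1/health",  # placeholder endpoint
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())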


Additional Information:

  • Immediate joining required.
  • Work from home.


Remote Work: Yes

Employment Type: Full-time


Key Skills

  • APIs
  • Jenkins
  • REST
  • Python
  • SOAP
  • Systems Engineering
  • Service-Oriented Architecture
  • Java
  • XML
  • JSON
  • Scripting
  • SFTP
