Python, Web Scraping, Data Conversion, VBA


Job Location:

Gurgaon - India

Monthly Salary: Not Disclosed
Posted on: Yesterday
Vacancies: 1 Vacancy

Job Summary

Location: Gurgaon

Hybrid working

UK time zone

Experience: 5 years
Skills and Experience

 

Must-Have Skills

 

  • 5 years of hands-on experience in Data Engineering, Data Automation, or Data Pipeline development.
  • Strong Python expertise, with experience building and maintaining production-grade data pipelines.
  • Data ingestion experience using APIs, web scraping, and file-based sources (Excel, HTML, JSON, XML).
  • Strong working knowledge of Pandas, NumPy, Requests, BeautifulSoup, and Selenium.
  • Solid understanding of data pipeline architecture, modular design, reusability, and performance optimization.
  • Experience with workflow orchestration concepts such as scheduling, dependencies, retries, and monitoring.
  • Hands-on experience with logging, alerting, error handling, and monitoring for production workflows.
  • Proficiency with Git and strong coding, testing, and documentation standards.
  • Strong understanding of data quality, validation, governance, and schema evolution.
  • Ability to troubleshoot and resolve production data pipeline issues independently.
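The ingestion skills above (web scraping plus file-based sources) can be sketched roughly as follows. This is a minimal, hypothetical example using only the Python standard library; a production pipeline would more likely fetch pages with Requests and parse them with BeautifulSoup and Pandas, as the skills list suggests. The sample HTML and field names are invented for illustration:

```python
import json
from html.parser import HTMLParser

# Hypothetical sample standing in for a scraped page; a real pipeline
# would fetch this with Requests and parse it with BeautifulSoup.
SAMPLE_HTML = """
<table>
  <tr><th>symbol</th><th>price</th></tr>
  <tr><td>ABC</td><td>10.5</td></tr>
  <tr><td>XYZ</td><td>7.25</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collects <th>/<td> text into rows, one list per <tr>."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self._row.append(data.strip())

def ingest(html: str) -> list[dict]:
    """Parse an HTML table into a list of records (header row = keys)."""
    parser = TableParser()
    parser.feed(html)
    header, *body = parser.rows
    return [dict(zip(header, row)) for row in body]

records = ingest(SAMPLE_HTML)
print(json.dumps(records, indent=2))
```

The same record-shaped output would typically feed a Pandas DataFrame or a downstream storage layer in the pipelines this role describes.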

 

Nice-to-Have Skills

  • Experience with Decisions or similar workflow automation platforms.
  • Exposure to PySpark or distributed data processing frameworks.
  • Experience working with on-premise enterprise data ecosystems.
  • Experience mentoring junior engineers or influencing engineering best practices.
  • Working knowledge of VBA.

 

Key Responsibilities

 

  • Design and develop scalable, reusable data collection systems using APIs, web scraping, and file-based ingestion (e.g. Excel, HTML, JSON).
  • Lead development of Python-based data ingestion and automation methods for diverse data sources.
  • Integrate ingestion pipelines with enterprise data storage and processing layers, ensuring reliability, performance, and maintainability.
  • Ensure data quality, governance, and cross-system consistency in collaboration with backend, analytics, and UI teams.
  • Productionize Python-based analytical models for scalable, reliable execution in operational environments.
  • Enable end-to-end automation of data pipelines and workflows with minimal manual intervention.
  • Establish and enforce development standards, including code structure, testing, documentation, logging, and error handling.
  • Contribute to internal workflow orchestration and Decisions-based transformation solutions.
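The retry, logging, and error-handling standards these responsibilities call for can be illustrated with a small sketch. Function names and parameters here are invented; in practice an orchestration platform (or the internally managed services mentioned below) would supply scheduling, retries, and monitoring:

```python
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(max_attempts: int = 3, base_delay: float = 0.0):
    """Retry a pipeline step with exponential backoff, logging each failure."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    log.warning("%s failed (attempt %d/%d): %s",
                                fn.__name__, attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise  # surface the error to the orchestrator
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

calls = {"n": 0}

@with_retries(max_attempts=3, base_delay=0.0)
def flaky_extract():
    """Hypothetical extract step that fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source outage")
    return ["row-1", "row-2"]

result = flaky_extract()
```

Re-raising on the final attempt, rather than swallowing the error, is what lets alerting and monitoring layers detect a genuinely failed run.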

 

Technical Environment

 

  • Python-based custom data pipeline framework
  • API-driven data access and web scraping
  • Decisions workflow automation (nice to have)
  • Internally managed workflow orchestration services

 


Qualifications:

  • Bachelor's degree in Computer Science, Information Technology, Engineering, Mathematics, or a related field.
  • Master's degree in a relevant discipline is preferred but not mandatory.
  • Equivalent practical experience in data engineering or data automation will be considered in lieu of formal education.


Remote Work:

Yes


Employment Type:

Full-time


About Company


WNS (Holdings) Limited (NYSE: WNS), is a leading Business Process Management (BPM) company. We combine our deep industry knowledge with technology and analytics expertise to co-create innovative, digital-led transformational solutions with clients across 10 industries. We enable busin ...
