Design and build data cleansing and imputation, map to a standard data model, transform to satisfy business rules and statistical computations, and validate data content.
Develop, modify, and maintain Python and Unix scripts and complex SQL.
Tune existing code to remove bottlenecks and improve performance.
Build an end-to-end data flow from sources to fully curated and enhanced data sets.
Develop automated Python jobs for ingesting data from various source systems.
Provide technical expertise in areas of architecture, design, and implementation.
Work with team members to create useful reports and dashboards that provide insight, improve or automate processes, or otherwise add value to the team.
Write SQL queries for data validation.
Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into the data warehouse.
Collaborate with data architects, analysts, and other stakeholders to understand data requirements and ensure quality.
Optimize and tune ETL processes for performance and scalability.
Develop and maintain documentation for ETL processes, data flows, and data mappings.
Monitor and troubleshoot ETL processes to ensure data accuracy and availability.
Implement data validation and error handling mechanisms.
Work with large data sets and ensure data integrity and consistency.
Skills
Python
ETL tools such as Informatica, Talend, SSIS, or similar
SQL, MySQL
Expertise in Oracle, SQL Server, and Teradata
DevOps, GitLab
Experience in AWS Glue or Azure Data Factory
Full-Time