Company Description:
Extentia, a Merkle Company, is a global technology and services firm that helps clients transform and realize their digital strategies. With a unique Experience Centric Transformation approach, Extentia's ground-breaking solutions are in the space of mobile, cloud, and design. The team is differentiated by an emphasis on excellent design skills that they bring to every project. Focused on enterprise mobility, cloud computing, and user experiences, Extentia strives to accomplish and surpass their customers' business goals. The company's inclusive work environment and culture inspire team members to be innovative and creative and to provide clients with an exceptional partnership experience.
Job Details:
Position: Data Engineer (ETL)
Experience: 5 years
Work Mode: Hybrid
Notice Period: Immediate
Location: Mumbai (Thane and Goregaon)
Shift: General shift (at times hours may overlap with client timings)
1. Technical Skills
Must-have key skills:
ETL Tools (IICS, Talend, Hadoop, etc.)
Azure, with a basic understanding of GCP and AWS
JIRA, Confluence
MS Visio (SQL / PLSQL / Oracle)
PySpark
Shell scripting and Power BI
Data Cleaning Tools & Libraries: Proficiency with tools and libraries to clean and pre-process data (a brief sketch follows this section), for example:
Python
SQL
Excel, with an emphasis on familiarity with data cleaning functions, filters, and pivot tables.
Good-to-have skills:
Knowledge of R
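For illustration only, a minimal sketch of the kind of cleaning these tools are used for, written in Python with pandas; the file name and column names below are assumptions, not part of the role description.

import pandas as pd

# Hypothetical input file and columns, used purely for illustration.
df = pd.read_csv("customers.csv")

# Remove exact duplicates and rows missing the mandatory key column.
df = df.drop_duplicates()
df = df.dropna(subset=["customer_id"])

# Standardize free-text fields (trim whitespace, consistent casing).
df["email"] = df["email"].str.strip().str.lower()
df["city"] = df["city"].str.strip().str.title()

# Parse dates stored as text; values that fail to parse become NaT for review.
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")

df.to_csv("customers_clean.csv", index=False)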
2. Data Management & Analysis Skills
Data Validation & Consistency: Ability to identify data quality issues such as duplicates, missing values, outliers, and inconsistencies.
Data Transformation: Experience in transforming raw data into usable formats, including reshaping, aggregating, or normalizing data.
Handling Missing Data: Familiarity with imputation techniques or ways to deal with incomplete
datasets.
Data Normalization & Standardization: Ensuring uniformity in data formats, units of measurement, and naming conventions.
Data Aggregation: Summarizing or grouping data for analysis and ensuring that it is consistent across all sources (a brief sketch of these operations follows this list).
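As a rough sketch of the missing-data handling, normalization, and aggregation described above, again in pandas; the dataset and column names are hypothetical.

import pandas as pd

# Hypothetical sales extract; column names are illustrative assumptions.
df = pd.read_csv("sales.csv")

# Handling missing data: median imputation for numerics,
# an explicit "Unknown" label for categoricals.
df["amount"] = df["amount"].fillna(df["amount"].median())
df["region"] = df["region"].fillna("Unknown")

# Normalization & standardization: common scale for amounts,
# one naming convention for the region field.
df["amount_zscore"] = (df["amount"] - df["amount"].mean()) / df["amount"].std()
df["region"] = df["region"].str.strip().str.upper()

# Aggregation: one consistent per-region summary across sources.
summary = df.groupby("region", as_index=False).agg(
    total_amount=("amount", "sum"),
    avg_amount=("amount", "mean"),
    orders=("order_id", "nunique"),
)
print(summary)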
3. Knowledge of Data Quality
Data Integrity: Understanding the importance of maintaining accurate and consistent data over time.
Data Profiling: Identifying patterns, anomalies, and key characteristics of the dataset.
Error Detection: Ability to find and correct errors within datasets by checking for outliers, misclassifications, or missing values (a short profiling sketch follows this section).
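A short profiling and error-detection sketch under the same assumptions (pandas, with a hypothetical file and hypothetical column names).

import pandas as pd

df = pd.read_csv("orders.csv")  # hypothetical dataset

# Profiling: shape, types, null counts, duplicate count.
print(df.shape)
print(df.dtypes)
print(df.isna().sum())
print("duplicate rows:", df.duplicated().sum())

# Error detection: flag numeric outliers with the IQR rule.
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)]
print("potential outliers:", len(outliers))

# Misclassification check: values outside an expected category set.
valid_statuses = {"NEW", "SHIPPED", "DELIVERED", "CANCELLED"}
print("unexpected statuses:", df.loc[~df["status"].isin(valid_statuses), "status"].unique())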
4. Soft Skills
Attention to Detail: The ability to identify small inconsistencies and issues within large datasets.
Problem-Solving: Being resourceful in resolving data issues and proposing solutions.
Critical Thinking: Analyzing data in-depth and understanding its implications.
Communication: Ability to explain data issues and cleaning steps to non-technical stakeholders.
5. Experience with Data Formats
Structured Data: Familiarity with both structured (tables, databases) and unstructured data.
Data Sources: Ability to clean data from various sources such as spreadsheets, databases, APIs, logs, etc.
File Formats: Proficiency in working with common data file formats like CSV, XML, and Excel (see the sketch after this list).
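A brief sketch of reading these formats in Python with pandas; the file names are placeholders, read_xml needs pandas 1.3+ with lxml installed, and read_excel needs an engine such as openpyxl for .xlsx files.

import pandas as pd

csv_df = pd.read_csv("transactions.csv")
xml_df = pd.read_xml("transactions.xml")
xlsx_df = pd.read_excel("transactions.xlsx", sheet_name="Sheet1")

# Quick consistency check across sources before combining them.
for name, frame in [("csv", csv_df), ("xml", xml_df), ("xlsx", xlsx_df)]:
    print(name, frame.shape, list(frame.columns))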
Statistical and Analytical Skills
Basic Statistics: Understanding of basic statistical concepts to spot anomalies, outliers, or trends in data.
Data Visualization: Ability to visualize the cleaned data to identify trends and patterns (e.g., with Power BI); a short statistical sketch follows.
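The posting names Power BI for visualization; the sketch below uses pandas and matplotlib only as a stand-in to show the statistical checks in code, with a hypothetical dataset and columns.

import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("sales.csv")  # hypothetical dataset

# Summary statistics surface obvious anomalies
# (impossible minimums, extreme maximums, unexpected spreads).
print(df["amount"].describe())

# Flag values more than three standard deviations from the mean.
z = (df["amount"] - df["amount"].mean()) / df["amount"].std()
print(df[z.abs() > 3])

# Quick trend check on the cleaned data.
df.groupby("month")["amount"].sum().plot(kind="line", title="Amount by month")
plt.show()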
6. Automation and Scripting
Automating Repetitive Tasks: Experience in automating data cleaning processes with scripts or
workflow automation tools.
Batch Processing: Capability to clean data in bulk, particularly when dealing with large datasets (a PySpark sketch follows).
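A rough PySpark sketch of a repeatable bulk-cleaning job, since PySpark is listed among the must-have skills; the paths and column names are assumptions for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bulk_clean").getOrCreate()

# Read a whole folder of raw CSV files in one pass.
raw = spark.read.option("header", True).csv("/data/raw/transactions/*.csv")

cleaned = (
    raw.dropDuplicates()
       .filter(F.col("transaction_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
)

# Write partitioned Parquet so downstream jobs can pick it up incrementally.
cleaned.write.mode("overwrite").partitionBy("txn_date").parquet("/data/clean/transactions")

spark.stop()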
Domain Experience: prior experience with product-based companies can be an added advantage.
Employment Type: Full Time