Role: Data Engineer/ETL Developer
Experience: 6 Years
Location: PAN India
Notice: Immediate joiners only
Responsibilities:
- Develop and maintain scalable data pipelines for the extraction, transformation, and loading of data from various sources using ETL tools.
- Collaborate with product owners to build data models that feed business intelligence tools.
- Build analytical tools on top of the data pipeline to provide actionable insight into key business performance metrics.
- Implement processes and systems to monitor data quality, ensuring production data is accurate and available to the key stakeholders and business processes that depend on it.
- Perform the analysis required to troubleshoot and help resolve data-related issues.
- Configure jobs for automatic refreshes of data in the repositories.
- Optimize queries to improve the performance of jobs and pipelines.
- Follow Agile and SDLC practices when implementing data pipelines.
- Write the necessary technical documentation and keep it up to date.
Required skills and qualifications:
- 6 years of experience with Python, SQL, and data visualization/exploration tools.
- Experience writing complex SQL queries.
- Familiarity with Azure Databricks and Azure Data Factory.
- Experience in building or maintaining ETL pipelines.
- Experience in building Power BI dashboards.
- Familiarity with SDLC and Agile ways of working.
- Familiarity with Atlassian tools such as Bitbucket, JIRA, and Confluence.