Data Engineer
Job Location

Hyderabad - India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

About NationsBenefits:

At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization: transitioning legacy systems to modern cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.

As a Data Engineer, you will be responsible for requirement gathering, data analysis, and the development and implementation of orchestrated data pipeline solutions that support our organization's data-driven initiatives, ensure data accuracy, and enable data-driven decision-making across the organization. The ideal candidate will have a minimum of 3-5 years of hands-on data engineering experience on high-performing teams. Expertise in DBT, Airflow, Azure Databricks, SQL, Python, PySpark, and automation is a must; knowledge of reporting tools is an added advantage.

Key Responsibilities:

  • 3 to 5 years of hands-on experience using DBT, Airflow, Azure Databricks, Python, PySpark, and SQL; candidates from the healthcare and fintech domains with an automation-first mindset are preferred.
  • Hands-on experience with data collection, data analysis, data modeling, and data processing using DBT, Airflow, Azure Databricks, PySpark, SQL, and Python.
  • Performance Optimization and Automation: Continuously monitor and optimize existing solutions; debug and resolve DAG failures.
  • Data Processing: Apply expertise in building robust data pipelines with the above tech stack and CI/CD (a minimal illustrative sketch follows this list).
  • Collaboration: Work with cross-functional teams, including data scientists, business analysts, and stakeholders, to understand their data needs and deliver solutions.
  • Data Quality: Implement data validation and cleansing processes to ensure data accuracy, consistency, and reliability.
  • Influence: Propose the right solution for each use case and build team consensus around adopting it.
  • Ad Hoc Data Analysis and Reporting/Dashboard Development: Perform exploratory data analysis, develop data visualizations, and generate actionable insights to support business decision-making.
  • Stay Current: Keep up to date with emerging trends and technologies in data engineering and analytics, and recommend them for adoption.
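
For candidates unfamiliar with the stack, the sketch below illustrates the kind of orchestrated pipeline the role involves: an Airflow DAG that runs DBT transformations and then a Python data-quality check. It is a minimal illustration only, not NationsBenefits' actual code; the DAG id, dbt project path, and task names are hypothetical, and it assumes Airflow 2.4+ with dbt available on the worker.

    # Illustrative sketch only -- DAG id, dbt project path, and task
    # names are hypothetical; assumes Airflow 2.4+ and dbt on the PATH.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def validate_row_counts(**context):
        # Placeholder data-quality gate: a real check would query the
        # warehouse and raise an exception to fail the DAG on bad counts.
        print("row counts validated")


    with DAG(
        dag_id="daily_benefits_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Run dbt transformations in the warehouse (path is an assumption)
        run_dbt = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt/project",
        )

        # Post-transform validation, per the Data Quality bullet above
        validate = PythonOperator(
            task_id="validate_row_counts",
            python_callable=validate_row_counts,
        )

        run_dbt >> validate  # dbt first, then the quality gate

In practice, a DAG like this would be deployed through CI/CD and monitored for failures, matching the optimization and automation responsibilities above.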

Requirements:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Minimum 3 years of hands-on experience with DBT, Airflow, Azure Databricks, PySpark, SQL, Python, and automation.
  • Flexibility to build data reports and dashboards using SQL, Python, and reporting tools.
  • Strong debugging and automation skills.
  • Strong understanding of data warehouse (DWH) and data lake concepts and methodologies.
  • Experience with cloud platforms such as Azure, AWS, or GCP.
  • Excellent communication, presentation, and interpersonal skills.
  • Knowledge of data quality, data validation, data security, and compliance standards is a plus.
  • Excellent problem-solving skills and attention to detail.

Employment Type

Full-Time
