Apache Druid Support

Employer Active

1 Vacancy
Job Location

Sunnyvale, CA - USA

Monthly Salary

Not Disclosed


Job Description

Role Overview:
We are seeking a Data Platform Engineer with expertise in Apache Druid, Airflow, and large-scale data ingestion pipelines. The role focuses on building and optimizing real-time and batch data flows, orchestrating workflows, and implementing monitoring and logging solutions for reliability and performance.

Key Responsibilities:

  • Design, implement, and maintain real-time and batch data ingestion pipelines into Apache Druid.

  • Manage workflow orchestration and scheduling using Apache Airflow.

  • Optimize data ingestion, queries, and storage for performance and scalability.

  • Set up and maintain monitoring and logging frameworks (Grafana, Logstash, ELK) to ensure data pipeline reliability.

  • Troubleshoot and resolve performance bottlenecks across the ingestion, query execution, and orchestration layers.

  • Collaborate with data engineering and analytics teams to support the end-to-end data lifecycle.

  • Implement best practices for scalability, fault tolerance, and cost efficiency.
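For context on the Druid ingestion work described above: batch ingestion into Druid is driven by a JSON ingestion spec submitted to the cluster. A minimal sketch of assembling such a spec in Python follows; the datasource name, input URI, and dimension columns are hypothetical placeholders, and the exact fields accepted depend on the Druid version in use.

```python
import json


def build_batch_ingestion_spec(datasource: str, input_uris: list) -> dict:
    """Assemble a minimal Druid native-batch (index_parallel) ingestion spec.

    The overall layout follows Druid's native batch ingestion format;
    the datasource, URIs, and dimensions here are illustrative only.
    """
    return {
        "type": "index_parallel",
        "spec": {
            "ioConfig": {
                "type": "index_parallel",
                "inputSource": {"type": "http", "uris": input_uris},
                "inputFormat": {"type": "json"},
            },
            "dataSchema": {
                "dataSource": datasource,
                "timestampSpec": {"column": "ts", "format": "iso"},
                "dimensionsSpec": {"dimensions": ["user_id", "event_type"]},
                "granularitySpec": {
                    "segmentGranularity": "day",
                    "queryGranularity": "hour",
                },
            },
            "tuningConfig": {"type": "index_parallel"},
        },
    }


if __name__ == "__main__":
    spec = build_batch_ingestion_spec(
        "events", ["https://example.com/data/events.json"]
    )
    print(json.dumps(spec, indent=2))
```

In practice a spec like this would be submitted to Druid's task endpoint, typically by an orchestration task (an Airflow operator, for example) rather than by hand.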

Qualifications & Skills:

  • Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent practical experience).

  • 3 years of experience in data engineering, platform engineering, or SRE for data systems.

  • Hands-on expertise with Apache Druid (data ingestion, query optimization, cluster management).

  • Strong experience with Airflow for orchestration and scheduling.

  • Solid knowledge of data ingestion frameworks and optimization techniques.

  • Familiarity with monitoring and logging tools (Grafana, Logstash, the ELK stack).

  • Strong troubleshooting and debugging skills in distributed data environments.

  • Proficiency in scripting (Python, Shell) for automation and integration.
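The scripting requirement above typically covers glue work such as polling an ingestion task until it reaches a terminal state. A minimal, dependency-free sketch is shown below; the status strings and the injected `get_status` callable are hypothetical stand-ins for a real task-status API call.

```python
import time


def wait_for_task(get_status, timeout_s=60.0, poll_s=1.0):
    """Poll get_status() until a terminal state or timeout.

    get_status is any callable returning one of "RUNNING", "SUCCESS",
    or "FAILED" -- in real use it would query the task-status endpoint.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("SUCCESS", "FAILED"):
            return status
        time.sleep(poll_s)
    raise TimeoutError("task did not reach a terminal state in time")


if __name__ == "__main__":
    # Simulated status sequence standing in for a live API.
    statuses = iter(["RUNNING", "RUNNING", "SUCCESS"])
    print(wait_for_task(lambda: next(statuses), poll_s=0.01))
```

Injecting the status callable keeps the retry logic testable without a live cluster; in production the same function would wrap an HTTP call.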

Nice-to-Have:

  • Experience with streaming systems (Kafka, Flink, Spark Streaming).

  • Knowledge of cloud-based data platforms (AWS, Azure, GCP).

  • Exposure to containerization and orchestration (Docker, Kubernetes).

  • Understanding of data governance, security, and compliance practices.

Pi-square technologies is a Michigan (USA) headquartered automotive embedded engineering services company and a synergy partner for major OEMs, Tier 1s, and their implementation partners in automotive embedded product development projects: requirements analysis, software design, software implementation, efficient build and release processes, and turnkey software V&V services. We have more than 20 years of industry expertise, with specialization in the latest cutting-edge automotive technologies such as infotainment, connected vehicles, cybersecurity, OTA, and advanced safety/body electronics.

Employment Type

Full-time
