Elasticsearch Pipeline Developer

HEROIC.com


Job Location: Pune, India

Monthly Salary: Not Disclosed
Experience Required: 5 years
Posted on: 19 hours ago
Vacancies: 1

Job Summary

You will be responsible for architecting and managing high-performance Elasticsearch clusters and data ingestion pipelines that process billions of cybersecurity data points sourced from the surface, deep, and dark web.

This role combines strong technical expertise in Elasticsearch, data engineering, and distributed systems, helping HEROIC achieve real-time indexing, search, and analytics performance at global scale and make the internet safer through intelligent, data-driven cybersecurity insights.

What you will do:
  • Design, deploy, and manage scalable Elasticsearch clusters supporting petabyte-scale cybersecurity datasets.
  • Build and optimize data ingestion pipelines using tools such as Logstash, Beats, Kafka, or custom Python pipelines (a brief illustrative sketch follows this list).
  • Develop efficient indexing and querying strategies to enable fast search and analytics across diverse data types.
  • Configure and maintain index mappings, analyzers, tokenizers, and relevance tuning for optimized search accuracy.
  • Implement and automate data transformation and enrichment workflows for ingestion from multiple data sources.
  • Monitor and troubleshoot cluster health and performance, and handle capacity planning, using Elasticsearch APIs and Kibana.
  • Manage index lifecycle policies, snapshots, and replication strategies to ensure high availability and reliability.
  • Work with the backend team to deliver search-ready, structured datasets for advanced analytics and threat detection.
  • Integrate Elasticsearch with APIs, microservices, and external systems to support HEROIC's platform ecosystem.
  • Automate infrastructure provisioning and scaling using Docker, Kubernetes, and cloud platforms (AWS/GCP).
  • Continuously improve data pipeline reliability, latency, and throughput through proactive tuning and optimization.
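
To give candidates a concrete sense of this kind of work, below is a minimal, illustrative ingestion sketch. It assumes the official elasticsearch Python client (8.x), a cluster reachable at http://localhost:9200, and a hypothetical index named breach-records; the field names and sample values are examples only, not HEROIC's actual schema.

    # Minimal illustrative ingestion sketch (assumptions: official
    # `elasticsearch` Python client 8.x, a local cluster at
    # http://localhost:9200, and a hypothetical "breach-records" index).
    from elasticsearch import Elasticsearch, helpers

    es = Elasticsearch("http://localhost:9200")
    INDEX = "breach-records"

    # Explicit mapping so ingested fields are typed for fast, accurate search.
    if not es.indices.exists(index=INDEX):
        es.indices.create(
            index=INDEX,
            mappings={
                "properties": {
                    "email": {"type": "keyword"},
                    "source": {"type": "keyword"},
                    "leaked_at": {"type": "date"},
                    "raw_record": {"type": "text", "analyzer": "standard"},
                }
            },
        )

    def to_actions(records):
        """Convert parsed records into _bulk API actions."""
        for rec in records:
            yield {"_index": INDEX, "_source": rec}

    # Example records; a real pipeline would stream these from Kafka,
    # Logstash, or custom crawlers rather than an in-memory list.
    records = [
        {
            "email": "user@example.com",
            "source": "sample-dump",
            "leaked_at": "2024-01-15",
            "raw_record": "user@example.com:hunter2",
        },
    ]

    # helpers.bulk batches documents into _bulk requests for throughput.
    ok, errors = helpers.bulk(es, to_actions(records), raise_on_error=False)
    print(f"indexed={ok} failed={len(errors)}")

In production, the record stream would typically come from Kafka consumers, Logstash outputs, or custom crawlers, with retries, dead-letter handling, and index lifecycle management layered on top.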


Requirements

  • Bachelor's Degree in Computer Science, Information Technology, or a related field.
  • Minimum 4 years of professional experience with Elasticsearch in production environments.
  • Deep knowledge of Elasticsearch architecture, including shards, replicas, nodes, and cluster scaling.
  • Hands-on experience with Logstash, Beats, Kafka, or Python-based ETL pipelines for large-scale data ingestion.
  • Strong understanding of index design, query performance optimization, and relevance tuning.
  • Proficiency in Python, Java, or Scala for pipeline development and automation scripting.
  • Solid experience with Kibana for visualization, monitoring, and troubleshooting.
  • Familiarity with NoSQL/relational databases (Cassandra, MongoDB, PostgreSQL) and data modeling for search.
  • Experience with CI/CD pipelines, Git, and DevOps workflows for deployment and monitoring.
  • Strong analytical, debugging, and problem-solving skills in distributed data systems.
  • Excellent English communication skills (written and verbal).
  • Prior experience in cybersecurity, threat intelligence, or large-scale data analytics (preferred but not required).

Benefits

  • Position Type: Full-time
  • Location: India (Remote, work from anywhere)
  • Salary: Competitive, based on experience
  • Other Benefits: PTO and national holidays
  • Professional Growth: Work with cutting-edge AI, cybersecurity, and SaaS technologies
  • Culture: Fast-paced, innovative, mission-driven team

About Us: HEROIC Cybersecurity is building the future of cybersecurity. Unlike traditional solutions, HEROIC takes a predictive and proactive approach to intelligently secure users before an attack or threat occurs. Our work environment is fast-paced, challenging, and exciting. At HEROIC, you'll collaborate with a team of passionate, driven individuals dedicated to making the world a safer digital place.






Required Education:

Bachelor's Degree in Computer Science, Engineering, or equivalent hands-on experience

About the Job: HEROIC Cybersecurity is seeking an experienced Elasticsearch Pipeline Developer to design, optimize, and maintain large-scale search and analytics infrastructures that power our AI-driven cybersecurity intelligence platform.

Company Industry

IT Services and IT Consulting

Key Skills

  • Elasticsearch
  • Logstash
  • Kafka
  • Kibana
  • Python