Data Engineer - Services

Tookitaki Holding


Job Location:

Bangalore - India

Monthly Salary: Not Disclosed
Posted on: 2 days ago
Vacancies: 1 Vacancy

Job Summary

Position Overview
Job Title: Data Engineer
Department: Services Delivery
Reporting To: Director, Client Enablement

The Data Engineer plays a pivotal role in ensuring seamless data integration and management for Tookitaki's FinCense platform. This role involves working closely with clients to configure and optimize data pipelines for both on-premise and cloud-hosted (CaaS) environments. The ideal candidate possesses a strong understanding of data architecture, big data technologies, and integration best practices to enable smooth platform deployment and operations.

Position Purpose

The Data Engineer ensures that Tookitaki's platform is seamlessly integrated into the client's data ecosystem by establishing robust data pipelines and ensuring accurate data ingestion and extraction. This role is crucial for enabling efficient compliance workflows, driving data insights, and ensuring the success of client deployments.


Key Responsibilities

1. Data Integration

  • Collaborate with client teams to understand data requirements and integration needs.
  • Configure data pipelines for both initial loads and delta loads.
  • Ensure upstream (data ingestion) and downstream (data extraction) processes are efficient and aligned with platform requirements.

2. Data Pipeline Management

  • Develop and maintain data ingestion workflows to enable real-time and batch data processing.
  • Set up mechanisms for extracting data from the platform for client-specific reporting and analysis.
  • Monitor data pipelines to ensure accuracy, timeliness, and consistency.

3. Data Validation and Optimization

  • Validate incoming data for completeness, accuracy, and integrity.
  • Optimize data flows to improve performance and reduce latency in data processing.
  • Conduct troubleshooting to resolve data-related issues during platform deployment and operation.

4. Collaboration and Support

  • Work closely with Deployment Engineers to align data workflows with platform configurations.
  • Provide technical expertise to support the Client Enablement team in configuring data-related workflows.
  • Collaborate with the Product and Engineering teams to improve platform data handling capabilities.

5. Documentation and Knowledge Sharing

  • Document all data integration processes, configurations, and troubleshooting steps.
  • Share best practices and lessons learned to enhance team efficiency and client outcomes.

Qualifications and Skills

Education

  • Required: Bachelor's degree in Computer Science, Data Science, or a related technical field.
  • Preferred: Master's degree in Data Engineering, Big Data, or Analytics.

Experience

  • Minimum: 3-5 years of experience in data engineering or a similar role.
  • Proven expertise in data integration, pipeline development, and big data technologies.

Technical Expertise

  • Data Integration Tools: Experience with tools like Apache Kafka, Apache NiFi, or similar ETL/ELT platforms.
  • Big Data Technologies: Proficiency in Hadoop, Spark, Hive, and HDFS.
  • Cloud Platforms: Hands-on experience with AWS (preferred) or GCP for managing data pipelines.
  • Scripting and Querying: Strong knowledge of Python, PySpark, or Java.
  • Database Management: Familiarity with relational and NoSQL databases (e.g. MariaDB, ScyllaDB).

Soft Skills

  • Strong problem-solving and analytical skills to diagnose and resolve data issues.
  • Excellent communication skills for interacting with clients and internal teams.
  • Ability to manage multiple data workflows while meeting tight deadlines.

Preferred

  • Certifications in AWS Big Data, Apache Spark, or similar technologies.
  • Experience in compliance or financial services domains.

Key Competencies

  • Data-Driven Approach: Deep understanding of data integration and the ability to create efficient workflows.
  • Collaboration: Works effectively with cross-functional teams to deliver seamless client deployments.
  • Technical Acumen: Expertise in big data and cloud platforms to handle complex data requirements.
  • Ownership: Takes responsibility for data integration activities and ensures successful outcomes.
  • Adaptability: Excels in dynamic environments with evolving client needs.

Success Metrics

1. Data Pipeline Efficiency:

  • Achieve 99% data pipeline uptime and accuracy for client deployments.

2. Timely Data Integration:

  • Complete all data ingestion and extraction workflows within deployment timelines.

3. Client Satisfaction:

  • Positive client feedback on data integration and availability during the implementation phase.

4. Knowledge Sharing:

  • Maintain comprehensive documentation for all data-related workflows and share best practices across the team.

Benefits

  • Competitive Salary: Aligned with industry standards and experience.
  • Professional Development: Access to training in big data, cloud computing, and data integration tools.
  • Comprehensive Benefits: Health insurance and flexible working options.
  • Growth Opportunities: Career progression within Tookitaki's rapidly expanding Services Delivery team.


Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

About Company


Tookitaki’s FinCense platform combines AI with community‑driven intelligence to deliver real‑time AML, fraud detection, smart screening, and transaction monitoring with 90%+ accuracy—trusted by global banks, fintechs & e‑wallets.
