Senior Data Engineer

Payoneer


Job Location:

Gurgaon - India

Monthly Salary: Not Disclosed
Posted on: 5 days ago
Vacancies: 1 Vacancy

Job Summary

About Payoneer

Founded in 2005, Payoneer is the global financial platform that removes friction from doing business across borders, with a mission to connect the world's underserved businesses to a rising global economy. We're a community with over 2,500 colleagues all over the world, working to serve customers and partners in over 190 countries and territories.

By taking the complexity out of financial workflows, including everything from global payments and compliance to multi-currency and workforce management to providing working capital and business intelligence, we give businesses the tools they need to work efficiently worldwide and grow with confidence.


Role summary

We're looking for a Senior Data Engineer with a drive for excellence and an ownership mindset who can lead the design and delivery of scalable, secure, and highly reliable data platforms in a complex payments and fintech environment. You set the technical bar for your team: you architect systems, make sound trade-off decisions, unblock cross-team delivery, and mentor engineers.

You're deliberate about how AI-assisted development is adopted on your team, setting guardrails that prevent shortcuts from becoming long-term cognitive debt, while actively using AI to solve real engineering and business problems.

AI-first mindset: We value engineers who can incorporate AI and agentic development practices into how we build data systems, setting patterns for responsible AI-assisted engineering across design reviews, code quality, testing, and documentation, while delivering data-engineering-led AI use cases such as intelligent data quality and observability, anomaly detection, automated alert triage, and governance.

What You'll Do

  • Own the technical architecture for large-scale batch and streaming data pipelines that power product, risk, and reporting use cases, using frameworks such as Apache Beam, Spark, or Flink with managed runners like Google Cloud Dataflow.
  • Lead data warehouse and lakehouse design for analytical and operational use cases, setting modelling standards and driving performance and cost optimisation.
  • Design event-driven and streaming architectures with strong correctness guarantees: schema evolution, replay and backfill strategies, late-data handling, idempotency, and operational safety.
  • Build and operate storage patterns for operational and analytical workloads using wide-column stores (Bigtable, Cassandra, HBase, or equivalents), including capacity planning and SLO definition.
  • Establish orchestration and operational excellence using tools like Airflow/Composer, Dagster, or Prefect, including CI/CD strategy, automated testing, pipeline observability, and incident response practices.
  • Drive data quality, governance, and auditability through automated controls, lineage/metadata practices, and secure-by-default access patterns.
  • Lead through technical influence: mentor engineers, run design reviews, maintain decision records, unblock cross-team delivery, and shape roadmaps through clear technical reasoning.
  • Set the standard for how your team uses AI-assisted development, ensuring AI-generated code meets the same review, testing, and documentation bar as any other. Deliver AI-driven solutions for data engineering problems such as intelligent data profiling, anomaly detection, and automated root-cause analysis, with a focus on reproducibility and governance.

Who You Are

  • You are a seasoned data engineer with a strong sense of ownership and accountability, comfortable operating in complex domains and driving end-to-end delivery from design through production operations.
  • You balance long-term platform thinking with pragmatic execution and you know how to raise reliability and quality without slowing teams down.
  • You thrive in cross-team environments and influence through technical leadership, clarity, and strong execution.
  • You are impact-driven and measure success by how effectively you enable teams, improve data trust, and accelerate product outcomes.
  • You think critically about AI adoption in engineering workflows. You see the leverage it provides, but you also understand the risks of unchecked reliance: reduced understanding, hidden errors, and maintenance burden. You set patterns that capture the upside while protecting quality.

Key skills and competencies

  • Strong track record delivering production-grade data pipelines and datasets end-to-end: design, implementation, deployment, and operations.
  • Deep experience with distributed data processing.
  • Expertise with cloud data warehouses (BigQuery, Snowflake, Redshift, or Databricks), including strong dimensional modelling, query optimisation, and cost management skills.
  • Hands-on experience designing systems on event streaming platforms, including schema management, delivery-semantics trade-offs, and operational patterns like replay and backfill.
  • Experience with operational and wide-column stores (Bigtable, Cassandra, HBase, or equivalents), with a strong understanding of access-pattern-driven design and capacity planning.
  • Strong orchestration and platform engineering experience with Airflow, Dagster, or Prefect, including CI/CD, automated testing, observability, and incident response.
  • Demonstrated technical leadership: mentoring, design reviews, architectural decision records, and the ability to influence stakeholders and align teams around technical direction.
  • A considered approach to AI-assisted engineering: you use AI tools to improve throughput and quality, but you also set guardrails, review standards, testing expectations, and documentation requirements.

Preferred

  • Prior experience in fintech, payments, lending, or broader financial services (e.g., reconciliation, settlement, risk and fraud data, regulatory reporting).
  • Experience operating data systems with defined SLOs/SLAs and governance in cloud environments.
  • Familiarity with data governance and compliance standards: PII handling, access controls, auditing, and policy-as-code patterns.
  • Experience partnering with ML/DS teams to productionise features, training datasets, and monitoring infrastructure.


Required Experience:

Senior IC


About Company


In today's borderless digital world, Payoneer enables millions of businesses and professionals from more than 200 countries and territories to connect with each other and grow globally through our cross-border payments platform. With Payoneer's fast, flexible, secure and low-cost solu ...
