Role: Data Engineer
Location: Cupertino, CA / Austin, TX
The Data Foundations Engineer designs and scales modern data architectures powering Wallet, Payments, and Commerce products. This role focuses on building high-performance data pipelines and enabling analytics and ML use cases, with strong fundamentals in data modeling and scalable systems.
KEY RESPONSIBILITIES -
Data Engineering & Architecture -
Design and implement scalable batch and near-real-time data pipelines.
Develop ETL/ELT workflows optimized for performance and cost.
Implement dimensional data models and standardize business metrics.
Instrument APIs and user journeys to capture behavioral and transactional data.
Data Governance & Quality -
Ensure data integrity, governance, privacy, and compliance.
Maintain reliability and availability of mission-critical systems.
REQUIRED QUALIFICATIONS
6 years of experience in data engineering for analytics or ML systems.
Strong SQL proficiency.
Experience in Python, Scala, or Java.
Hands-on experience with Spark, Kafka, and Airflow (or similar).
Strong understanding of data modeling and lakehouse architectures (e.g., Iceberg).
Experience with AWS, Azure, or GCP.
Comfortable participating in a rotating on-call schedule.
PREFERRED QUALIFICATIONS
Experience with Snowflake, Databricks, Trino, OLAP/NRT systems, and Superset or Tableau.
Familiarity with CI/CD, data observability, and infrastructure-as-code.
Exposure to MLOps and GenAI/RAG pipelines.
Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).
Experience in the FinTech, Wallet, or Payments domain.