- Instrument APIs, user journeys, and interaction flows to systematically collect behavioral, transactional, and operational data, enabling robust analytics and insightful reporting.
- Build robust data architectures supporting large-scale Wallet Payments & Commerce (WPC) applications.
- Optimize ETL workflows to enhance data processing efficiency and reliability.
- Develop tools and frameworks to optimize data processing performance.
- Ensure data quality and integrity across all data systems and platforms.
- Collaborate closely with a diverse set of partners to gather requirements, prioritize use cases, and ensure delivery of high-quality data products.
- Integrate data pipelines into the broader ML Operations (MLOps) process, including automating the data flow for feature engineering, model retraining, performance monitoring of models in production, and drift detection, and ensuring scalability.
- Construct and maintain data pipelines for Gen AI/RAG solutions, including processes for data extraction, chunking, embedding, and grounding to prepare data for models, and perform continuous quality and performance measurement (a minimal sketch of the chunk-and-embed step follows this list).
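For concreteness, here is a minimal sketch of what the chunk-and-embed stage of such a RAG pipeline can look like in Python. The embedding model name, chunk sizes, and helper names are illustrative assumptions, not specifics of the role.

```python
# Minimal sketch of the chunk-and-embed step of a RAG data pipeline.
# Model name and chunking parameters are illustrative assumptions.
from sentence_transformers import SentenceTransformer

def chunk_text(text: str, size: int = 500, overlap: int = 50) -> list[str]:
    """Split extracted text into overlapping fixed-size character chunks."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model

def embed_document(text: str) -> list[tuple[str, list[float]]]:
    """Return (chunk, embedding) pairs ready to load into a vector store."""
    chunks = chunk_text(text)
    vectors = model.encode(chunks)  # one embedding vector per chunk
    return list(zip(chunks, [v.tolist() for v in vectors]))
```

The grounding and quality-measurement stages would then query these stored embeddings at retrieval time and track metrics such as retrieval relevance over time.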
- Bachelor's or Master's degree in Computer Science or a related technical field, or equivalent experience.
- 4 years of experience designing, developing, and deploying data engineering pipelines for analytics or ML & AI.
- Strong proficiency in SQL, Scala, Python, or Java, with hands-on experience in data pipeline tools (e.g., Apache Spark, Kafka, Airflow), CI/CD practices, and version control (a minimal Airflow sketch follows this list).
- Familiarity with cloud platforms (AWS, Azure, GCP) and data management and analytics tools such as Snowflake, Databricks, and Tableau.
- Strong understanding of data warehousing, data modeling (dimensional/star schemas), and metric standardization.
- Strong problem-solving skills and the ability to work in an agile environment.
- Ability to create technical specs and instrumentation specs, and to understand APIs, MSDs, etc.
- Expertise in building and refining large-scale data pipelines, as well as developing tools and frameworks for data platforms.
- Hands-on experience with big data technologies such as distributed querying (Trino), real-time analytics (OLAP), near-real-time (NRT) data processing, and decentralized data architecture (Data Mesh).
- Familiarity with data governance, security protocols, and compliance in financial data systems.
- Experience enabling ML pipelines, including automating the data flow for feature engineering, model retraining, performance monitoring of models in production, and drift detection, and ensuring scalability (see the drift-detection sketch after this list).
- Familiarity with GenAI concepts such as Retrieval-Augmented Generation (RAG), Large Language Models (LLMs), prompt engineering, vector embeddings, and LLM fine-tuning.
- Works independently with minimal oversight, actively builds relationships, and contributes to a positive team environment.
- Demonstrates sound judgment, applies technical principles to complex projects, evaluates solutions, and proposes new ideas and process improvements.
- Seeks new opportunities for growth, demonstrates a thorough understanding of technical concepts, exercises independence in problem-solving, and delivers impactful results at the team level.
- Familiarity with the fintech wallet domain, digital commerce, etc.
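As referenced in the tooling qualification above, here is a minimal Airflow sketch of the kind of pipeline orchestration involved. The DAG id, schedule, and task bodies are hypothetical, and the `schedule` argument assumes Airflow 2.4+.

```python
# Minimal Airflow DAG sketch of a daily extract -> transform pipeline.
# DAG id, task bodies, and schedule are hypothetical; assumes Airflow 2.4+
# (which accepts the `schedule` argument).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw events from source systems")  # placeholder task body

def transform():
    print("clean and load into the warehouse")  # placeholder task body

with DAG(
    dag_id="wpc_daily_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform only after extract succeeds
```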
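And the drift-detection sketch referenced above: one common approach is the Population Stability Index (PSI), computed per feature against a training baseline. The bucket count and the 0.2 alert threshold are widely used conventions, not values specified here, and the trigger function is hypothetical.

```python
# Minimal sketch of input-drift detection using the Population Stability
# Index (PSI); bucket count and the 0.2 alert threshold are common
# conventions, not values specified by the role.
import numpy as np

def psi(expected: np.ndarray, actual: np.ndarray, buckets: int = 10) -> float:
    """Compare a production feature distribution against its training baseline."""
    edges = np.percentile(expected, np.linspace(0, 100, buckets + 1))
    e_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    a_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    e_pct = np.clip(e_pct, 1e-6, None)  # avoid log(0) on empty buckets
    a_pct = np.clip(a_pct, 1e-6, None)
    return float(np.sum((a_pct - e_pct) * np.log(a_pct / e_pct)))

# Usage: flag a feature for retraining review when PSI exceeds ~0.2, e.g.
#   if psi(training_values, recent_production_values) > 0.2:
#       trigger_retraining_review()  # hypothetical downstream hook
```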