Data Engineering (GCP and Python)

Capco

Job Location: Bengaluru, India

Monthly Salary: Not Disclosed
Posted on: 17 hours ago
Vacancies: 1

Job Summary

Job Title: Data Engineering - Hadoop, SQL, R or Python, GCP

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100 clients across the banking, financial services and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: Data Engineering - Hadoop, SQL, R or Python, GCP

Location: Bangalore / Pune

Technology Stack

  • Role Description

    Deutsche Bank (DB) is committed to the highest standards of control in the areas of Anti-Money Laundering (AML), Sanctions & Embargoes, Anti-Bribery and Corruption (ABC) and Anti-Fraud (collectively referred to as Anti-Financial Crime). All employees are required to adhere to these standards to protect DB and our reputation from those who may intend to use our products and services for illegal purposes, including but not necessarily limited to money laundering, bribery, corruption, fraud and/or terrorist financing.

    The purpose of the AFC Modeling team is to ensure appropriate AML coverage by leading AML coverage assessment; developing new models and improving existing models to mitigate existing or future AML risks; and performing model review and validation in accordance with the Anti-Money Laundering / Sanctions model governance framework within the bank.

    The Model Development team ensures appropriate AML and other anti-financial crime coverage by leading the development of new models and improving existing models to mitigate existing or future money laundering or financial crime risks. As part of the team, your main focus will be developing models to mitigate financial crime and money laundering risk. You will support the end-to-end model development and implementation process.

    Primary Responsibilities:

    Model Development (Individual Contributor): Work independently with team lead(s); hands-on programming and development; data analysis and problem solving, working toward the optimal solution for a given problem.

    Tuning & Optimization: Conduct risk-based tuning of AML scenarios and maintain standards for BTL (Below the Line) and ATL (Above the Line) alert testing; support ongoing optimization of the effectiveness of all in-scope BAU transaction monitoring systems, scenarios and rules (a simplified, illustrative sketch follows this list).

    Model Testing & UAT: Perform UAT and systems integration testing of AML models.
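
    The tuning responsibility above is essentially about calibrating scenario thresholds against ATL and BTL alert populations. As a rough, purely illustrative sketch of that idea (the table name, columns and thresholds below are hypothetical assumptions, not the bank's actual scenarios), a simple threshold-based scenario in PySpark could look like:

    ```python
    # Illustrative sketch only: a simplified threshold-based transaction
    # monitoring scenario. Table/column names and the 10,000 cut-off are
    # assumed for illustration, not taken from any real configuration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("aml_scenario_sketch").getOrCreate()

    # Assumed Hive table with columns: customer_id, txn_date, amount
    txns = spark.table("aml.transactions")

    THRESHOLD = 10_000  # current scenario cut-off (ATL boundary)

    daily_totals = (
        txns.groupBy("customer_id", "txn_date")
            .agg(F.sum("amount").alias("daily_amount"))
    )

    # ATL population: activity the scenario alerts on today
    atl = daily_totals.filter(F.col("daily_amount") >= THRESHOLD)

    # BTL sample: activity just under the cut-off, reviewed during
    # below-the-line testing to judge whether the threshold is set too high
    btl = daily_totals.filter(
        (F.col("daily_amount") >= 0.8 * THRESHOLD)
        & (F.col("daily_amount") < THRESHOLD)
    )

    print("ATL alerts:", atl.count(), "| BTL candidates:", btl.count())
    ```

    Comparing the volumes and review outcomes of the ATL and BTL populations is what informs whether a threshold should be raised, lowered or left unchanged.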

    Requirements:

    • Significant experience and direct knowledge of AML transaction monitoring systems; experience with Oracle Mantas, Prime BSA and Actimize is a big plus.
    • Strong knowledge of SQL and Hadoop
    • Hands-on experience in a programming language such as R, or Python/PySpark
    • Hands-on development experience in the Hadoop technology stack: Hive, Impala, Spark, CDSW
    • Experience with Git/Bitbucket
    • Experience with orchestration tools such as Control-M and Airflow (a purely illustrative Airflow sketch follows the bonus requirements below)
    • Experience in UAT and systems integration testing

    Bonus Requirements:

    • Knowledge of object-oriented programming is a plus
    • Experience in Data Science and Machine Learning is a big plus
    • Strong analytical and statistical skills
    • Developing and proposing recommendations to address identified issues/risks, as well as areas and processes that require optimization and improvement
    • Understanding trends in the underlying data and advising on modeling methodologies to detect potentially suspicious activity
    • Producing reports with documented rationales for scenarios, rules and threshold settings
    • Experience with visualization tools such as Tableau is a plus
    • Experience in GCP is a plus
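
    For the orchestration requirement (Control-M / Airflow), the kind of pipeline scheduling involved can be sketched with a minimal Airflow DAG. Everything below (DAG id, task names, schedule and the placeholder callables) is an assumption for illustration, not an actual production pipeline:

    ```python
    # Illustrative sketch only: a minimal Airflow DAG chaining the stages of a
    # hypothetical AML model pipeline. Nothing here reflects a real setup.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_transactions():
        """Placeholder: pull the day's transactions from the Hadoop/Hive layer."""


    def score_model():
        """Placeholder: run the tuned scenarios / model over the extract."""


    def publish_alerts():
        """Placeholder: hand generated (ATL) alerts to the case-management queue."""


    with DAG(
        dag_id="aml_model_pipeline_sketch",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_transactions",
                                 python_callable=extract_transactions)
        score = PythonOperator(task_id="score_model",
                               python_callable=score_model)
        publish = PythonOperator(task_id="publish_alerts",
                                 python_callable=publish_alerts)

        extract >> score >> publish  # run the stages daily, in order
    ```

    Control-M would express the same ordering as job dependencies in its scheduler rather than in Python code.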

Key Skills

  • APIs
  • Docker
  • Jenkins
  • REST
  • Python
  • AWS
  • NoSQL
  • MySQL
  • JavaScript
  • PostgreSQL
  • Django
  • Git

About Company

Capco is a global management and technology consultancy dedicated to the financial services and energy industries.
