Role: Data Engineer / Data Modeler, Payments Subsystem
Location: Atlanta, GA (Hybrid)
Type: Contract
Position Summary:
- We are looking for an experienced Data Engineer / Data Modeler to join our banking technology team, with a focus on the payments subsystem.
- The role involves designing, developing, and maintaining scalable data pipelines and models to support real-time and batch payment processing, settlement, reconciliation, and regulatory reporting.
- The ideal candidate will have strong expertise in data modeling, ETL development, cloud data platforms, and modern data engineering practices.
Key Responsibilities:
- Design and implement logical and physical data models for the payments domain (merchant onboarding, transactions, clearing, settlement, funding, fraud/risk, reporting).
- Develop ETL/ELT pipelines to integrate data from core banking systems, payment gateways, and external partners into enterprise data stores.
- Ensure data quality, lineage, and governance across payment processing flows.
- Build and optimize data warehouses/lakes on cloud platforms (AWS or Azure).
- Collaborate with architects, developers, and business analysts to map payment business processes into data models and schemas.
- Implement real-time and batch data integration for use cases such as fraud detection, reconciliation, settlement reporting, and regulatory compliance.
- Monitor and tune pipelines for scalability, performance, and cost efficiency.
- Partner with security and compliance teams to enforce data privacy and regulatory requirements (PCI-DSS, SOX, AML/KYC).
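As an illustration of the dimensional modeling and settlement-reporting work described above, here is a minimal sketch in Python using the standard-library sqlite3 module as a stand-in for a warehouse. All table names, column names, and sample values are hypothetical, not part of this role's actual schemas:

```python
import sqlite3

# In-memory database standing in for a cloud warehouse; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per merchant, holding descriptive attributes.
cur.execute("""
    CREATE TABLE dim_merchant (
        merchant_key  INTEGER PRIMARY KEY,
        merchant_name TEXT NOT NULL,
        mcc_code      TEXT
    )
""")

# Fact table: one row per payment transaction, keyed to the merchant dimension.
cur.execute("""
    CREATE TABLE fact_payment (
        payment_id   INTEGER PRIMARY KEY,
        merchant_key INTEGER REFERENCES dim_merchant(merchant_key),
        amount_cents INTEGER NOT NULL,
        status       TEXT NOT NULL  -- e.g. 'authorized', 'cleared', 'settled'
    )
""")

cur.execute("INSERT INTO dim_merchant VALUES (1, 'Acme Coffee', '5814')")
cur.executemany(
    "INSERT INTO fact_payment VALUES (?, ?, ?, ?)",
    [(101, 1, 450, "settled"), (102, 1, 1200, "settled"), (103, 1, 300, "authorized")],
)

# A typical settlement-reporting query: settled volume per merchant.
cur.execute("""
    SELECT m.merchant_name, SUM(f.amount_cents) AS settled_cents
    FROM fact_payment f
    JOIN dim_merchant m ON m.merchant_key = f.merchant_key
    WHERE f.status = 'settled'
    GROUP BY m.merchant_name
""")
result = cur.fetchall()  # [('Acme Coffee', 1650)]
```

In practice the fact table would carry many more measures and foreign keys (date, currency, channel), but the star-schema shape (a central fact table joined to descriptive dimensions) is the same.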
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- 7 years of data engineering / data modeling experience in financial services or banking.
- 7 years of strong knowledge of payments domain concepts (authorization, clearing, settlement, reconciliation, chargebacks, fraud).
- 7 years of hands-on experience with data modeling (3NF, star/snowflake schemas, dimensional modeling).
- 7 years of proficiency with SQL and at least one ETL/ELT tool (DataStage, Informatica, Talend, dbt).
- 7 years of strong experience with cloud data platforms (AWS Redshift, Azure Synapse, Snowflake, Databricks).
- 7 years of proficiency in a programming language such as Python for data processing and automation.
- 7 years of experience with real-time streaming (Kafka, Kinesis, Event Hubs) and batch data pipelines.
- 7 years of familiarity with version control (Git) and CI/CD pipelines.
Preferred Skills:
- Knowledge of regulatory and compliance requirements in payments/banking.
- Understanding of API-based integration for payment systems.