Title: Systems Integration Senior Specialist
Location Address: Atlanta GA (Hybrid)
Type: Contract
Description:
Day-to-Day Job Duties: (What this person will do on a daily/weekly basis)
- Design, develop, and maintain scalable batch and real-time data pipelines supporting payment processing, card transactions, billing, and banking systems
- Build and automate ETL/ELT workflows for ingestion, transformation, aggregation, and loading of high-volume financial data
- Develop and support Kafka integrations, including topics, producers, consumers, and streaming applications for near real-time transaction processing
- Develop and optimize cloud-native data solutions in Azure, leveraging services such as Azure Data Factory (ADF), Azure Synapse, Azure Data Lake, and Event Hubs
- Build and manage enterprise data platforms using Snowflake, including data modeling, performance tuning, clustering, and secure data sharing
- Develop ETL/ELT workflows using Python and SQL for ingestion, transformation, validation, and aggregation of high-volume financial data
- Translate payment and banking data requirements into functional specifications, mapping documents, and technical designs
- Ensure data accuracy, integrity, security, and compliance (including PCI-DSS and financial regulatory requirements)
- Optimize SQL queries and tune RDBMS performance for high-throughput transactional environments
- Design and maintain enterprise microservices (security, logging, APIs) using Java/Spring Boot where applicable
- Monitor data pipeline performance and implement optimization and corrective actions
- Collaborate with product owners, architects, QA, and compliance teams in an Agile environment
- Troubleshoot and resolve complex production issues across data platforms and transaction systems
- Work on-site 3 days per week to support collaboration, architecture discussions, and Agile ceremonies
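To give candidates a concrete feel for the validate-and-aggregate work described above, here is a minimal, hedged sketch of the "transformation, validation, and aggregation" step in plain Python. The record schema (`txn_id`, `card_last4`, `amount`) is purely illustrative and not taken from any actual system named in this posting.

```python
from collections import defaultdict
from decimal import Decimal

def validate(record):
    """Basic integrity checks for one raw transaction record (hypothetical schema)."""
    return bool(
        record.get("txn_id")                      # must carry an identifier
        and record.get("card_last4", "").isdigit()  # card field must be numeric
        and Decimal(record.get("amount", "0")) > 0  # positive monetary amount
    )

def aggregate_by_card(records):
    """Sum validated amounts per card -- the aggregation step of an ETL workflow."""
    totals = defaultdict(Decimal)  # Decimal avoids float rounding on money
    for rec in records:
        if validate(rec):
            totals[rec["card_last4"]] += Decimal(rec["amount"])
    return dict(totals)

raw = [
    {"txn_id": "t1", "card_last4": "4242", "amount": "19.99"},
    {"txn_id": "t2", "card_last4": "4242", "amount": "5.01"},
    {"txn_id": "",   "card_last4": "0000", "amount": "100.00"},  # rejected: no id
]
print(aggregate_by_card(raw))  # → {'4242': Decimal('25.00')}
```

In a production pipeline the same shape of logic would run inside a Kafka consumer or an ADF/Synapse activity rather than over an in-memory list.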
Basic Qualifications: (Required skills and minimum experience)
- Bachelor's degree in Computer Science, Engineering, or a related technical field
- Minimum 5 years of experience in Data Engineering or Data Warehouse environments
- Minimum 5 years of experience designing, building, and maintaining batch and real-time data pipelines
- 5 years of proven experience in the payments, banking, or financial services industry supporting transaction-based systems
- 5 years of strong expertise in SQL, PL/SQL, and query optimization
- Hands-on experience with any ETL/ELT tool
- Minimum 5 years of experience working with relational databases such as Oracle, DB2, Teradata, and SQL Server
- 5 years of experience with Kafka or streaming platforms
- Minimum 5 years of strong programming skills in Java and/or Python
- Minimum 5 years of experience developing API integrations with cloud or enterprise systems
- Minimum 5 years of experience with Unix shell scripting
- Understanding of data modeling concepts and ETL frameworks
- Minimum 5 years of experience working in Agile environments
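As a small illustration of the query-optimization expertise listed above, the sketch below uses the standard-library `sqlite3` as a stand-in for an enterprise RDBMS (Oracle, DB2, etc.): `EXPLAIN QUERY PLAN` confirms the optimizer uses an index instead of a full table scan. Table and index names are illustrative only.

```python
import sqlite3

# In-memory database stands in for a high-throughput transactional RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE txns (id INTEGER PRIMARY KEY, card_last4 TEXT, amount REAL)"
)
# Indexing the filter column is the classic first step of query tuning.
conn.execute("CREATE INDEX idx_txns_card ON txns (card_last4)")

# EXPLAIN QUERY PLAN reveals whether the optimizer searches via the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM txns WHERE card_last4 = ?",
    ("4242",),
).fetchall()
print(plan[0][3])  # e.g. "SEARCH txns USING INDEX idx_txns_card (card_last4=?)"
```

The same habit — reading the execution plan before and after adding an index — carries over directly to `EXPLAIN PLAN` in Oracle or `SET SHOWPLAN` in SQL Server.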
Travel
Nice to Have (But Not a Must):
- Experience supporting PCI-DSS compliant environments
- Experience handling large structured and unstructured datasets
- Experience with CI/CD and DevOps practices
- Prior experience working on high-volume payment gateways or banking transaction platforms
What We're Looking For:
- A strong Data Engineer who understands transactional data systems and financial data flows
- Someone comfortable working in regulated security-sensitive banking environments
- A problem solver who can optimize high-volume systems and ensure data integrity
- A collaborative team player who can clearly communicate complex technical concepts.