Visa's Technology Organization is a community of problem solvers and innovators reshaping the future of commerce. We operate the world's most sophisticated processing networks, capable of handling more than 65,000 secure transactions a second across 80 million merchants, 15,000 financial institutions, and billions of everyday people. While working with us, you'll get to work on complex distributed systems and solve massive-scale problems centered on new payment flows, business and data solutions, cyber security, and B2C platforms.
The Opportunity:
We are looking for versatile, curious, and energetic Software Engineers who embrace solving complex challenges on a global scale. As a Visa Software Engineer, you will be an integral part of a multifunctional development team inventing, designing, building, and testing software products that reach a truly global customer base. While building components of powerful payment technology, you will get to see your efforts shaping the digital future of monetary transactions.
The Work Itself:
Engineer data systems and pipelines that handle vast datasets, influencing Visa's internal standards for data scalability, security, and reusability.
Collaborate across multiple teams to create design artifacts and develop best-in-class data solutions for Visa's diverse technical offerings.
Actively contribute to data quality improvements, valuable service technology, and new business flows in diverse agile squads.
Develop robust and scalable data products intended for various stakeholders, including end-user merchants, B2B, and business-to-government solutions.
Leverage innovative technologies to build the next generation of payment services, transaction platforms, real-time analytics, and data-driven insights.
Make a difference on a global or local scale through mentorship and continued learning opportunities.
Essential Functions:
Demonstrates relevant technical working knowledge to understand data requirements and architectural decisions.
Identifies and contributes to solution strategies that improve the design and functionality of data pipelines and processing frameworks under minimal guidance.
Applies standard processes in the use of programming languages and tools (e.g., Java, Scala, Python, SQL, Spark) to develop and optimize data workflows.
Collaborates with others to support the piloting of new data capabilities and features that enhance analytics and reporting across ecommerce products.
Analyzes data anomalies for simple issues and applies debugging tools to verify assumptions and improve data integrity.
The Skills You Bring:
Energy and Experience: A growth mindset that is curious and passionate about data technologies and enjoys challenging projects on a global scale.
Challenge the Status Quo: Comfort in pushing the boundaries and thinking beyond traditional data solutions.
Language and Tool Expertise: Proficiency in one or more data processing languages and tools (e.g., Python, SQL, Spark).
Builder: Experience building and deploying modern data services and pipelines with quality and scalability.
Learner: Constant drive to learn new technologies such as Hadoop, Kafka, Kubernetes, and Docker.
Partnership: Experience collaborating with Product, Test, DevOps, and Agile/Scrum teams.
This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.
Qualifications:
Basic Qualifications
8 years of relevant work experience with a Bachelor's Degree, or at least 5 years of experience with an Advanced Degree (e.g., Master's, MBA, JD, MD), or 2 years of work experience with a PhD, OR 11 years of relevant work experience.
Preferred Qualifications
9 or more years of relevant work experience with a Bachelor's Degree, or 7 or more years of relevant experience with an Advanced Degree (e.g., Master's, MBA, JD, MD), or 3 or more years of experience with a PhD.
Educational Background: Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
Experience: 6 years of hands-on experience in data engineering, data automation, or a related role.
Programming Skills: Proficiency in programming languages such as Python, Java, or Scala for data manipulation and automation.
Database Management: Strong experience with SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
Data Processing Frameworks: Experience with data processing frameworks such as Apache Spark, Apache Hadoop, or Apache Flink.
ETL Tools: Proficiency in ETL tools such as Apache NiFi, Talend, Informatica, or Microsoft SSIS.
Data Generators and Copying: Ability to build data generators and automate data copying processes.
Batch Processing Expertise:
Extensive experience in designing, implementing, and maintaining batch processing systems.
Proficiency in scheduling and orchestrating batch jobs using tools like Apache Airflow, Control-M, or cron jobs.
Strong understanding of batch processing principles, including job dependencies, scheduling, and error handling (a minimal orchestration sketch follows this list).
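As a rough illustration of the scheduling, dependency, and error-handling skills described above, here is a minimal Apache Airflow sketch of a hypothetical three-step extract/transform/load batch job. The DAG name, schedule, and step functions are placeholders rather than an actual Visa workflow, and the schedule argument assumes Airflow 2.4 or later.

from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull the day's batch of records from a source system.
    print("extracting batch for", context["ds"])

def transform(**context):
    # Placeholder: apply cleansing and business rules to the extracted batch.
    print("transforming batch")

def load(**context):
    # Placeholder: write the transformed batch to the target store.
    print("loading batch")

with DAG(
    dag_id="nightly_batch_example",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",                # run nightly at 02:00 (cron syntax)
    catchup=False,
    default_args={
        "retries": 2,                    # basic error handling: retry failed tasks
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Job dependencies: extract runs before transform, transform before load.
    t_extract >> t_transform >> t_load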
Data Automation Framework Development:
Ability to develop custom automation frameworks tailored for batch processing applications.
Experience in automating the end-to-end data pipeline, including data extraction, transformation, and loading (ETL) processes.
Creation of reusable and scalable automation scripts using Python, Bash, or other scripting languages (see the sketch after this list).
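A minimal sketch of the kind of reusable ETL automation script referred to above, assuming a hypothetical CSV source and a local SQLite target; the file paths, table name, and transformation rule are illustrative only.

import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Extraction step: read the raw batch file.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Transformation step: drop incomplete rows and normalize column names.
    df = df.dropna()
    return df.rename(columns=str.lower)

def load(df: pd.DataFrame, db_path: str, table: str) -> None:
    # Load step: append the cleaned batch into the target table.
    with sqlite3.connect(db_path) as conn:
        df.to_sql(table, conn, if_exists="append", index=False)

if __name__ == "__main__":
    batch = transform(extract("transactions.csv"))   # hypothetical source file
    load(batch, "warehouse.db", "transactions")       # hypothetical target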
Data Generators and Copy Automation:
Expertise in building and maintaining data generators for synthetic data creation to support testing and development environments.
Experience in automating data copying and replication processes across different environments and databases (illustrated by the sketch after this list).
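A minimal sketch of a synthetic data generator of the sort described above, using only Python's standard library. The field names and value ranges are hypothetical, and the resulting CSV stands in for data that would then be copied into a test or development environment.

import csv
import random
import uuid
from datetime import datetime, timedelta

def generate_rows(n: int):
    # Yield n fake transaction records with randomized amounts and timestamps.
    start = datetime(2024, 1, 1)
    for _ in range(n):
        yield {
            "txn_id": str(uuid.uuid4()),
            "amount": round(random.uniform(1.0, 500.0), 2),
            "ts": (start + timedelta(seconds=random.randint(0, 86_400))).isoformat(),
        }

def write_csv(path: str, n: int) -> None:
    # Materialize the generated rows so they can be loaded into a test database.
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=["txn_id", "amount", "ts"])
        writer.writeheader()
        writer.writerows(generate_rows(n))

if __name__ == "__main__":
    write_csv("synthetic_transactions.csv", 1_000)   # hypothetical output file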
Performance Optimization:
Experience in optimizing batch jobs for performance and efficiency, including resource allocation, parallel processing, and job prioritization.
Ability to monitor and troubleshoot batch job performance issues and implement necessary improvements (see the parallel-processing sketch after this list).
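A minimal sketch of one simple form of the parallel processing mentioned above: splitting a batch into partitions and fanning the work out across CPU cores with Python's standard library. The partition size and per-partition computation are placeholders.

from concurrent.futures import ProcessPoolExecutor

def process_partition(partition: list[int]) -> int:
    # Placeholder for per-partition work (e.g., aggregating one slice of a batch).
    return sum(x * x for x in partition)

def run_batch(data: list[int], workers: int = 4, chunk: int = 1_000) -> int:
    # Split the batch into fixed-size partitions and process them in parallel.
    partitions = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_partition, partitions))

if __name__ == "__main__":
    print(run_batch(list(range(10_000))))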
Error Handling and Recovery:
Development of robust error handling and recovery mechanisms to ensure the reliability and resilience of batch processing workflows.
Implementation of alerting and notification systems to promptly address job failures and exceptions (a retry-and-alert sketch follows this list).
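A minimal sketch of retry-with-backoff and failure alerting for a single batch step, using only the standard library. The log messages stand in for a real notification channel (email, pager, chat), and the flaky step is simulated.

import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("batch")

def run_with_retry(step, retries: int = 3, backoff: float = 2.0):
    # Re-run a failing step a few times before escalating.
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == retries:
                # Final failure: alert, then re-raise so the scheduler marks the job failed.
                log.error("step failed permanently, alerting on-call")
                raise
            time.sleep(backoff ** attempt)

if __name__ == "__main__":
    counter = {"n": 0}

    def flaky_step():
        # Simulated transient failure that succeeds on the third attempt.
        counter["n"] += 1
        if counter["n"] < 3:
            raise RuntimeError("transient source outage")
        return "ok"

    print(run_with_retry(flaky_step))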
Data Quality and Validation:
Implementation of automated data quality checks and validation steps within batch processes to ensure data accuracy and integrity.
Experience with tools and frameworks for automated data validation, such as Great Expectations or custom validation scripts (see the sketch after this list).
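A minimal sketch of the "custom validation scripts" option mentioned above, expressed with pandas. The column names and checks are hypothetical; a framework such as Great Expectations would express similar expectations declaratively.

import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    # Collect human-readable failures instead of stopping at the first one.
    failures = []
    if df["txn_id"].isnull().any():
        failures.append("txn_id contains nulls")
    if not df["txn_id"].is_unique:
        failures.append("txn_id is not unique")
    if (df["amount"] <= 0).any():
        failures.append("amount has non-positive values")
    return failures

if __name__ == "__main__":
    sample = pd.DataFrame({"txn_id": ["a", "b", "b"], "amount": [10.0, -1.0, 5.0]})
    problems = validate(sample)
    # In a batch pipeline, any failures would fail the job and trigger an alert.
    print(problems or "all checks passed")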
Documentation and Best Practices:
Strong documentation skills to create comprehensive guides, best practices, and standard operating procedures (SOPs) for batch automation processes.
Knowledge of industry best practices for batch processing and data automation.
Tools and Technologies for Automation in Batch Applications:
Scheduling and Orchestration: Apache Airflow, Control-M, cron
Scripting Languages: Python, Bash, shell scripting
Data Processing Frameworks: Apache Spark, Apache Hadoop
ETL Tools: Apache NiFi, Talend
Data Quality and Validation: Great Expectations, custom validation scripts
Monitoring and Alerting: Prometheus, Grafana, custom alerting systems (a minimal metrics-export sketch follows this list)
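As a small illustration of the monitoring tools listed above, here is a minimal sketch of exporting batch-job metrics with the prometheus_client Python library. The metric names and simulated work are placeholders; Grafana would typically sit on top of the scraped metrics for dashboards and alerting.

import random
import time
from prometheus_client import Counter, Gauge, start_http_server

ROWS_PROCESSED = Counter("batch_rows_processed_total", "Rows processed by the batch job")
LAST_RUN_SECONDS = Gauge("batch_last_run_duration_seconds", "Duration of the last batch run")

def run_batch() -> None:
    start = time.time()
    # Placeholder for real work: pretend to process a random number of rows.
    rows = random.randint(500, 1_500)
    time.sleep(0.1)
    ROWS_PROCESSED.inc(rows)
    LAST_RUN_SECONDS.set(time.time() - start)

if __name__ == "__main__":
    start_http_server(8000)   # metrics become scrapeable at http://localhost:8000/
    while True:
        run_batch()
        time.sleep(60)        # one batch run per minute, for demonstration only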
Additional Information:
Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.
Remote Work:
No
Employment Type:
Full-time