Data Engineer III
Chandler, TX - USA
Job Summary
At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. We do this by driving Responsible Growth and delivering for our clients, teammates, communities, and shareholders every day.
Being a Great Place to Work is core to how we drive Responsible Growth. This includes our commitment to being an inclusive workplace, attracting and developing exceptional talent, supporting our teammates' physical, emotional, and financial wellness, recognizing and rewarding performance, and making an impact in the communities we serve.
Bank of America is committed to an in-office culture with specific requirements for office-based attendance, while allowing an appropriate level of flexibility for our teammates and businesses based on role-specific considerations.
At Bank of America, you can build a successful career with opportunities to learn, grow, and make an impact. Join us!
Job Description:
This job is responsible for driving efforts to develop and deliver complex data solutions to accomplish technology and business goals. Key responsibilities include leading code design and delivery tasks for the integration, cleaning, transformation, and control of data in operational and analytical data systems. Job expectations include liaising with vendors and working with stakeholders and Product and Software Engineering teams to implement data requirements, analyzing performance, and researching and troubleshooting issues within system engineering domains.
Position Summary
Lead and/or participate in the design, development, and implementation of complex system engineering activities involving cross-functional technical support, systems programming, and data center capabilities. Responsible for components of highly complex engineering and/or analytical tasks and activities. Establishes input/output processes and working parameters for hardware/software compatibility, coordination of subsystems design, and integration of the total system. Viewed as a technology subject matter expert; able to provide and communicate complex technology solutions across differing audiences, including technical, managerial, business, executive, and/or vendor stakeholders. Leads the resolution process for complex problems where analysis of situations or data requires an in-depth evaluation of various factors. Exercises judgment within broadly defined practices and policies in selecting methods, techniques, and evaluation criteria for obtaining results. Information Technology degree and/or technology certifications preferred, or substantial equivalent experience. Typically has 7 years of IT experience.
Responsibilities:
- Leads story refinement and delivery of requirements through the delivery lifecycle and assists team members in resolving technical complexities
- Codes complex solutions to integrate, clean, transform, and control data; builds processes supporting data transformation, data structures, metadata, data quality controls, and dependency and workload management; assembles complex data sets; and communicates required information for deployment
- Leads documentation of system requirements, collaborates with development teams to understand data requirements and feasibility, and leverages architectural components to develop client requirements
- Leads testing teams to develop test plans, contributes to existing test suites (including integration, regression, and performance), analyzes test reports, identifies test issues and errors, and leads triage of underlying causes
- Leads work efforts with technology partners and stakeholders to close gaps in adherence to data management standards, negotiates paths forward by thinking outside the box to identify and communicate solutions to complex problems, and leverages knowledge of information systems, techniques, and processes
- Leads complex information technology projects to ensure on-time delivery and adherence to release processes and risk management, and defines and builds data pipelines to enable data-informed decision making
- Mentors Data Engineers to enable continuous development, and monitors key performance indicators and internal controls
- Migrate data from the legacy system to the new platform
- Design and develop ETL DataStage jobs and sequences
- Design and develop multiple reports, including audit, error, and reconciliation reports
- Integrate various data sources, such as Oracle, DB2, DB2 UDB, SQL Server, and flat files, into the target tables
- Assist in the collection of technical metadata supporting the implementation
- Utilize automation tools to schedule, execute, and monitor ETL DataStage jobs
- Design, test, and debug data flow plans to create the necessary client files and/or populated databases
- Design of the ETL application
- Performance tune the application to meet operational production requirements
- Develop and execute system and integration test plans in conjunction with the ETL Architect and the Quality, Release, and Process Analysts
- Analyze source data/file structures and technical metadata, profile source data, and develop data mapping rules, transformation rules, and detailed ETL specifications based on business requirements
Required Qualifications:
- 5 years of experience in Production Support.
- Candidate must have an understanding of ETL processes and best practices
- Expert-level knowledge and hands-on experience with DataStage Enterprise Edition/Parallel Extender (Orchestrate) for parallel processing as part of the ETL conversion process, integrating multiple sources
- Performance tuning of SQL statements, stored procedures, and functions
- Experience writing UNIX shell scripts as required and using job scheduling tools such as the Autosys utility
- Firm understanding of the software development lifecycle and release and change management processes
- Strong analytical, diagnostic, and troubleshooting skills, and the ability to work in a dynamic, team-oriented environment
- Strong written and verbal communication skills
- Understanding and hands-on experience with XML packages in DataStage
- Diverse understanding of IT infrastructure and other technologies is a plus
- Team player; self-motivated and able to work independently with minimal direction
Desired Qualifications:
- IBM DataStage, IBM Streams, and Apache Flink
- Understanding of real-time data consumption from Kafka
- Fundamental knowledge of IT Service Management/ITIL processes and tools
- Data Virtualization using Presto/Trino
- Experience with Jira & Bitbucket
Skills:
- Analytical Thinking
- Application Development
- Data Management
- DevOps Practices
- Solution Design
- Agile Practices
- Collaboration
- Decision Making
- Risk Management
- Test Engineering
- Architecture
- Business Acumen
- Data Quality Management
- Financial Management
- Solution Delivery Process
Shift:
1st shift (United States of America)
Hours Per Week:
40
Required Experience:
IC
About Company
What would you like the power to do? At Bank of America, our purpose is to help make financial lives better through the power of every connection.