Are you passionate about data? Does the prospect of dealing with massive volumes of data excite you? Do you want to build data engineering solutions that process billions of records a day in a scalable fashion using AWS technologies? Do you want to create the next-generation tools for intuitive data access? If so, Amazon Finance Technology (FinTech) is for you!
FinTech is seeking a Data Engineer to join the team that is shaping the future of the finance data platform. The team is committed to building the next-generation big data platform that will be one of the world's largest finance data warehouses, supporting Amazon's rapidly growing and dynamic businesses, and using it to deliver BI applications that have an immediate influence on day-to-day decision making. Amazon has a culture of data-driven decision making and demands data that is timely, accurate, and actionable. Our platform serves Amazon's finance, tax, and accounting functions across the globe.
As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of the architecture of enterprise-level data warehouse solutions across multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large datasets. You should have excellent business and communication skills so you can work with business owners to develop and define key business questions and build datasets that answer those questions. You are expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. You should be enthusiastic about learning new technologies and able to implement solutions with them to provide new functionality to users or to scale the existing platform. Excellent written and verbal communication skills are required, as you will work very closely with diverse teams. Strong analytical skills are a plus. Above all, you should be passionate about working with huge datasets and love bringing data together to answer business questions and drive change.
Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional volumes and big data, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and above all is passionate about data and analytics. In this role you will be part of a team of engineers creating the world's largest financial data warehouses and BI tools for Amazon's expanding global footprint.
Key job responsibilities
Design, implement, and support a platform providing secured access to large datasets.
Interface with tax, finance, and accounting customers, gathering requirements and delivering complete BI solutions.
Model data and metadata to support ad hoc and pre-built reporting.
Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
Tune application and query performance using profiling tools and SQL.
Analyze and solve problems at their root, stepping back to understand the broader context.
Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use.
Keep up to date with advances in big data technologies and run pilots to design the data architecture to scale with the increased data volume using AWS.
Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
Triage many possible courses of action in a high-ambiguity environment, making use of both quantitative analysis and business judgment.
Experience with SQL
1+ years of data engineering experience
Experience with data modeling, warehousing, and building ETL pipelines
Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
Experience with one or more scripting languages (e.g., Python, KornShell)
Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or Datastage
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit
for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.