The BDT/eCF team is looking for a passionate and innovative engineer with a solid technical background to join the engineering team. Amazon's technology connects millions of businesses of all sizes to hundreds of millions of customers within the Amazon marketplaces worldwide. Our platform, operating at Amazon scale, enables customers to process native SQL, machine learning (ML), and other functional transformations using Apache Spark, Iceberg, Scala, Java, Python, ML, and related technologies; to build unified compute for batch, streaming, and ML processing that executes over schematized data stored in S3; and to seamlessly write those curated datasets out to front-end caches such as DynamoDB, Redis, and Elasticsearch. Additionally, we enable these same functional transforms over streaming data, allowing customers to transition seamlessly between streaming, batch, cache, and analytics as needed to meet customer demand. The successful candidate will have a background in the development of distributed systems, solid technical ability, good communication skills, and the motivation to achieve results in a fast-paced environment.
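As a rough illustration of the kind of work described above (a minimal sketch, not the team's actual code), a Spark batch job might read schematized data from an Iceberg table backed by S3, apply a functional transformation, and write a curated dataset back out for a downstream cache loader. The catalog, bucket, table, and column names below are hypothetical.

// Hypothetical sketch of a batch transform over Iceberg data in S3;
// names and configuration values are illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object CuratedOrdersJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("curated-orders")
      // Assumed Iceberg catalog configuration backed by an S3 warehouse
      .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
      .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
      .getOrCreate()

    // Read schematized data from an Iceberg table stored in S3
    val orders = spark.read.table("lake.sales.orders")

    // Functional transformation: aggregate curated metrics per customer
    val curated = orders
      .filter(col("status") === "SHIPPED")
      .groupBy(col("customer_id"))
      .agg(sum(col("total")).as("lifetime_spend"))

    // Write the curated dataset back to S3, e.g. for a front-end cache loader to pick up
    curated.write.mode("overwrite").parquet("s3://example-bucket/curated/orders")

    spark.stop()
  }
}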
Key job responsibilities
If you are interested in building scalable big data engines with a cutting-edge technology and ML stack (Spark, Iceberg, Java, Scala, Notebooks, Python, AWS EMR, EKS, Kinesis, DynamoDB, SQS, and ML), processing and transforming data across data lakes at exabyte scale at Amazon, then look no further. If you are looking to work with a team of engineers that relentlessly innovates and pushes the envelope, keeps customers at the center of its universe, continually raises the bar on its high standards, and delivers results with velocity, then this is the place to be.
Specifically, within the team, the SDE is responsible for ensuring the team's software maintains a high bar with respect to quality, security, architecture, and operational excellence. They take the lead on the design and delivery of major features and on re-architecting significant technology components, engaging with and influencing their team, external teams, partners, and leadership along the way. They are able to identify the root cause of widespread or pervasive issues, including areas where those issues limit innovation and prevent accelerated delivery of projects, while navigating several systems and components they may or may not own. They communicate effectively with their team and others, take calculated risks, anticipate and mitigate long-term risk, and make reasonable trade-offs when the situation demands it. They mentor less experienced engineers and provide career development opportunities while giving constructive feedback to their peers. They understand the business impact of decisions and exhibit good judgment when making trade-offs between the team's short-term technology or operational needs and long-term business needs. Ultimately, they display strong technical acumen and ownership while providing strong leadership for the rest of the team.
A day in the life
This includes attending a daily standup; managing and contributing to your goals, projects, deliverables, innovations, and operational excellence; taking turns on operations every 12-16 weeks; and helping improve the customer experience.
About the team
The scope of the primary product, Cradle, involves working at the architecture level, solving ambiguous problems, taking risks and failing fast, and orchestrating larger, more complex projects with partners both internal and external to the organization. Within the product teams, Triton is responsible for the core engine and related services and components. Team ownership includes the Dryad Spark Engine (DSE), Spark connectors to Amazon data sources, the Engine Release Label (EaRL) Service, and Dryad Streaming for processing batch and streaming jobs. These services are imperative to the platform's success; they drive and impact key metrics such as job reliability, adherence to SLAs, accessibility and compatibility with Amazon data sources, and overall IMR spend for the platform. Cradle executes an average of 2MM jobs each day in clusters spread across more than 20k instances. Cradle jobs produce data consumed by data engineers, SDEs, subsequent data flows, and S-Team-level reporting processes.
3 years of non-internship professional software development experience
2 years of non-internship design or architecture (design patterns, reliability, and scaling) of new and existing systems experience
3 years of programming experience with at least one modern language such as Java or Scala, including object-oriented design
4 years of professional software development experience
2 years of experience as a mentor, tech lead, or leading an engineering team
3 years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations
Bachelor's degree in computer science or equivalent
Experience with Spark, Hadoop, and REST
Experience with AWS technologies such as Redshift, EMR, DynamoDB, Kinesis, and EKS
Experience working with large commercial relational database systems (Oracle, SQL Server).
Experience developing unit tests using tools such as JUnit, NUnit, or MSTest to verify code quality.
Experience building complex software systems that have been successfully delivered to customers.
Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations.
Strong written and verbal communication skills preferred.
Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit
for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.