We design, build, and maintain infrastructure to support features that empower billions of Apple users. Our team processes billions of requests every day across our web crawl stack. We rely on our microservice-based architecture to support multimodal crawl, which allows us to download billions of records powering various search features across Apple. We take full end-to-end ownership of our services, driving them meticulously through every stage: conception, design, implementation, deployment, and maintenance. As a result, each one of us takes our responsibilities seriously. On this team, you'll have the opportunity to work on incredibly complex, large-scale systems with trillions of records and petabytes of data.
BS or MS in Computer Science or equivalent experience
Strong coding skills and experience with data structures and algorithms
Experience designing and building scalable distributed services
Experience with petabyte-scale data processing and building data pipelines
Experience with AWS services such as Amazon S3, EC2, and EKS
Excellent interpersonal skills; able to work independently as well as cross-functionally
Experience with web crawling is a plus