We design and build infrastructure to support features that empower billions of Apple users. Our team processes trillions of web links to select and crawl the most relevant content to surface to users through search. As part of the team, you'll be responsible for building a diverse set of models for a variety of tasks, including but not limited to URL selection, document processing and extraction, document selection, and ablations, while analyzing the results to help drive the team forward. You'll have the opportunity to work on incredibly complex, large-scale systems with trillions of records and petabytes of data.
MS or PhD in Computer Science or equivalent experience
Strong coding skills and experience with data structures and algorithms
Proficiency in one of the following languages: Python, Go, Java, or C
Experience in LLMs, machine learning, deep learning, information retrieval, natural language processing, or data mining
Experience with petabyte-scale data and machine learning systems
Ability to understand/clarify product requirements and translate them into technical tasks in ML modeling and engineering
Experience with AWS services such as Amazon S3, EC2, and EKS/Kubernetes
Excellent data analytical skills
Experience in web crawling is a plus
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics.