Title: Spark/Scala Engineer
Duration: Months, Contract
Location: Sunnyvale, CA (hybrid mode; days onsite per week)
Note: The candidate must be within commutable distance of Sunnyvale, CA, so they can attend meetings or work in person whenever needed (up to days per week).
Required skills/experience: years of experience, including years with Spark or Scala, and years of Hadoop/Big Data using tools such as Hive, Spark, PySpark, Scala, and RDBMS/SQL.
Strongly preferred: GCP, including GCS (Google Cloud Storage), Dataproc, and BigQuery.
Full Job Description
Designs, develops, and implements Hadoop ecosystem-based applications to support business requirements. Follows approved life-cycle methodologies, creates design documents, and performs program coding and testing. Resolves technical issues through debugging, research, and investigation.
Experience/Skills Required
years of experience in computer programming, software development, or a related field
years of solid Scala or Spark experience, and years of experience in the design, implementation, and support of big data solutions in Hadoop using Hive, Spark, Scala, and SQL
Hands-on experience with Unix, Teradata, and other relational databases; experience with @Scale a plus
Strong communication and problem-solving skills
Strongly preferred: GCP, including GCS (Google Cloud Storage), Dataproc, and BigQuery
Full-time