Job Summary: Data Engineer with GCP
Role: Senior Data Engineer (GCP)
Location: Bentonville, AR or Sunnyvale, CA (Day 1 onsite)
Citizenship: US citizens only
Experience: 12 years overall; 4 years of recent GCP experience
Rate: $60/hr on C2C
### Must-Have Skills
- Scala, Python, Spark
- GCP cloud services (Dataproc, GCS, BigQuery)
- Apache Airflow
- Retail industry experience (preferred)
### Key Responsibilities
- Design, develop, and maintain big data applications using open-source technologies.
- Build and automate data pipelines (Hive, Spark, Kafka, Airflow).
- Develop logical and physical data models for big data platforms.
- Automate workflows using Apache Airflow.
- Provide ongoing maintenance, enhancements, and on-call support for existing systems.
- Mentor junior engineers and lead design reviews and standups.
- Groom and prioritize the backlog using JIRA.
- Act as point of contact for the assigned business domain.
- Work in an offshore, managed-outcome delivery model.
- Share domain and technical knowledge within the team.
### Requirements
- 4 years of recent hands-on GCP experience.
- Experience with GCP Dataproc, GCS, and BigQuery.
- 10 years of experience with data warehouse solutions/products.
- 6 years with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration tool.
- 5 years in data modeling and schema design for data lakes or RDBMS.
- Programming: Python, Java, Scala; scripting: Perl, Shell.
- Experience handling and processing large datasets (multi-TB/PB scale).
- Test-driven development and automated testing frameworks.
- Agile/Scrum methodology experience.
- Ability to manage multiple priorities independently.
- Excellent verbal and written communication skills.
- Bachelor's degree in Computer Science or equivalent.
### Preferred/Additional Skills
- Experience with Gitflow, Bitbucket, JIRA, and Confluence.
- Familiarity with CI tools: Bamboo, Jenkins, TFS.