drjobs
GCP Data Engineer W2 - Remote


Job Location

USA

Monthly Salary

Salary Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID : 2580528
Position: GCP Data Engineer (W2 Position) 313708
Location: Dearborn MI (Remote)
Duration: 12 Months
MOI: Phone & WebEx
Direct Client: FORD MOTORS
Note:
1. U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are NOT able to sponsor H1B at this time.
2. H1B consultants who are willing to work on our W2 (H1B transfer) are welcome.
We're seeking a Data Engineer who has experience building data products on a cloud analytics platform.
You will work on ingesting, transforming, and analyzing large datasets to support the Enterprise in the Data Factory on Google Cloud Platform (GCP). Experience with large-scale solutions and operationalization of data lakes, data warehouses, and analytics platforms on Google Cloud Platform or other cloud environments is a must. We are looking for candidates who have a broad set of technical skills across these areas.
Skills Required:
  • Experience working on an implementation team from concept to operations, providing deep technical subject-matter expertise for successful deployment.
  • Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
  • Experience analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build analytical domains and reusable data products.
  • Experience working with architects to evaluate and productionalize data pipelines for data ingestion, curation, and consumption.
  • Experience working with stakeholders to formulate business problems as technical data requirements, and to identify and implement technical solutions while ensuring key business drivers are captured in collaboration with product management.
Skills Preferred:
  • Strong drive for results and ability to multitask and work independently.
  • Self-starter with proven innovation skills.
  • Ability to communicate and work with cross-functional teams and all levels of management.
  • Demonstrated commitment to quality and project timing.
  • Demonstrated ability to document complex systems.
  • Experience creating and executing detailed test plans.
Experience Required:
  • 5 years of SQL development experience and 5 years of analytics/data product development experience required.
  • 3 years of cloud experience (GCP preferred) with solutions designed and implemented at production scale.
  • Experience working with GCP-native (or equivalent) services such as BigQuery, Google Cloud Storage, Pub/Sub, Dataflow, Dataproc, Cloud Build, etc.
  • Experience working with Airflow for scheduling and orchestration of data pipelines.
  • Experience working with Terraform to provision Infrastructure as Code.
  • 2 years of professional development experience in Java or Python.
Experience Preferred:
  • In-depth understanding of Google's product technology (or other cloud platform) and underlying architectures.
  • Experience working with DBT/Dataform.
  • Experience with Dataplex or other data catalogs is preferred.
  • Experience with development ecosystems such as Tekton, Git, and Jenkins for CI/CD pipelines.
  • Exceptional problem-solving and communication skills.
  • Experience working with Agile and Lean methodologies.
  • Team player with attention to detail.
  • Experience with performance tuning SQL queries.
Education Required:
Bachelor's degree in computer science or a related scientific field.
Education Preferred:
  • GCP Professional Data Engineer certification.
  • Master's degree in computer science or a related field.
  • 2 years mentoring engineers.
  • In-depth software engineering knowledge.
Additional Safety Training/Licensing/Personal Protection Requirements:
Additional Information:
POSITION IS HYBRID: The position can be fully remote for the right candidate, but preference will be given to local SE MI candidates who can follow the mandated Ford Credit hybrid schedule of 3 days in the office.

Employment Type

Remote

Company Industry

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.