GCP Data Architect

Employer Active


Job Location

Phoenix - USA

Monthly Salary

Salary Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2683893
GCP Data Architect
Onsite Locations: Phoenix first, then NYC or Sunrise, FL
Client: Pacific Consultancy Services
Rate: $90-$95/hr on C2C
Visa: USC / GC / H4-EAD
Experience: 12 Years
Job Description:


Mandatory Skills:


Extensive experience working with GCP data-related services such as Cloud Storage, Dataflow, Dataproc, BigQuery, and Bigtable
Very strong experience with Google Cloud Composer and Apache Airflow; ability to set up, monitor, and debug a complex environment running a large number of concurrent tasks
Good exposure to RDBMS / SQL fundamentals
Exposure to Spark, Hive, GCP Data Fusion, Astronomer, Pub/Sub messaging, Vertex, and the Python programming language

Minimum Qualifications:

Bachelor's degree in Engineering or Computer Science (or equivalent), OR Master's in Computer Applications (or equivalent).

A solid experience and understanding of the considerations for large-scale architecting, solutioning, and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.

Create detailed target-state technical, security, data, and operational architecture and design blueprints incorporating modern data technologies and cloud data services, demonstrating the modernization value proposition.

Minimum of 12 years of designing, building, and operationalizing large-scale enterprise data solutions and applications using one or more GCP data and analytics services in combination with third parties: Spark, Hive, Cloud Dataproc, Cloud Dataflow, Apache Beam / Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions, and GitHub; performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP cloud.

Experience with data lake and data warehouse ETL build and design.

Experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Dataproc, DFunc, BigQuery, and Bigtable.

Proven ability in one or more of the following programming or scripting languages: Python, JavaScript, Java.

Employment Type

Full Time

About Company

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, so we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the Contact Us page.