GCP ML Architect - Data

Employer Active

Job Location

Chaska, MN - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Role: GCP ML Architect - Data

Location: Chaska, MN

Hire type: Full-Time

Detailed JD:

  • Responsible for designing, implementing, and managing data and machine learning solutions on Google Cloud Platform.

Key Responsibilities:

  • Design end-to-end data solutions, including data ingestion, storage, processing, and analysis pipelines, as well as machine learning model development, deployment, and monitoring pipelines (a minimal ingestion-and-query sketch follows this list).
  • Design and implement scalable, secure, and cost-optimized cloud infrastructure using GCP services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Kubernetes Engine.
  • Design and implement data models, ensuring data consistency, accuracy, and accessibility for various applications and users.
  • Establish MLOps practices that enable the automation of machine learning model training, deployment, and monitoring.
  • Ensure that all data solutions adhere to security and compliance standards, implementing access controls, encryption, and other security measures.
  • Monitor and optimize the performance of data and machine learning systems, ensuring they meet business requirements and SLAs.
  • Develop and implement strategies for managing and optimizing cloud costs, ensuring efficient resource utilization.
  • Provide technical guidance and mentorship to other team members, fostering a culture of best practices and continuous improvement.
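
As a rough, illustrative sketch of the ingestion-to-analysis work described above (not part of the original posting), the following Python snippet loads a CSV file from Cloud Storage into BigQuery and runs a query against the resulting table. The project, bucket, dataset, and table names are placeholder assumptions.

    # Hypothetical names: project, bucket, dataset, and table are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Ingestion: load a CSV file from Cloud Storage into a BigQuery table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/raw/orders.csv",
        "example-project.analytics.orders",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,  # infer the schema from the file
        ),
    )
    load_job.result()  # wait for the load job to finish

    # Analysis: query the loaded table for the top customers by spend.
    query = """
        SELECT customer_id, SUM(amount) AS total_spend
        FROM `example-project.analytics.orders`
        GROUP BY customer_id
        ORDER BY total_spend DESC
        LIMIT 10
    """
    for row in client.query(query).result():
        print(row.customer_id, row.total_spend)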

Key Skills:

  • 10 years of experience designing and developing production-grade data architectures using Google Cloud data services and solutions.
  • Proficiency in BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Kubernetes Engine, and other relevant GCP services.
  • Strong experience with data warehousing, ETL processes, data modeling, and data pipeline development.
  • Strong hands-on experience with Python and SQL.
  • Strong experience with model development, deployment, and monitoring using Vertex AI (a minimal deployment sketch follows this list).
  • Good experience with LLM agents, agentic AI, and Agent Space, plus hands-on RAG experience.
  • Experience with cloud computing concepts, including infrastructure as code (IaC), scalability, security, and cost optimization.
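
To give a concrete sense of the Vertex AI deployment and monitoring skill listed above (again, not part of the original posting), here is a minimal Python sketch using the google-cloud-aiplatform SDK. The project, region, artifact location, serving container image, and feature values are placeholder assumptions.

    # Hypothetical names: project, region, artifact URI, and image are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="example-project", location="us-central1")

    # Register a trained model artifact with the Vertex AI Model Registry.
    model = aiplatform.Model.upload(
        display_name="churn-classifier",
        artifact_uri="gs://example-bucket/models/churn/",
        serving_container_image_uri=(
            "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
        ),
    )

    # Deploy the model to an autoscaling online-prediction endpoint.
    endpoint = model.deploy(
        machine_type="n1-standard-4",
        min_replica_count=1,
        max_replica_count=3,
    )

    # Send a sample prediction request (feature values are illustrative only).
    prediction = endpoint.predict(instances=[[42.0, 3, 0.7]])
    print(prediction.predictions)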

Employment Type

Full-time
