Hi, we have a W2 job opportunity for the role of Senior GCP Data Engineer with AI/LLM; the job description is below. Please forward your updated profile to or 1 .
Role: Senior GCP Data Engineer With AI/LLM (W2 Position)
Location: Dearborn MI (Hybrid)
Duration: 12 Months
Experience: 10 Years
Note: please don't share profiles of candidates who are not comfortable working on W2.
JD:
Skills Required:
GCP, Big Data, Data Warehousing, Artificial Intelligence & Expert Systems, API
- GCP: Experience deploying and managing services on Google Cloud Platform, including Compute Engine, Cloud Storage, IAM, and Cloud Functions. For example, designing and implementing a cloud-native application architecture using GKE (Google Kubernetes Engine) with Cloud SQL and Pub/Sub.
- Big Data: Experience working with large-scale data processing frameworks such as Apache Spark, Dataflow, or BigQuery. For example, building ETL pipelines that process terabytes of daily event data and transform it for downstream analytics.
- Data Warehousing: Experience designing and maintaining data warehouse solutions (e.g., BigQuery, Snowflake, Redshift). For example, modeling a star schema for a retail analytics platform that supports reporting on sales, inventory, and customer behavior.
- Artificial Intelligence & Expert Systems: Experience developing or integrating AI/ML models and rule-based expert systems. For example, building a classification model using Vertex AI to predict customer churn, or implementing a rule engine that automates underwriting decisions.
- API: Experience designing, building, and consuming RESTful or gRPC APIs. For example, developing a versioned REST API with OAuth 2.0 authentication that serves as the integration layer between a mobile application and backend microservices.
Skills Preferred:
Google Cloud Platform
- Google Cloud Platform: Familiarity with advanced GCP services beyond core compute and storage, such as Vertex AI, Dataflow, Cloud Composer (Airflow), and BigQuery ML. For example, using Cloud Composer to orchestrate scheduled data pipelines that feed into a BigQuery data warehouse.
Experience Required:
Senior Engineer: 10 years of Data Engineering work experience
Experience Preferred:
- Operational Excellence: Using Terraform, Git, and Airflow to ensure reproducible, secure, and cost-optimized cloud infrastructure.
- Governance & Quality: Prioritizing data lineage, PII protection, and observability to maintain high trust in data assets.
- Collaboration: Acting as a bridge between technical teams (Data Science, Security) and business stakeholders to deliver self-service analytics.
- Strong understanding of Generative AI principles and architectures, including Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems.
- Proven experience in building and deploying RAG systems including the use of **Vector Databases**.
- Proficiency in Python programming.
- Solid experience with SQL for data manipulation and querying.
- Hands-on experience with Google Cloud Platform (GCP) services relevant to AI/ML.
- Basic understanding and practical experience with Machine Learning model fine-tuning.
- Familiarity with data engineering concepts and practices.
- Expertise in prompt engineering techniques for interacting with LLMs.
- Experience with the OpenAI SDK.
- Experience developing robust APIs preferably with **FastAPI**.
- Proficiency with **version control systems (e.g., Git)**.
- Experience with **containerization technologies (e.g., Docker)**.
Thanks & Regards
Azmeera koti
Megan Soft Inc.
17177 N. Laurel Park Dr, Suite #337
Livonia, MI 48152
Direct No: 1
E-mail:
Website:
LinkedIn ID: