Benefits:
- Dental insurance
- Health insurance
- Paid time off
Greetings of the day
Position: Google Cloud Platform Data Engineer (Mid-Level)
Location: 2-3 days a week on-site in Ashburn, VA
Work Authorization: US Citizen
Start Date: 10/01/2025 (if cleared before that date)
Required Clearance: DHS/CBP or TS clearance
We're looking for a highly skilled Google Cloud Platform (GCP) Data Engineer to design, build, and maintain secure, scalable, and intelligent cloud data infrastructure. You'll be the go-to expert for GCP services, real-time data processing, and integration with AI/ML solutions. If you have deep knowledge of GCP's networking, storage, database, and data engineering capabilities, along with Kafka and AI/ML experience, we want to hear from you.
Responsibilities:
- Design and implement secure, scalable, and high-performing data pipelines and infrastructure on Google Cloud.
- Manage and optimize real-time streaming platforms such as Apache Kafka, Pub/Sub, and Dataflow.
- Build, manage, and tune diverse GCP database services (Cloud SQL, Cloud Spanner, BigQuery, Bigtable, Firestore, Memorystore) to support transactional and analytical workloads.
- Implement ETL/ELT pipelines for structured, semi-structured, and unstructured data using GCP-native and open-source tools.
- Collaborate with data scientists and ML engineers to operationalize AI/ML models on Vertex AI, integrating data pipelines with predictive and generative AI services.
- Oversee identity and access management (IAM) and ensure security compliance.
- Monitor system and pipeline performance, troubleshoot issues, and optimize infrastructure and data costs.
- Manage container orchestration platforms such as Google Kubernetes Engine (GKE) for data workloads.
- Develop and maintain Infrastructure as Code (IaC) with Terraform or Cloud Deployment Manager.
- Collaborate with application and analytics teams to ensure data availability, governance, and reliability.
Required Qualifications:
- Proven experience as a Data Engineer, Cloud Engineer, or Infrastructure Engineer.
- 6 years of experience with cloud and data technologies, including at least 2 years on GCP.
- Hands-on experience with streaming technologies (Kafka, Pub/Sub) and batch data processing (Dataflow, Dataproc).
- Strong knowledge of multiple database systems (SQL, NoSQL, relational, distributed).
- Experience with BigQuery and data warehouse optimization.
- Exposure to AI/ML pipelines and tools on GCP (Vertex AI, AI Platform, TensorFlow, or PyTorch integration).
- Strong understanding of networking, firewalls, VPNs, and cloud security.
- Solid experience with Infrastructure as Code (Terraform, Deployment Manager).
- Proficiency in scripting (Python, Bash) and CI/CD pipelines (Git, GitLab, Jenkins, or Harness).
- GCP certification (e.g., Professional Data Engineer or Professional Cloud Architect) is a plus.
- Ability to obtain and maintain a Public Trust or Suitability/Fitness determination based on client requirements.
- Existing DHS/CBP or TS clearance is a strong plus.
If you are interested in this position, please send a copy of your latest resume along with the information requested below, and let us know the best time and number to call to discuss this role. If this position is not the right fit for you, feel free to share this opportunity with your network. Thank you!
- Availability for technical interview
- Updated Resume
- Best rates
- Contact number
Please don't hesitate to reach out with any questions. All employment decisions are based on qualifications, merit, and business needs.
Regards,
Flexible work from home options available.
Compensation: $140,000.00 per year