Position: Big Data Developer (GCP & Dataflow)
Location: Alpharetta, GA (Onsite)
Minimum Experience: 10 years
Job Description
We are seeking an experienced Senior Big Data Developer with strong expertise in Big Data technologies, Google Cloud Platform (GCP), and Dataflow. The ideal candidate will have around 8 years of hands-on experience designing, building, and maintaining scalable data pipelines and cloud-based data solutions.
Key Responsibilities
- Design, develop, and optimize scalable Big Data pipelines on GCP
- Build and manage data processing workflows using Google Cloud Dataflow
- Work with large datasets for batch and real-time data processing
- Ensure data quality, reliability, and performance across pipelines
- Collaborate with cross-functional teams, including analytics, product, and engineering
- Monitor, troubleshoot, and resolve data pipeline issues
- Implement best practices for data security and governance on GCP
Required Skills & Qualifications
- 8 years of experience in Big Data
- Strong hands-on experience with GCP services
- Expertise in Google Cloud Dataflow
- Experience with Big Data tools and frameworks (e.g., Spark, Hadoop, Kafka)
- Strong knowledge of SQL and data modeling
- Proficiency in programming languages such as Python or Java
- Experience building ETL/ELT pipelines