Job Description: Data Architect
Location: Sunrise, FL
Work Mode: Onsite from Day 1
Experience: 12-18 years
Employment Type: Contract
Role Summary
We are seeking a seasoned Data Architect with deep expertise in enterprise data platforms to design, build, and optimize large-scale data solutions. The role requires strong hands-on capabilities across Big Data technologies, cloud-native data services (GCP), and customer-centric data architectures such as Customer 360.
Key Responsibilities
- Architect and lead enterprise data warehouse and data lake solutions.
- Design and implement scalable Big Data pipelines using Hadoop, Spark, SQL, and Python.
- Build and support Customer Data Management / Customer 360 architectures.
- Perform deep data exploration, profiling, and advanced analytics to support business use cases.
- Design and deploy solutions on Google Cloud Platform (GCP), leveraging:
  - Dataflow
  - Dataproc
  - Kubernetes / containerized workloads
- Collaborate closely with business, analytics, and engineering teams in an onsite environment.
- Ensure data quality, performance, scalability, and governance best practices.
Required Qualifications
- 12-18 years of overall IT experience with a strong focus on data architecture.
- Extensive hands-on experience with Big Data and the Hadoop ecosystem.
- Strong programming and query skills in SQL, Python, and/or Spark.
- Solid understanding of data warehousing concepts, dimensional modeling, and analytics.
- Proven experience with GCP data services (Dataflow, Dataproc) and containers (Kubernetes).
- Excellent communication and interpersonal skills.