Role: Solution Architect
Location: Phoenix, AZ (Onsite; local candidates only)
Type: Contract
Job Description
- This role is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as the maintenance and operations strategy for CHP.
- This role involves designing and implementing data systems that organize, store, and manage data within our cloud data platform.
- The architect will perform continuous maintenance and operations work for CHP in the cloud environment.
- They will review and analyze CHP's data infrastructure, plan future database solutions, and implement systems to support data management for CHP users.
- Additionally, this role is accountable for ensuring data integrity and making sure the CHP team adheres to data governance standards to maintain accuracy, consistency, and reliability across all systems.
- The architect will identify data discrepancies and quality issues and work to resolve them. This position requires a strong blend of architectural leadership, technical depth, and the ability to collaborate with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions.
- The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.
Key Responsibilities
- Design scalable data lake and data architectures using Databricks and cloud-native services.
- Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
- Optimize data workloads and performance.
- Define data governance frameworks for CHP.
- Design and develop robust data pipelines.
- Architect AI systems, including RAG workflows and prompt engineering.
- Lead cloud migration initiatives from legacy systems to modern data platforms.
- Provide architectural guidance, best practices, and technical leadership across teams.
- Build documentation, reusable modules, and standardized patterns.
Required Skills and Experience
- Strong expertise in cloud platforms, primarily Azure or AWS.
- Hands-on experience with Databricks.
- Deep proficiency in Python and SQL.
- Expertise in building ETL/ELT pipelines and Azure Data Factory (ADF) workflows.
- Experience architecting data lakes and implementing data governance frameworks.
- Hands-on experience with CI/CD, DevOps, and Git-based development.
- Ability to translate business requirements into technical architecture.
Additional Information:
All your information will be kept confidential according to EEO guidelines.
Remote Work:
No
Employment Type:
Contract