Job Summary:
The Solution Architect, Data is responsible for designing, modernizing, and optimizing enterprise-scale data platforms within a cloud environment. This role involves building scalable data architectures, implementing robust data pipelines, and ensuring data integrity through strong governance practices. The architect will collaborate with various teams to deliver secure, reliable, and high-performance data solutions, including those supporting AI-driven and advanced analytics use cases.
Location: Phoenix, Arizona, United States
Responsibilities:
- Design and implement scalable data lake and cloud-native data architectures.
- Develop metadata-driven ingestion frameworks and multi-layer data models.
- Build and optimize ETL/ELT pipelines and data workflows.
- Define and enforce data governance, quality, and integrity standards.
- Lead cloud migration initiatives from legacy systems to modern platforms.
- Architect AI-enabled data solutions, including RAG (retrieval-augmented generation) workflows and prompt engineering.
- Provide architectural leadership, best practices, and technical guidance.
- Create reusable components, standardized patterns, and technical documentation.
Required Skills & Certifications:
- Strong experience with cloud platforms (Azure or AWS).
- Hands-on expertise with Databricks.
- Advanced proficiency in Python and SQL.
- Experience building ETL/ELT pipelines and orchestration workflows (ADF or equivalent).
- Experience designing data lakes and governance frameworks.
- CI/CD, DevOps, and Git-based development experience.
- Ability to translate business requirements into technical architecture.
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 6 years of experience in data engineering, data architecture, or related development roles.
Preferred Skills & Certifications:
- Experience with Spark, Hadoop, Kafka, and Hive.
- Experience with Redshift and SQL Server.
- Experience with ADF and SSIS.
Special Considerations:
- In-person interview required.
Scheduling:
- Not specified.