Industry Group: Automotive (manufacturing industry experience is a MUST HAVE).
Job Title: Senior Databricks Solution Architect - R1012520
Location: Plano, TX (local to the Dallas area; onsite at the client office 3 days/week)
Duration: 12-month contract (potential for extension)
Pay Rate: $85 - $90
Custom Skill Requirements:
- Databricks: Strong hands-on experience with Databricks (clusters, notebooks, Delta Lake, MLflow, Unity Catalog).
- Cloud Platforms: Experience with at least one cloud provider (AWS, Azure, GCP).
- Data Engineering: Strong proficiency in Spark, Python, SQL, and distributed data processing.
- Architecture: Experience designing large-scale data solutions, including ingestion, transformation, storage, and analytics.
- Streaming: Experience with streaming technologies (Structured Streaming, Kafka, Kinesis, Event Hubs); see the illustrative sketch after this list.
- DevOps: CI/CD practices for data pipelines (Azure DevOps, GitHub Actions, Jenkins, etc.).
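For context, the sketch below is purely illustrative of the pipeline work these requirements describe; it is not taken from the client's environment. It assumes a Databricks (or Spark 3.1+) runtime with the Kafka connector available, and the broker address, topic, target table, and checkpoint path are placeholder names.

    # Illustrative sketch only: Structured Streaming ingest from Kafka into a Delta table.
    # Broker, topic, table, and checkpoint path are placeholders, not real endpoints.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("telemetry-ingest").getOrCreate()

    # Read the raw event stream from Kafka (assumed broker and topic names).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "vehicle-telemetry")
        .load()
        .select(
            col("key").cast("string").alias("key"),
            col("value").cast("string").alias("payload"),
            col("timestamp").alias("event_ts"),
        )
    )

    # Append to a Delta table; the checkpoint gives the stream its exactly-once guarantees.
    query = (
        events.writeStream.format("delta")
        .option("checkpointLocation", "/tmp/checkpoints/vehicle_telemetry")
        .outputMode("append")
        .toTable("bronze.vehicle_telemetry")
    )
    query.awaitTermination()

On Databricks the same pattern is often expressed with Auto Loader or Delta Live Tables; the plain Structured Streaming form is shown only because it is the most portable.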
Qualifying Questions:
- Are you able to design end-to-end data architectures using the Databricks Lakehouse Platform?
- Do you have strong communication skills, with the ability to engage both technical and business teams?
- Do you have experience in the manufacturing industry?
Job Description:
We are looking for a highly skilled Databricks Solution Architect to lead the design and implementation of scalable, enterprise-grade data platforms using Databricks. The ideal candidate will combine strong technical expertise in data engineering and cloud platforms (AWS/Azure/GCP) with architectural leadership, solution design capability, and strong stakeholder engagement skills.
Key Responsibilities:
Solution Architecture & Design:
- Design end-to-end data architectures using the Databricks Lakehouse Platform.
- Architect scalable ETL/ELT pipelines, real-time streaming solutions, and advanced analytics platforms.
- Define data models, storage strategies, and integration patterns aligned with business and enterprise architecture standards.
- Provide guidance on cluster configuration, performance optimization, cost management, and workspace governance.
Technical Leadership:
- Lead technical discussions and design workshops with engineering teams and business stakeholders.
- Provide best practices, frameworks, and reusable component designs for consistent delivery.
- Perform code reviews and provide technical mentoring to data engineers and developers.
Stakeholder & Project Engagement:
- Collaborate with product owners, business leaders, and analytics teams to translate business requirements into scalable technical solutions.
- Create and present solution proposals, architectural diagrams, and implementation strategies.
- Support pre-sales or discovery phases with technical input when needed.
Data Governance, Security & Compliance:
- Define and implement governance standards across Databricks workspaces (data lineage, cataloging, access control, etc.); an illustrative governance sketch follows this section.
- Ensure compliance with regulatory and organizational security frameworks.
- Implement best practices for monitoring, auditing, and data quality management.
Continuous Improvement & Innovation:
- Stay updated on Databricks features, roadmap, and industry trends.
- Recommend improvements, optimizations, and modernization opportunities across the data ecosystem.
- Evaluate integration of complementary technologies (Delta Live Tables, MLflow, Unity Catalog, streaming frameworks, etc.).
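As a purely illustrative companion to the Data Governance, Security & Compliance responsibilities above, the sketch below issues Unity Catalog access grants from a Databricks notebook. The catalog, schema, table, and group names are invented placeholders, not actual client objects.

    # Illustrative sketch only: Unity Catalog access control issued from a notebook.
    # Catalog, schema, table, and principal names are placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Coarse-grained access at the catalog and schema level, then table-level read access.
    spark.sql("GRANT USE CATALOG ON CATALOG manufacturing TO `data_engineers`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA manufacturing.quality TO `data_engineers`")
    spark.sql("GRANT SELECT ON TABLE manufacturing.quality.defect_rates TO `plant_analysts`")

    # Review effective grants as part of an access audit.
    spark.sql("SHOW GRANTS ON TABLE manufacturing.quality.defect_rates").show(truncate=False)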
Required Skills & Experience:
Technical Skills:
- Databricks Expertise: Strong hands-on experience with Databricks (clusters, notebooks, Delta Lake, MLflow, Unity Catalog).
- Cloud Platforms: Experience with at least one cloud provider (AWS, Azure, GCP).
- Data Engineering: Strong proficiency in Spark, Python, SQL, and distributed data processing.
- Architecture: Experience designing large-scale data solutions, including ingestion, transformation, storage, and analytics.
- Streaming: Experience with streaming technologies (Structured Streaming, Kafka, Kinesis, Event Hubs).
- DevOps: CI/CD practices for data pipelines (Azure DevOps, GitHub Actions, Jenkins, etc.).
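To make the CI/CD item above concrete, here is a hedged sketch of the kind of unit test that might gate a data pipeline in Azure DevOps, GitHub Actions, or Jenkins. The transformation, column names, and sample rows are invented for illustration, and a local SparkSession stands in for a Databricks cluster.

    # Illustrative sketch only: a pytest-style test for a pipeline transformation,
    # runnable in a CI job. Function, columns, and sample data are invented.
    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql.functions import col, row_number
    from pyspark.sql.window import Window

    def dedupe_latest(df: DataFrame) -> DataFrame:
        # Keep only the most recent record per vehicle_id, ordered by event_ts.
        w = Window.partitionBy("vehicle_id").orderBy(col("event_ts").desc())
        return df.withColumn("rn", row_number().over(w)).filter(col("rn") == 1).drop("rn")

    def test_dedupe_latest():
        spark = SparkSession.builder.master("local[2]").appName("ci-test").getOrCreate()
        df = spark.createDataFrame(
            [("v1", "2024-01-01"), ("v1", "2024-02-01"), ("v2", "2024-01-15")],
            ["vehicle_id", "event_ts"],
        )
        rows = dedupe_latest(df).collect()
        assert len(rows) == 2
        assert {r["event_ts"] for r in rows} == {"2024-02-01", "2024-01-15"}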
Preferred Qualifications:
- Databricks Certified Data Engineer Professional / Architect certification.
- AWS/Azure/GCP cloud architect certifications.
- Experience with BI tools (Tableau, Power BI, Looker).
- Experience in machine learning workflows and ML operations.
- Background in large-scale data modernization or cloud migration projects.
Soft Skills:
- Strong communication skills with the ability to engage both technical and business teams.
- Experience working in Agile environments.
- Ability to simplify complex technical concepts for non-technical audiences.
- Strong analytical, problem-solving, and decision-making abilities.