Client Name: Nous
End Client: A client in the chip manufacturing domain
Job Title: Data Architect (Databricks)
Location: Santa Clara, CA (Fully Onsite) - local candidates only (within 50 miles)
Work Type: Onsite
Job Type: Contract (6 Months)
Rate: $70/hr on W2 (USC/GC only)
Independent candidates only (own corp or W2)
Strong experience in Databricks, AWS, and Snowflake is required.
Notes:
- Fully onsite from day one
- Duration: 6 months
- Must have strong Databricks and Snowflake experience
- A LinkedIn profile is a MUST
About the Role
We're seeking a visionary Data Architect with deep expertise in Databricks to lead the design, implementation, and optimization of our enterprise data architecture. You'll be instrumental in shaping scalable data solutions that empower analytics, AI, and business intelligence across the organization.
If you thrive in a fast-paced environment, love solving complex data challenges, and have a passion for cloud-native platforms like Databricks on AWS, we want to hear from you.
Key Responsibilities
- Design and implement robust, scalable, and secure data architectures using Databricks, Spark, Delta Lake, and cloud-native tools.
- Collaborate with data engineers, analysts, and business stakeholders to define data models, pipelines, and governance strategies.
- Develop and maintain data lakehouses, ensuring optimal performance and cost-efficiency.
- Define best practices for data ingestion, transformation, and storage using Databricks notebooks, jobs, and workflows.
- Architect solutions for real-time and batch data processing.
- Ensure data quality, lineage, and compliance with internal and external standards.
- Lead migration efforts from legacy systems to modern cloud-based data platforms.
- Mentor junior team members and evangelize data architecture principles across the organization.
Required Skills & Qualifications
- 12 years of experience in data architecture, with 5 years of hands-on Databricks experience.
- Strong experience in Snowflake.
- Experience with AWS cloud platforms, especially Databricks on AWS.
- Strong proficiency in Apache Spark, Delta Lake, and PySpark.
- Experience with data modeling, ETL/ELT pipelines, and data warehousing.
- Familiarity with CI/CD, DevOps, and Infrastructure as Code (Terraform, ARM templates).
- Knowledge of data governance, security, and compliance frameworks.
- Excellent communication and stakeholder management skills.
Preferred Qualifications
- Databricks Certified Data Engineer or Architect.
- Experience with MLflow, Unity Catalog, and Lakehouse architecture.
- Background in machine learning, AI, or advanced analytics.
- Experience with tools like Apache Airflow, dbt, or Power BI/Tableau.