Position: Databricks Architect with GCP
Location: Bolingbrook, IL (onsite 3 days biweekly)
Duration: 1 year
Mandatory Skills: Strong expertise in GCP platform services, Apache Spark, Delta Lake, Databricks SQL, Python, and SQL.
Job Description:
10+ years of experience in data engineering or data architecture on Big Data platforms.
3+ years of hands-on experience with Databricks platform architecture.
Strong expertise in Apache Spark, Delta Lake, Databricks SQL, Python, SQL, and GCP platform services.
Responsibilities:
Architect and implement scalable Lakehouse data platforms using Databricks and Delta Lake.
Design robust batch and streaming data pipelines leveraging Apache Spark Structured Streaming and modern ELT patterns.
Lead migration of jobs from other cloud data platforms to Databricks.
Implement secure data governance, access control, and lineage using Unity Catalog.
Architect integrations with cloud platforms such as Google Cloud Platform.
Optimize performance and manage compute costs through efficient cluster configuration and Spark workload tuning.
Collaborate with data engineers, analytics teams, and ML engineers to enable scalable data products and analytics.
Define best practices for data quality, reliability, and observability across the data platform.
Provide technical leadership architecture guidance and mentorship to data engineering teams.