Data Architect (Dremio Lakehouse)


Job Location: Delhi, India

Monthly Salary: Not Disclosed
Posted on: 19 hours ago
Vacancies: 1 Vacancy

Job Summary

Role & Responsibilities

You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.

  • Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
  • Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
  • Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
  • Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS).
  • Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
  • Support self-service analytics by enabling governed data products and semantic layers.
  • Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
  • Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.

Ideal Candidate

  • Bachelor's or Master's in Computer Science, Information Systems, or a related field.
  • 5 years in data architecture and engineering, with 3 years in Dremio or modern lakehouse platforms.
  • Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
  • Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
  • Knowledge of data integration tools and pipelines (Airflow, DBT, Kafka, Spark, etc.).
  • Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
  • Excellent problem-solving, documentation, and stakeholder communication skills.

Preferred:

  • Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
  • Exposure to Snowflake, Databricks, or BigQuery environments.
  • Experience in high-tech manufacturing or enterprise data modernization programs.
