Job Title: Data Engineer
Location: Bangalore (Hybrid)
Interview Mode: Video
Duration: 6 Months, Contract to Hire / Full Time

Core Responsibilities

1. Data Architecture & Strategy
Define enterprise-wide data models, standards, and integration patterns.
Build data platforms on cloud/on-prem (Azure, AWS, GCP, Databricks, Snowflake, etc.).
Establish guidelines for scalability, security, and cost optimization.

2. Data Pipeline & Integration
Design high-performance ETL/ELT pipelines.
Implement real-time streaming (Kafka, Kinesis, Pub/Sub) and batch processing.
Enable interoperability across operational, analytical, and AI systems.

3. Governance & Quality
Ensure compliance with data regulations (GDPR, HIPAA, SAMA, etc.).
Build frameworks for metadata management, lineage, and observability.
Work closely with Data Quality Management and Master Data Management (MDM).

4. Collaboration
Partner with business stakeholders to translate use cases into technical designs.
Guide data engineers, analysts, and scientists on best practices.
Align with enterprise architects and solution architects.

5. Innovation & Future-Readiness
Evaluate new technologies (e.g., lakehouse, vector databases, RAG).
Architect solutions for AI/ML enablement.
Drive automation and adoption of agentic AI in data workflows.

Typical Skill Set
Tech Foundations: SQL, Python/Scala/Java, distributed systems.
Cloud & Data Platforms: Azure Synapse, AWS Redshift, GCP BigQuery, Snowflake, Databricks, Hadoop/Spark.
Data Modeling: OLTP, OLAP, dimensional modeling, Data Vault, Knowledge Graphs.
Integration Tools: Informatica, Talend, dbt, Airflow, Kafka.
Architecture Practices: TOGAF, microservices, event-driven design.
Soft Skills: Stakeholder management, solutioning, and leadership.