Job Title: Data Architect - Data Migration & Compliance Platform Modernization
Location: Lakshmi Colony, T. Nagar, Chennai (work from office for the first 3 months)
Duration: Long-term (C2H)
Role Summary:
We are seeking a highly experienced, hands-on Data Architect to lead and deliver the end-to-end data migration from Databricks and Teradata to BigQuery, including complete ETL/ELT pipeline design, dataset migration, and platform modernization. The candidate will also be responsible for understanding the existing audit compliance monitoring application and building a new data pipeline into the next-generation audit platform, ensuring high data quality, governance, and traceability.
Key Responsibilities:
Data Migration & Platform Delivery
Lead and execute data migration strategy from Databricks and Teradata to Google BigQuery
Analyze and migrate existing data models, views, datasets, and ETL pipelines
Redesign schemas for optimized performance and compliance in BigQuery
Identify and address data quality, format, lineage, and validation issues
Audit Compliance Application Transition
Understand the current audit compliance monitoring application
Identify data flows, lineage, metrics, alerts, and critical data sets used
Redesign and build pipelines to feed data into the new audit compliance platform
Ensure compliance with audit trail, data retention, and security policies
ETL/ELT Architecture
Design and implement scalable, efficient ETL/ELT workflows using tools such as Apache Airflow, dbt, Dataflow, or custom frameworks
Ensure reusability, observability, and resilience of data pipelines
Work closely with DevOps for CI/CD of data pipelines
Data Design & Governance
Build logical and physical data models; define data contracts and standards
Enable data quality rules, validation frameworks, and governance mechanisms
Define partitioning, clustering, and cost-optimization strategies for BigQuery
Stakeholder Management & Leadership
Collaborate with engineering, QA, product, data analyst, compliance, and audit teams
Capture requirements, translate them into technical designs, and estimate workloads
Lead and mentor a team of data engineers and assign responsibilities effectively
Communicate status, risks, dependencies, and results to senior leadership
Required Skills & Experience:
Data Platforms:
Deep hands-on expertise in BigQuery, Teradata, and Databricks (Delta Lake, Spark)
Good understanding of Google Cloud Platform (GCP) services: GCS, Pub/Sub, Dataflow, Composer
Tools & Frameworks:
Proficiency in SQL, Python, and orchestration tools (Airflow, dbt, etc.)
Experience with data pipeline frameworks and batch & streaming architectures
Familiarity with audit logging, compliance monitoring, and data security
Strategic & Functional:
Strong experience in data architecture, data modeling (3NF, star, snowflake), and schema design
Ability to reverse-engineer data flows and audit systems
Proven experience in data migration and modernization programs
Experience working with audit, compliance, or risk applications is a strong plus
Communication & Leadership:
Experience working with cross-functional teams in an Agile/DevOps environment
Excellent skills in requirement analysis, effort estimation, and task allocation
Ability to lead technically while staying hands-on in design and delivery
Preferred Qualifications:
10 years of experience in data engineering, data architecture, or BI
Certifications in GCP (Professional Data Engineer or Architect)
Background in audit, compliance, banking, or other regulated environments
Deliverables Expected:
Migration plan and roadmap for all datasets
Target data models, data dictionaries, and lineage documentation
ETL/ELT pipeline code and documentation
Data validation test scripts
Migration success criteria and sign-off reports
Soft Skills:
Detail-oriented, organized, and analytical thinker
Strong communication and presentation skills
Proactive, adaptable, and thrives in a fast-moving environment
Employment Type: Full-Time