Job Title: Data Engineer
Experience: 7 Years
Location: Chennai (On-site)
Employment Type: Full-time
Role Overview
We are looking for an experienced Data Engineer to build, optimize, and manage scalable data pipelines and architectures. The ideal candidate will have strong expertise in modern data platforms, with hands-on experience in Snowflake and cloud-based data solutions.
Key Responsibilities
- Design, build, and maintain scalable data pipelines and ETL/ELT processes
- Develop and optimize data models and data warehouse solutions
- Work extensively with Snowflake for data storage, transformation, and performance tuning
- Collaborate with BI, analytics, and product teams to deliver clean and reliable datasets
- Ensure data quality, integrity, and governance across systems
- Optimize query performance and cost efficiency in Snowflake
- Integrate data from multiple sources (APIs, databases, third-party systems)
- Implement data security and access controls
Requirements
Required Skills & Qualifications
- 7 years of experience in Data Engineering or related roles
- Strong expertise in Snowflake (data modeling, performance tuning, optimization)
- Advanced SQL skills and experience with large-scale data processing
- Hands-on experience with ETL/ELT tools (e.g., Airflow, Informatica, dbt, or similar)
- Experience with cloud platforms such as AWS / Azure / GCP
- Strong understanding of data warehousing concepts (star schema, snowflake schema, etc.)
- Experience with Python or Scala for data processing
- Knowledge of data pipeline orchestration and scheduling
Preferred Skills
- Experience with big data technologies (Spark, Hadoop)
- Familiarity with streaming tools (Kafka, Kinesis)
- Experience with CI/CD pipelines and DevOps practices
- Exposure to data governance and data security best practices
Key Competencies
- Strong problem-solving and analytical skills
- Ability to work with cross-functional teams
- Good communication and stakeholder management
- High ownership and accountability
Required Skills & Experience
Core Technical Skills
- Strong proficiency in Python, SQL, and PySpark
- Hands-on expertise with Kafka, Kafka Connect, Debezium, Airflow, and Databricks
- Deep experience with BigQuery, Snowflake, MySQL, Postgres, and MongoDB
- Solid understanding of vector data stores and search indexing
- Knowledge of GCP services such as BigQuery, Cloud Functions, Cloud Run, Dataflow, Dataproc, and Datastream
Good-to-Have Certifications
- GCP Professional Data Engineer
- Elastic Certified Engineer
- AI: Gemini Enterprise, Vertex AI Agent Builder, ADK
Non-Technical & Leadership Skills
- Communication: Exceptional verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences
- Mentorship & Coaching: Proven experience in mentoring junior and mid-level engineers, fostering a culture of continuous learning and growth
- Problem-Solving: Strong analytical and debugging skills, with a proactive approach to identifying and resolving technical roadblocks
- Ownership & Accountability: Demonstrates a high level of responsibility for project outcomes, system reliability, and code quality
- Agile Proficiency: Deep understanding and practical experience with Agile methodologies (Scrum/Kanban)
- Stakeholder Management: Ability to effectively manage expectations and build consensus across different teams
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field (or equivalent practical experience)
- Typically 7 years of progressive experience in data engineering, with 2 years in a technical leadership or lead engineer role