Work Location: Pan India
Work Mode: Hybrid (2-3 Days WFO per Week)
Shift Timing: 11:00 AM to 9:00 PM
Mode of Interview: Face-to-Face & Video Interview
Job Description
We are looking for a highly skilled Senior Data Engineer responsible for designing, implementing, optimizing, and maintaining large-scale data solutions across cloud and on-premise environments. The ideal candidate will have strong expertise in Python, SQL, and Azure, and a solid understanding of data warehouse concepts. Experience with Snowflake, cloud-based data warehousing, and the Retail/CPG domain will be an added advantage.
Roles & Responsibilities
1. Data Engineering & Development
- Develop, maintain, and optimize data pipelines and workflows.
- Write high-quality code using Python, SQL, and Object-Oriented Programming principles.
- Work across all phases of the SDLC, including design, development, testing, and implementation.
- Build and maintain enterprise-level Data Warehouses and Data Marts.
2. Data Warehousing & Cloud
- Strong understanding of data warehousing concepts and architecture.
- Experience working in cloud-based data warehousing environments (Azure preferred).
- Integrate data solutions with Azure services such as ADF, Blob Storage, etc.
- Work with Snowflake, BigQuery, SQL Server, MySQL, Oracle, DB2, etc.
3. Snowflake Engineering & Administration (Desired)
- Perform Snowflake performance tuning and cost optimization.
- Manage metadata, monitoring, and usage tracking within Snowflake.
- Implement and manage RBAC (roles, users, access controls).
- Troubleshoot data ingestion, transformation, and query issues.
- Automate administrative tasks using Python or dbt.
4. Additional Technical Responsibilities
- Work with front-end scripting technologies such as React JS (good to have).
- Collaborate with architecture, product, and business teams to understand data requirements.
- Support large-scale data transformations and ensure high data quality.
- Follow Agile or Waterfall methodologies as required.
Mandatory Skills: Python, SQL, Azure, data warehousing concepts
Desired Skills: Snowflake, cloud-based data warehousing, Retail/CPG domain experience