MUST HAVE
Hands-on experience writing complex SQL queries using joins, self joins, views, materialized views, cursors, and recursive queries, along with GROUP BY and PARTITION BY functions and SQL performance tuning
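The SQL skills above can be sketched in a small, self-contained example. The table and column names below are purely illustrative assumptions, and the snippet runs against in-memory SQLite only so that it is runnable anywhere; the same recursive-CTE and window-function syntax carries over to most warehouse dialects.

```python
import sqlite3

# Hypothetical employees table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (id INTEGER, name TEXT, manager_id INTEGER, dept TEXT, salary INTEGER);
INSERT INTO employees VALUES
  (1, 'Asha', NULL, 'Eng', 120),
  (2, 'Ben',  1,    'Eng', 100),
  (3, 'Chen', 2,    'Eng', 90),
  (4, 'Dia',  1,    'Ops', 95);
""")

# Recursive CTE: walk the reporting chain down from the top-level manager.
chain = conn.execute("""
WITH RECURSIVE reports(id, name, depth) AS (
  SELECT id, name, 0 FROM employees WHERE manager_id IS NULL
  UNION ALL
  SELECT e.id, e.name, r.depth + 1
  FROM employees e JOIN reports r ON e.manager_id = r.id
)
SELECT name, depth FROM reports ORDER BY depth, name
""").fetchall()

# Window function: rank salaries within each department (PARTITION BY).
ranked = conn.execute("""
SELECT name, dept,
       RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rnk
FROM employees ORDER BY dept, rnk
""").fetchall()
```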
Hands-on experience with ETL and dimensional data modelling, including Slowly Changing Dimensions (Types 1, 2, and 3)
o Good understanding of concepts such as schema types and table types (fact, dimension, etc.)
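A Type 2 Slowly Changing Dimension keeps history by closing the current row and inserting a new version. A minimal sketch of that mechanic, assuming a hypothetical dim_customer table (all names here are illustrative, not from this posting):

```python
import sqlite3

# Illustrative SCD Type 2 dimension table; NULL valid_to marks the current row.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
  customer_id INTEGER,
  city TEXT,
  valid_from TEXT,
  valid_to TEXT,
  is_current INTEGER
)
""")
conn.execute("INSERT INTO dim_customer VALUES (42, 'Pune', '2023-01-01', NULL, 1)")

def apply_scd2(conn, customer_id, new_city, effective):
    """On a changed attribute, close the current row and insert a new version."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if cur and cur[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (effective, customer_id),
        )
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, effective),
        )

apply_scd2(conn, 42, 'Mumbai', '2024-06-01')
history = conn.execute(
    "SELECT city, valid_from, valid_to, is_current FROM dim_customer ORDER BY valid_from"
).fetchall()
```

By contrast, a Type 1 change would simply overwrite city in place, and Type 3 would keep one previous value in an extra column.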
Proficiency in Python scripting and programming using Pandas, PyParsing, and Airflow.
o Pandas, Tableau Server modules, NumPy, datetime, and Apache Airflow-related modules and APIs
o Setting up Python scripts on DataLab, scheduling processes, and connecting with the data lake (S3, etc.)
o Data pipeline automation
o Strong Python programming skills
o Apache Kafka with Python (using client libraries such as Confluent's librdkafka or kafka-python to produce and consume messages from Kafka topics)
o Experience building streaming applications, data pipelines, microservices, etc.
Should have an understanding of Snowflake architecture, with experience designing and building solutions.
o Architecture and design aspects: performance tuning, Time Travel, warehouse concepts, scaling, clustering, and micro-partitioning
o Experience with SnowSQL and Snowpipe
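As one hedged illustration of the Snowpipe item, an auto-ingesting pipe definition typically looks like the following sketch; the pipe, table, and stage names are assumptions for illustration only, not from this posting:

```sql
-- Illustrative only: sales_pipe, raw_sales, and @sales_stage are hypothetical names.
CREATE OR REPLACE PIPE sales_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_sales
  FROM @sales_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);
```

With AUTO_INGEST enabled, Snowpipe loads new files as they arrive in the stage, rather than on a manual COPY schedule.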
Good to Have: Experience with Snowflake performance-optimization techniques
Experience with Vertica and SingleStore
Lead: Experience interacting with the business, independently developing and leading data projects, collaborating with offshore teams, and owning overall project delivery.
Actively participating in discussions with the business to understand requirements, perform thorough impact analysis, and provide suitable solutions.