Description
Job Title: Technical Specialist
Education: Graduate
Experience: 7 Years
Location: Bangalore/Hyderabad/Mumbai
Key Skills: Snowflake, Matillion, Data Warehouse & SQL.
Required Skills:
- 7 years of ETL and/or Business Intelligence experience
- Proficient SQL writing skills.
- Strong Snowflake developer with extensive development and data analysis experience, needed to build a new, complex data warehouse.
- At least 3 full years of recent Snowflake development experience
- Hands-on experience with Snowflake utilities (SnowSQL, Snowpipe); able to administer and monitor the Snowflake computing platform.
- Develop and implement ETL workflows using Matillion, including data extraction, transformation, and loading.
- Integrate data from diverse sources (e.g., databases, APIs, cloud storage) into Snowflake.
- Optimize data pipelines for performance, efficiency, and cost-effectiveness.
- Work with data analysts, engineers, and other stakeholders to understand data requirements and ensure data quality.
- Troubleshoot issues, perform maintenance tasks, and ensure data integrity.
- Understand data modeling concepts and data warehousing architecture to design robust solutions.
- Hands-on experience with data loading and managing cloud databases.
- Evaluate Snowflake design considerations for any change in the application.
- Build the logical and physical data model for Snowflake as per the required changes.
- Define the roles and privileges required to access different database objects (see the access-control sketch after this list).
- Define virtual warehouse sizing in Snowflake for different types of workloads.
- Design and code the required database structures and components.
- Deploy fully operational data warehouse solutions into production on Snowflake
- Experience creating and modifying user accounts and security groups on request.
- Handle large, complex XML, JSON, and CSV data sets from various sources and databases (see the data-loading sketch after this list).
- Solid grasp of database engineering and design
- Experience using Matillion; understanding of data integration tools.
- Good knowledge of cloud computing on AWS and/or Azure.
- Experience with scripting languages, preferably Python.
- Experience writing code that aggregates and transforms data from multiple data sources
- Experience designing, building, and optimizing analytics models in support of downstream BI platforms.
- Experience with relational databases.
- Knowledge of Git source control and CI/CD.
- Strong technical writing/documentation skills.
- Effective written and oral communication skills.
- Experience with ETL/ELT processes that extract data from different sources, transform it into a usable and trusted resource, and load it into systems that end users can access downstream to solve business problems.
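As a rough illustration of the access-control and warehouse-sizing duties above, the sketch below creates a read-only reporting role and two differently sized virtual warehouses through the snowflake-connector-python driver. All object names, credentials, and sizes (ANALYTICS_DB, REPORTING_RO, ELT_WH, BI_WH) are hypothetical placeholders, not details of this role's actual environment.

```python
# Minimal sketch (assumed names/credentials): define roles, privileges, and
# virtual warehouse sizing in Snowflake via snowflake-connector-python.
import snowflake.connector

DDL = [
    # Read-only role for reporting/BI consumers (placeholder names).
    "CREATE ROLE IF NOT EXISTS REPORTING_RO",
    "GRANT USAGE ON DATABASE ANALYTICS_DB TO ROLE REPORTING_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS_DB.MARTS TO ROLE REPORTING_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS_DB.MARTS TO ROLE REPORTING_RO",
    # Separate warehouses sized per workload: small for ELT batch, medium for BI bursts.
    "CREATE WAREHOUSE IF NOT EXISTS ELT_WH WAREHOUSE_SIZE = 'SMALL' "
    "AUTO_SUSPEND = 60 AUTO_RESUME = TRUE",
    "CREATE WAREHOUSE IF NOT EXISTS BI_WH WAREHOUSE_SIZE = 'MEDIUM' "
    "AUTO_SUSPEND = 120 AUTO_RESUME = TRUE",
    "GRANT USAGE ON WAREHOUSE BI_WH TO ROLE REPORTING_RO",
]

conn = snowflake.connector.connect(
    account="my_account",    # placeholder credentials
    user="platform_admin",
    password="***",
    role="ACCOUNTADMIN",     # in practice, split across SECURITYADMIN/SYSADMIN
)
try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()
```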
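Likewise, for the semi-structured data loading and Snowpipe points above, the following sketch stages JSON files from cloud storage, bulk-loads them with COPY INTO, and defines a Snowpipe for continuous ingestion. Again, every stage, table, and pipe name is a placeholder, and the cloud event-notification setup that Snowpipe needs is omitted.

```python
# Minimal sketch (assumed names): land JSON files from cloud storage into a
# VARIANT column with COPY INTO, then keep loading new files via Snowpipe.
# CSV sources would use FILE_FORMAT = (TYPE = CSV ...) and a columnar table.
import snowflake.connector

LOAD_SQL = [
    # Landing table: one VARIANT column holds each semi-structured record.
    "CREATE TABLE IF NOT EXISTS RAW.EVENTS_JSON (PAYLOAD VARIANT)",
    # External stage over the bucket/container where source files arrive.
    "CREATE STAGE IF NOT EXISTS RAW.EVENTS_STAGE "
    "URL = 's3://example-bucket/events/' FILE_FORMAT = (TYPE = JSON)",
    # One-off bulk load of whatever is already staged.
    "COPY INTO RAW.EVENTS_JSON FROM @RAW.EVENTS_STAGE ON_ERROR = 'CONTINUE'",
    # Continuous micro-batch ingestion as new files land (event wiring omitted).
    "CREATE PIPE IF NOT EXISTS RAW.EVENTS_PIPE AUTO_INGEST = TRUE AS "
    "COPY INTO RAW.EVENTS_JSON FROM @RAW.EVENTS_STAGE",
]

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",   # placeholders
    role="SYSADMIN", warehouse="ELT_WH", database="ANALYTICS_DB",
)
try:
    cur = conn.cursor()
    for stmt in LOAD_SQL:
        cur.execute(stmt)
finally:
    conn.close()
```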
Nice to have:
- Scripting with Python.
- SnowPro Certification.
- Experience with an ETL tool such as Informatica, DataStage, etc.
Required Experience: 7+ years