Mid-Data Engineer with Python & Snowflake Jefferies

Employer Active

1 Vacancy
Job Location

Pune - India

Monthly Salary

Not Disclosed


Job Description

Job Title: Data Engineer with Python & Snowflake - Pune

About Us
Capco a Wipro company is a global technology and management consulting firm. Awarded with Consultancy of the year in the British Bank Award and has been ranked Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With our presence across 32 cities across globe we support 100 clients across banking financial and Energy sectors. We are recognized for our deep transformation execution and delivery.
WHY JOIN CAPCO
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry.
MAKE AN IMPACT
We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.
#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity and creativity.
CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.
DIVERSITY & INCLUSION
We believe that diversity of people and perspectives gives us a competitive advantage.

Role Description:
Key Skills: Data Engineering, Python, Snowflake, AWS, Git/Bitbucket
Experience: 9 years
Location: Hinjewadi, Pune
Shift timings: 12:30 PM - 9:30 PM
3 days WFO (Tue, Wed, Thu)

Technical Requirements

Python & Snowflake Engineer with AI/Cortex Development
4 years of experience developing data engineering and data science projects using the Snowflake AI Data Cloud platform on AWS. Snowpark experience preferred. Experience with different data modeling techniques is required.
4 years of experience with Python development, using tools such as VS Code or Anaconda, version control with Git or Bitbucket, and Python unit testing frameworks.
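The unit-testing requirement above can be illustrated with a minimal sketch using Python's built-in unittest framework; the function name normalize_symbol and its behavior are illustrative, not part of the role description.

```python
import unittest

def normalize_symbol(raw: str) -> str:
    """Strip whitespace and upper-case a ticker symbol (illustrative helper)."""
    return raw.strip().upper()

class NormalizeSymbolTest(unittest.TestCase):
    def test_strips_and_uppercases(self):
        self.assertEqual(normalize_symbol("  aapl "), "AAPL")

# Run the suite programmatically so the snippet works outside a test runner.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeSymbolTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same test class runs unchanged under pytest, which auto-discovers unittest-style cases.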
Experience building Snowflake applications using the Snowflake AI/Cortex platform (specifically Cortex Agents, Cortex Search and Cortex LLM), with an understanding of context enrichment using prompts or Retrieval-Augmented Generation (RAG) methods.
Deep understanding of object-oriented programming in Python with data structures such as Pandas DataFrames, and of writing clean, maintainable engineering code.
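A minimal sketch of what "object-oriented programming with Pandas DataFrames" might look like in practice: a small class that encapsulates a DataFrame behind a typed interface. The TradeBook class and its columns are assumptions for illustration only.

```python
import pandas as pd

class TradeBook:
    """Thin object-oriented wrapper around a pandas DataFrame of trades."""

    def __init__(self, trades: pd.DataFrame):
        self._trades = trades

    def notional(self) -> pd.Series:
        # Per-row notional value: price * quantity.
        return self._trades["price"] * self._trades["qty"]

    def total_notional(self) -> float:
        return float(self.notional().sum())

book = TradeBook(pd.DataFrame({"price": [10.0, 20.0], "qty": [3, 2]}))
```

Keeping the DataFrame private (`self._trades`) and exposing intent-revealing methods is one common way to keep such code maintainable.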
Understanding of multi-threading concepts and concurrency implementation in Python, including server-side custom Python modules.
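One common concurrency pattern this requirement covers is fanning out I/O-bound work over a thread pool via concurrent.futures; the fetch function below is a hypothetical stand-in for a real API or database call.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(record_id: int) -> int:
    # Placeholder for an I/O-bound call (API request, DB read, ...).
    return record_id * 2

# map() dispatches calls across worker threads but returns results in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fetch, range(5)))
```

Because of the GIL, threads in CPython help with I/O-bound work; CPU-bound work would typically use processes instead.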
Implementing object-relational mapping (ORM) in Python using frameworks such as SQLAlchemy or equivalent.
Proficient in developing and deploying Python applications, such as Lambda functions, on the AWS Cloud platform.
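The AWS Lambda requirement above follows a well-known handler signature; the sketch below shows its shape with a hypothetical greeting payload (packaging, IAM roles and deployment are not shown).

```python
import json

def lambda_handler(event, context):
    """AWS Lambda-style handler: event is the input payload, context holds
    runtime metadata (unused here). Returns an API Gateway-shaped response."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello {name}"}),
    }

# Invoke locally the way the Lambda runtime would.
response = lambda_handler({"name": "Capco"}, None)
```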
Proficient in deploying web applications on AWS using Docker containers or Kubernetes, with experience using CI/CD pipelines.
Proficient in developing applications with Snowpipe and Snowpark, moving data from cloud sources such as AWS S3, and handling unstructured data from data lakes.
Proficient in Snowflake account hierarchy models and account-role-permissions strategy.
Proficient in data sharing, preferably using the internal Data Marketplace and Data Exchanges for various listings.
Proficient in data governance and security concepts within Snowflake, including row- and column-level dynamic data masking using Snowflake tags.
Good understanding of input query enrichment using Snowflake YAML definitions and integration with LLMs within Snowflake.
Good understanding of relevance search and of building custom interaction applications with LLMs.
Nice to have: experience building Snowflake native applications using Streamlit and deploying them onto AWS Cloud instances (EC2 or Docker containers).
Continuously improves functionality through experimentation, performance tuning and customer feedback.
Nice to have: cache implementation experience within Python web applications; DuckDB with Apache Arrow experience is also a plus.
Nice to have: experience implementing CI/CD pipelines for Snowflake applications.
Strong analytical and problem-solving skills, with the ability to communicate technical concepts clearly.
Experience using Agile and Scrum methodologies, preferably with JIRA.

If you are keen to join us, you will be part of an organization that values your contributions, recognizes your potential and provides ample opportunities for growth. For more information, visit . Follow us on Twitter, Facebook, LinkedIn and YouTube.

Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments, and we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.