
Snowflake Data Engineer

Employer Active

1 Vacancy

Job Location

India

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2741063
Role: Snowflake Data Engineer
Location: India (Remote; work from anywhere in India)
Minimum Qualification: Bachelor's degree or higher in a technology field.
Indicative Experience: 3 Years
Domain: US Healthcare.
Customer Profile: Captive research and development pods for a 500-million-member group of pharma data research companies that help patients gain access to life-saving therapies. We help our clients navigate the complexities at each step of the drug development life cycle, from pipeline to patient.
Other Benefits: Health Insurance, Provident Fund, Life Insurance, Reimbursement of Certification Expenses, Gratuity, 24x7 Health Desk

About the Company
We are headquartered in Pittsburgh, USA, with locations across the globe. We are a team of thoughtful experts driven by the power of our clients' unique ideas. We also have micro offices in Hyderabad, Chennai, Bengaluru, and Delhi NCR in India.
While technical expertise is ingrained in Agilite's DNA, we are more than just engineers and developers; we are trusted product strategists. We pride ourselves on being a ready resource for critical market insights, with the knowledge and experience required to design, build, and scale big ideas to serve our growing list of customers in the USA and Europe.
Our preferred working model is working from anywhere. In addition, you can decide on your own work schedule. All we need is the outcome. Our people-centric culture is built on the belief that extraordinary employees create amazing things. Work with us and attain your ikigai in a place where your aspirations and business objectives intersect.

Key Roles and Responsibilities:
As a Data Engineer, ensure that data is cleansed, mapped, transformed, and otherwise optimized for storage and use according to business and technical requirements (a sketch of such a pipeline follows this list)
Develop and maintain innovative data solutions
Automate tasks (reusability of code) and deploy production-standard code (with unit testing, continuous integration, versioning, etc.)
Load transformed data into storage and reporting structures in destinations including the data warehouse, high-speed indexes, real-time reporting systems, and analytics applications
Build data pipelines that bring disparate data together
Other responsibilities include extracting data, troubleshooting, and maintaining the data warehouse
Use best-practice coding standards, enhancing and adapting them as needed in a fast-paced, agile environment.
Work in a team environment that includes collaborating with Product Owners, stakeholders, and software engineers.
Mentor interns and new hires to help them get up to speed while performing your own responsibilities.
Optionally, grow into a data leader who enables a data culture.
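
As a rough illustration of the cleanse-transform-load flow described above, the sketch below uses Snowpark for Python. Every name in it (RAW.PATIENT_CLAIMS, ANALYTICS.CLAIMS_CLEAN, the connection placeholders) is a hypothetical example rather than a detail of this role, and real credentials should come from a secrets store.

    # Minimal Snowpark sketch: cleanse, transform, and load one table.
    # All object names are hypothetical placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, to_date, trim, upper

    def build_session() -> Session:
        # Assumption: in practice these values come from a config/secrets store.
        return Session.builder.configs({
            "account": "<account>",
            "user": "<user>",
            "password": "<password>",
            "warehouse": "<warehouse>",
            "database": "<database>",
            "schema": "RAW",
        }).create()

    def cleanse_claims(session: Session):
        df = session.table("RAW.PATIENT_CLAIMS")
        return (
            df.filter(col("CLAIM_ID").is_not_null())            # drop unusable rows
              .with_column("PAYER", upper(trim(col("PAYER"))))  # normalize text
              .with_column("SERVICE_DATE",
                           to_date(col("SERVICE_DATE"), "YYYY-MM-DD"))
        )

    if __name__ == "__main__":
        session = build_session()
        # Load the transformed data into a reporting structure.
        cleanse_claims(session).write.mode("overwrite").save_as_table(
            "ANALYTICS.CLAIMS_CLEAN"
        )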

Key Requirements
Bachelor's degree or higher in a technology field.
3-7 years of experience in data engineering.
At least 2-4 years of experience with the Snowflake data warehouse is a must.
o Recent project work should include using Snowflake and deploying production code to it; candidates who are out of touch with Snowflake are not preferred.
At least 2-4 years of experience with Python development is a must.
o Experience with the PySpark and Snowpark APIs would be a plus.
o Experience with the basics of data frames, managing data transformations, and using custom SDKs on top of data frames would be a plus.
o Ability to write unit tests using pytest or spark-testing-base would be a plus (see the pytest sketch after this list).
Experience in the following areas with Snowflake as the datastore would be preferred:
o Managing data modeling using Liquibase
o Managing data pipelines via SnowSQL / Snowpipe (see the Snowpipe sketch after this list)
o Orchestrating data pipelines within or outside of Snowflake.
o Experience with using/coding data quality frameworks such as Soda or Great Expectations.
o Experience with Airflow / AWS Step Functions / Glue / Spark would be preferred and a plus.
Experience in one or more of the below:
o Strong in SQL, SQL analytics, ELT/ETL, and data warehousing.
o Experience with SSIS (nice to have)
o Experience with Azure Synapse / ADF / a cloud ETL tool (nice to have)
Experience with code repositories, CI/CD, and test-driven development is preferred.
o Creating unit tests and automated integration tests is strongly preferred.
Experience with multi-cloud (AWS/Azure/GCP) is a plus; Azure is required, others are preferred.
Experience handling big data at petabyte (PB) scale (preferred)
Data processing experience at scale, generating and consuming large feeds.
Experience with other cloud data warehouses such as Redshift or Azure SQL DW (nice to have).
Experience with data governance programs and data offices would be a plus.
Exposure to claims, payer, formulary, RWE, or EHR/EMR life science/healthcare data is recommended and would be a plus.
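
As a hedged illustration of the pytest item above, the sketch below unit-tests a small Snowpark transformation using Snowpark's local testing mode (available in recent snowflake-snowpark-python releases, which emulate a session in-process); the function, column names, and data are hypothetical.

    # Minimal pytest sketch for a Snowpark transformation, using
    # Snowpark's local testing mode (no live Snowflake account needed).
    import pytest
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, trim, upper

    def normalize_payer(df):
        # Transformation under test: trim and upper-case the PAYER column.
        return df.with_column("PAYER", upper(trim(col("PAYER"))))

    @pytest.fixture
    def session():
        s = Session.builder.config("local_testing", True).create()
        yield s
        s.close()

    def test_normalize_payer(session):
        df = session.create_dataframe(
            [["  acme health "], ["Acme Health"]], schema=["PAYER"]
        )
        result = [row["PAYER"] for row in normalize_payer(df).collect()]
        assert result == ["ACME HEALTH", "ACME HEALTH"]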
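
Likewise, for the SnowSQL/Snowpipe item, here is a sketch of the DDL involved, issued from Python via snowflake-connector-python. The stage, pipe, table, and bucket names are placeholders; a private bucket would also need credentials or a storage integration, and AUTO_INGEST additionally requires cloud event notifications to be configured.

    # Minimal Snowpipe setup sketch using snowflake-connector-python.
    # All object names are hypothetical placeholders.
    import snowflake.connector

    # Assumption: real credentials come from a secrets manager, not literals.
    conn = snowflake.connector.connect(
        account="<account>", user="<user>", password="<password>",
        warehouse="<warehouse>", database="<database>", schema="RAW",
    )

    statements = [
        # External stage pointing at the files to ingest.
        """CREATE STAGE IF NOT EXISTS claims_stage
               URL = 's3://<bucket>/claims/'
               FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)""",
        # Pipe that auto-ingests new files from the stage into a raw table.
        """CREATE PIPE IF NOT EXISTS claims_pipe AUTO_INGEST = TRUE AS
               COPY INTO raw_claims FROM @claims_stage""",
    ]

    cur = conn.cursor()
    try:
        for stmt in statements:
            cur.execute(stmt)
    finally:
        cur.close()
        conn.close()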

data frames, SSIS, code repository, data engineering, CI/CD, big data processing, custom SDKs, AWS Step Functions, data governance, Python development, SQL, Snowpipe, Snowflake data warehouse, PySpark, unit tests, ELT/ETL, Azure Data Factory, multi-cloud experience, Spark, Azure Synapse, Liquibase, Python, Airflow, AWS Glue, Snowflake, SnowSQL, ETL (extract, transform, load), Snowpark APIs

Employment Type

Full Time

Company Industry

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala