drjobs
Snowflake Engineer for Amazon Data Firehouse

Employer Active

Job Location

Us - France

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Req ID: 2741035
Role: Snowflake Engineer for Amazon Data Firehouse
Location: India (Remote, Work from Anywhere in India)
Minimum Qualification: BS degree in Computer Science, Information Systems, or equivalent experience
Indicative Experience: 4 Years
Domain: US Healthcare
Customer Profile: Captive research and development pods for a 500 million-member group of pharma data research companies that help patients gain access to lifesaving therapies. We help our clients navigate the complexities at each step of the drug development life cycle, from pipeline to patient.
Other Benefits: Health Insurance, Provident Fund, Life Insurance, Reimbursement of Certification Expenses, Gratuity, 24x7 Health Desk
About the Company
We are headquartered in Pittsburgh, USA, with locations across the globe. We are a team of thoughtful experts driven by the power of our clients' unique ideas. We also have micro offices in Hyderabad, Chennai, Bengaluru, and Delhi NCR in India.
While technical expertise is ingrained in Agilites' DNA, we are more than just engineers and developers; we are trusted product strategists. We pride ourselves on being a ready resource for critical market insights, with the knowledge and experience required to design, build, and scale big ideas to serve our growing list of customers in the USA and Europe.
Our preferred working model is working from anywhere. In addition, you can decide your own work schedule; all we need is the outcome. Our people-centric culture is built on the belief that extraordinary employees create amazing things. Work with us and attain your Ikigai in a place where your aspirations and business objectives intersect.

ABOUT THE ROLE
We are seeking an experienced Data Engineer with a background in Data Warehouse & Technology to manage and deliver our expanding program of work for integrating business applications with the data warehouse. The Data Engineer will be responsible for designing, implementing, deploying, and supporting various data management technologies and architectures.
You will be a critical team member: developing the means to collect and ingest data, developing data models and data engines, creating automated data pipelines, and taking the lead in making them production-ready. You will assist with integrating existing applications and will have the opportunity to accelerate delivery and improve the quality of enterprise data and insight.

Key Responsibilities:
Analyse data sources and prepare data ingestion pipelines.
Stage data, assess data quality, and cleanse the data.
Design and implement ETL jobs and transformations to populate a data lake and data warehouse.
Integrate data pipelines to the enterprise data warehouse following the architecture principles.
Perform and monitor data loads and optimize data for extraction and reporting use.
Administer the data warehouse by performing appropriate database management functions (e.g., maintaining space availability, setting indexing, monitoring utilization and job performance, checking database integrity) to ensure optimum capacity and data warehouse performance.
Monitor, report on, and analyze usage trends to maintain quality control and high performance of data retrieval from a database or other data storage.
Administer user access in line with data governance policies.
Participate in data workshops as necessary.
Collaborate with business and technology partners to grow and develop the data engineering practice.
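The stage/cleanse/load flow described in the responsibilities above can be sketched in miniature. This is an illustrative sketch only: the record layout, field names, and cleansing rules are invented, and sqlite3 stands in for the actual warehouse (Snowflake in this role).

```python
import csv
import io
import sqlite3

# Hypothetical raw extract from a source application (fields are illustrative).
RAW_CSV = """patient_id,visit_date,charge
1001,2024-01-05,250.00
1002,,120.50
1001,2024-01-05,250.00
1003,2024-02-10,not_a_number
1004,2024-03-01,75.25
"""

def stage(raw: str) -> list[dict]:
    """Stage: parse the raw extract into records without altering values."""
    return list(csv.DictReader(io.StringIO(raw)))

def cleanse(rows: list[dict]) -> list[dict]:
    """Cleanse: drop rows with missing dates or non-numeric charges,
    and de-duplicate exact repeats."""
    seen, clean = set(), []
    for row in rows:
        if not row["visit_date"]:          # missing date -> reject
            continue
        try:
            row["charge"] = float(row["charge"])
        except ValueError:                  # non-numeric charge -> reject
            continue
        key = (row["patient_id"], row["visit_date"], row["charge"])
        if key in seen:                     # exact duplicate -> reject
            continue
        seen.add(key)
        clean.append(row)
    return clean

def load(rows: list[dict], conn: sqlite3.Connection) -> int:
    """Load: write cleansed rows into a warehouse table
    (sqlite3 is a stand-in for Snowflake here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS visits "
                 "(patient_id TEXT, visit_date TEXT, charge REAL)")
    conn.executemany("INSERT INTO visits VALUES (?, ?, ?)",
                     [(r["patient_id"], r["visit_date"], r["charge"])
                      for r in rows])
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM visits").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(cleanse(stage(RAW_CSV)), conn)
print(loaded)  # 2 of the 5 raw rows survive cleansing
```

In a production pipeline each of these steps would typically be an orchestrated task (e.g., in Apache Airflow) rather than a direct function call, with the load step targeting Snowflake staging tables.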

Essential:
4 years of experience working in cloud technology, especially the AWS stack.
AWS data stack experience is a must (AWS Glue), along with Apache Airflow experience.
Strong Python, T-SQL, and C# development skills (minimum 4 years of hands-on experience) for building and refining data pipelines are required.
Data modelling experience with data warehouse concepts such as fact tables and lookup tables, including designing and creating database objects such as tables, views, stored procedures, functions, and row-level security, is a must.
4 years of experience working with data lakehouse platforms, especially Snowflake.
Good analytical skills, with the ability to learn quickly and synthesize information.
Experience in fast-moving projects with tight deadlines.
Strong communication skills, both verbal and written.
The successful candidate will be expected to communicate effectively with both business and technical teams when troubleshooting issues.
Excellent debugging, problem-solving, and testing skills.
Good interpersonal skills; capable of working individually and as part of a team.
BS degree in Computer Science, Information Systems, or equivalent experience.
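The fact-table and lookup-table modelling called for above follows the usual star-schema pattern: a narrow lookup (dimension) table of labels and a fact table of measures that references it. A minimal sketch, with sqlite3 standing in for the warehouse and all table, column, and data values invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Lookup (dimension) table: one row per therapy; the fact table references it.
conn.executescript("""
CREATE TABLE dim_therapy (
    therapy_id   INTEGER PRIMARY KEY,
    therapy_name TEXT NOT NULL
);
CREATE TABLE fact_prescription (
    prescription_id INTEGER PRIMARY KEY,
    therapy_id      INTEGER NOT NULL REFERENCES dim_therapy(therapy_id),
    quantity        INTEGER NOT NULL
);
INSERT INTO dim_therapy VALUES (1, 'Therapy A'), (2, 'Therapy B');
INSERT INTO fact_prescription VALUES (10, 1, 3), (11, 1, 2), (12, 2, 5);
""")

# Typical warehouse query: aggregate the facts, joining to the lookup
# table only to attach human-readable labels.
rows = conn.execute("""
    SELECT d.therapy_name, SUM(f.quantity) AS total_qty
    FROM fact_prescription f
    JOIN dim_therapy d USING (therapy_id)
    GROUP BY d.therapy_name
    ORDER BY d.therapy_name
""").fetchall()
print(rows)  # [('Therapy A', 5), ('Therapy B', 5)]
```

The same shape carries over to Snowflake, where views, stored procedures, and row-level security policies would be layered on top of tables like these.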

Non-Essential:
Real-time data processing with Apache Kafka.
Experience in working with / supplying data to visualization tools such as Qlik, Tableau, Power BI, or similar.
Good understanding of data integration patterns.
Experience with / exposure to software development for analytic applications.

c#, debugging, communication skills, data warehouse, aws glue, problem solving, data lakehouse, snowflake, python, transact-sql (t-sql), data modelling, information systems, aws, bs degree in computer science, data lakes, apache airflow, t-sql, extract, transform, load (etl), analytical skills

Employment Type

Full Time

Disclaimer: Drjobpro.com is only a platform that connects job seekers and employers. Applicants are advised to conduct their own independent research into the credentials of the prospective employer. We always make certain that our clients do not endorse any request for money payments; thus, we advise against sharing any personal or bank-related information with any third party. If you suspect fraud or malpractice, please contact us via the contact us page.