Job Summary:
Over 12 years of experience in Data Engineering, including building and maintaining data warehouses and ETL pipelines in data-intensive organizations
Competency in cloud database systems, with a minimum of 3 years of experience with Snowflake
Expertise in developing data models and data structures for data warehouses to support an optimized user experience with BI dashboards and reports
Proficiency in writing and performance tuning complex SQL queries
Skill in Python programming
Knowledge of reporting, business intelligence, and data-wrangling tools
Hands-on experience with cloud platforms such as AWS and Google Cloud
Familiarity with technologies and techniques such as Snowflake, dbt, Airflow, Kafka, Looker or similar BI tools, embedded BI frameworks, CI/CD for data pipelines, Terraform, Spark, and Python
Deep understanding of data architecture principles, data modeling techniques, database technologies, and database security practices
Experience designing logical and physical database schemas for large-scale, highly available, and mission-critical application systems
Strong analytical and problem-solving skills, with the ability to translate complex data into actionable insights
Excellent communication and interpersonal skills
Ability to mentor and guide junior engineers
Experience building Business Intelligence solutions with Google Looker is preferred
Background in healthcare data, especially patient-centric clinical data and provider data, is a plus
Proficiency in API security frameworks, token management, and user access control, including OAuth, JWT, etc.
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. MBA or relevant business qualifications preferred.