- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, Business Analytics, or a related field.
- 8 years of hands-on experience in business intelligence, analytics, or data engineering, with a focus on data systems, dashboarding, and product analytics.
- Proven track record of building and scaling data pipelines, APIs, and real-time analytics solutions in cloud environments, particularly Azure.
- Experience leading cross-functional teams and collaborating with Product, Strategy, Marketing, and Engineering teams to deliver data-driven strategies and solutions.
- Expertise in SQL, with experience in advanced data modeling and data manipulation techniques.
- Technical Expertise:
  - Strong proficiency in Python or Scala, with hands-on experience building distributed systems and complex data pipelines that support high-volume, real-time data workflows.
  - Proven experience architecting and implementing robust, scalable data pipelines for extract, transform, and load (ETL/ELT) workflows. Ability to design, develop, and deploy APIs to integrate and automate data flows across systems.
  - In-depth knowledge of building scalable systems capable of handling large datasets and high transaction volumes, with strong experience in system architecture ensuring high availability, performance, and reliability of data infrastructure.
  - Hands-on experience with Azure-based tools such as Azure Data Factory, Power BI, and Azure ML for designing and implementing cloud-based data solutions. Familiarity with Azure Databricks or similar platforms (e.g., Snowflake) is highly desirable.
  - Deep understanding and hands-on experience with modern data warehousing solutions (e.g., Snowflake, BigQuery, Redshift), including data modeling, data lake integration, and performance optimization.
  - Strong experience with big data technologies such as Spark and Hadoop, and with distributed processing tools for optimizing data processing and query performance.
  - Expertise in advanced SQL for complex data manipulation, optimization, and data modeling. Proficiency in writing efficient queries against large datasets, ensuring data quality, and applying advanced techniques such as window functions, subqueries, and indexing for performance.
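As an illustration of the window-function skill described above, the following minimal sketch (using Python's built-in sqlite3 and a hypothetical `orders` table, chosen purely for this example) computes a per-customer running total without a correlated subquery:

```python
import sqlite3

# Hypothetical "orders" table, used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', 100), ('alice', 250), ('bob', 75), ('bob', 40);
""")

# Window function: running total per customer in insertion order,
# avoiding a correlated subquery over the full table.
rows = conn.execute("""
    SELECT customer, amount,
           SUM(amount) OVER (PARTITION BY customer
                             ORDER BY rowid) AS running_total
    FROM orders
""").fetchall()
print(rows)
```

On large tables, the same pattern would typically be paired with an index on the partition key to keep the query efficient.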
  - Experience with tools such as dbt, Airflow, Prefect, and Fivetran for managing data workflows and automating tasks.
  - Familiarity with RESTful API design for integrating diverse data sources and ensuring seamless flow across data pipelines.
  - Strong experience building interactive, insightful data visualizations using tools like Power BI, Tableau, or custom web-based solutions.
  - Proficiency in JavaScript for implementing custom visualizations, building dynamic dashboards, and interacting with APIs in the front end.
  - Knowledge and hands-on experience with messaging queues (e.g., Kafka, RabbitMQ, SQS) for reliable communication and data processing in distributed systems, including managing real-time data ingestion and event-driven architectures.
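The producer/consumer pattern behind those messaging systems can be sketched in-process with Python's standard-library `queue` and `threading` modules. This is an assumption-laden toy, not Kafka or RabbitMQ: a real deployment would use a durable, networked broker, but the event flow is the same shape.

```python
import queue
import threading

# In-process stand-in for a message broker (illustrative only; a real
# system would publish to a durable broker such as Kafka or RabbitMQ).
events = queue.Queue()
processed = []

def producer():
    # Publish a handful of events, then a sentinel meaning "stream done".
    for i in range(5):
        events.put({"event_id": i, "payload": f"record-{i}"})
    events.put(None)

def consumer():
    # Consume until the sentinel arrives, recording each event id.
    while True:
        event = events.get()
        if event is None:
            break
        processed.append(event["event_id"])

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(processed)  # single producer + FIFO queue => consumed in publish order
```

With one producer and one consumer over a FIFO queue, delivery order matches publish order; distributed brokers only guarantee this per partition, which is why partitioning strategy matters in real event-driven designs.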