- Design, develop, and maintain scalable data models and pipelines using SQL and Google BigQuery.
- Support the MLOps lifecycle, including model versioning, monitoring for data drift, and automated retraining loops.
- Manage data warehousing best practices within the Google Cloud Platform (GCP) environment to ensure scalability and cost-efficiency.
- Maintain CI/CD pipelines to ensure seamless deployments across development and production environments.
- Extract, clean, and transform large datasets from multiple sources within the GCP ecosystem.
- Build automated data workflows for advanced data manipulation, statistical analysis, and reporting using Python-based automation.
- Build automated data pipelines to eliminate manual data handling and ensure real-time insight delivery.
- Support the execution of end-to-end data science projects, including data cleaning, feature engineering, model selection, and post-deployment monitoring.
- Collaborate with cross-functional teams (Analytics, Product, Engineering, Finance, Marketing) to define data requirements.
- Ensure data quality, integrity, and governance across all analytics processes.
- Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field.
- Minimum of 5 years of experience in data analytics or a similar role.
- Strong proficiency in SQL (complex queries, optimization, data modelling).
- Hands-on experience with Google BigQuery and the GCP ecosystem.
- Advanced proficiency in Python for data analytics and automation.
- Familiarity with ML libraries such as scikit-learn, TensorFlow, or PyTorch.
- Solid understanding of data warehousing concepts and ETL processes.
- Knowledge of cloud-based data pipelines and orchestration tools (e.g., Airflow).
- Familiarity with version control systems (e.g., Git).
- Experience working in Agile environments.
- Exposure to data governance and security best practices.
- Attention to detail and data accuracy.
- Business acumen and strategic thinking.
- Strong communication and stakeholder management skills.
- Ability to work independently and lead initiatives.
Required Experience:
IC