AI Data Scientist
Job Summary
Gen AI Job Description:
o Experience Level: 3 to 5 Years
o Design, implement, and manage workflows for integrating and deploying GenAI applications on Azure, Amazon, or Snowflake. Analyse systems and applications, provide recommendations for design, enhancement, and development, and play an active part in their execution.
o Platform engineering: Collaborate with other teams to integrate AI solutions into existing workflows and systems, keeping the platform running and available. Configure and manage the underlying infrastructure that supports the platform, ensuring scalability, reliability, and high availability.
o Develop and implement best practices for managing the lifecycle of large language models, including version control, testing, and validation.
o Troubleshoot and resolve issues related to the performance and deployment of large language models.
o Stay up to date with the latest advancements in large language models and operations technologies to continuously improve our AI infrastructure.
o Develop and suggest best practices for designing infrastructure that supports fine-tuning of models to improve performance and efficiency, and troubleshoot any issues that arise during development or deployment.
o Creating and maintaining documentation: Ensure clear and comprehensive documentation of AI/ML/LLM solutions.
o Security integration: GenAI platform engineers weave security best practices throughout the development lifecycle to safeguard the platform from vulnerabilities and data breaches.
o Monitoring and logging: Implement robust monitoring and logging systems following LLMOps best practices, allowing proactive identification and resolution of potential issues.
o Responsible AI guardrails: GenAI platform engineers are responsible for ensuring all Responsible AI metrics are governed through proper system infrastructure and monitoring.
o Data privacy and governance: Ensuring user data privacy and adhering to data governance regulations are paramount considerations for GenAI platform engineers.
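As a concrete illustration of the output-guardrail and logging responsibilities above, here is a minimal sketch of an I/O guardrail check. The patterns and names are hypothetical examples for illustration only; a production deployment would typically rely on a managed guardrail or content-filtering service from the chosen cloud provider.

```python
import logging
import re

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("llm_guardrails")

# Hypothetical denylist patterns; real guardrails would be broader and
# usually provided by a managed service rather than hand-written regexes.
BLOCKED_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),  # SSN-like number (PII)
    re.compile(r"(?i)\bpassword\s*[:=]"),   # credential leakage
]


def check_output(text: str):
    """Return (allowed, text); redact and log when a blocked pattern matches."""
    for pattern in BLOCKED_PATTERNS:
        if pattern.search(text):
            logger.warning("Guardrail triggered by pattern: %s", pattern.pattern)
            return False, "[REDACTED: output blocked by guardrail]"
    return True, text


allowed, safe = check_output("The model's answer is 42.")
blocked, redacted = check_output("password: hunter2")
```

The same hook point (between model response and user) is where the logging called for above would capture prompts, responses, and guardrail verdicts for later review.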
Requirements:
Bachelor's or master's degree in statistics, economics, operations research, data science, computer science, or a related field.
2 years of relevant experience in managing GenAI applications: model monitoring, model validation, implementing I/O guardrails, and FinOps monitoring.
Strong cross-cultural communication and negotiation skills, including the demonstrated ability to solicit opinions and accept feedback, and the ability to manage collaboration effectively across time zones.
Understanding of large language models (OpenAI, Llama, Claude, Arctic, Mistral), how to deploy them on cloud or on-premises, and how to use their APIs to build industry solutions.
Experience with AI/ML frameworks and tools (e.g. LangChain, Semantic Kernel, TensorFlow, PyTorch).
Experience using LLMs in the cloud, e.g. OpenAI on Azure, Amazon Bedrock, and Snowflake Cortex AI.
Familiarity with cloud platforms (e.g. AWS, Azure, Snowflake) and containerization technologies (e.g. Docker, Kubernetes).
Advanced and secure coding experience in at least one language (Python, PySpark, TypeScript).
Exposure to vector, graph, and SQL databases; non-deterministic automated testing; and workflow platforms.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills, and experience operating effectively as part of cross-functional teams.
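To illustrate the FinOps monitoring requirement above, here is a minimal sketch of per-model token-cost tracking. The model name and per-1K-token prices are hypothetical placeholders; actual prices vary by provider and model and should be pulled from the provider's pricing data.

```python
from dataclasses import dataclass, field

# Hypothetical per-1K-token prices for a placeholder model.
PRICES_PER_1K = {
    "model-a": {"input": 0.01, "output": 0.03},
}


@dataclass
class CostTracker:
    """Accumulates token usage per model and reports estimated spend."""
    usage: dict = field(default_factory=dict)

    def record(self, model: str, input_tokens: int, output_tokens: int) -> None:
        in_t, out_t = self.usage.get(model, (0, 0))
        self.usage[model] = (in_t + input_tokens, out_t + output_tokens)

    def estimated_cost(self, model: str) -> float:
        in_t, out_t = self.usage.get(model, (0, 0))
        price = PRICES_PER_1K[model]
        return (in_t / 1000) * price["input"] + (out_t / 1000) * price["output"]


tracker = CostTracker()
tracker.record("model-a", input_tokens=2000, output_tokens=1000)
# 2000/1000 * 0.01 + 1000/1000 * 0.03 = 0.05
```

In practice these counters would be fed from the token-usage fields returned by the LLM API and exported to the monitoring stack described in the responsibilities above.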