Manifold Bio builds AI models for protein therapeutic design, trained on proprietary experimental data generated at unprecedented scale. Our in vivo-centric discovery platform produces millions of experimentally validated protein designs per campaign, creating the datasets that make our models possible and our approach uniquely powerful. We combine high-throughput protein engineering with computational design to create antibody-like drugs and other biologics. Our world-class team of protein engineers, biologists, and computational scientists is working together to aim the platform at therapeutic opportunities where precise targeting is the key to overcoming clinical challenges.
Position
Manifold's AI team is actively training protein foundation models on our proprietary experimental datasets. Our generative antibody design model, mBER, has already demonstrated controllable de novo binder design across multiple million-scale screening campaigns, and the team is now scaling foundation model capabilities to push well beyond current performance. We are looking for an AI/ML Scientist to join this effort. You will work alongside our existing model training team to accelerate the development of foundation models fine-tuned on Manifold's data, bringing additional depth in pre-training methodology, architecture development, and large-scale training. Your work will directly improve mBER's design capabilities and unlock new modeling paradigms for the broader team. You'll own foundation model projects end-to-end, from architecture selection and training infrastructure to evaluation against real experimental outcomes, while contributing to the team's shared research agenda.
Responsibilities
- Advance the team's ongoing foundation model training efforts: pretraining, fine-tuning, and evaluating folding, docking, language, and generative design models on Manifold's proprietary experimental data
- Bring depth in training methodology, architecture selection, and optimization to complement the existing team's expertise
- Develop and scale training pipelines for distributed multi-GPU and multi-node training runs
- Integrate foundation model outputs into mBER to improve binder design success rates and enable new design capabilities
- Design and execute ML experiments with clear hypotheses, rigorous evaluation frameworks, and systematic analysis
- Establish best practices for mixed-precision training, gradient checkpointing, and computational efficiency at scale
- Produce clear documentation and analysis supporting architecture and training decisions
Required Qualifications
- Demonstrated experience pretraining and/or fine-tuning protein foundation models (folding, docking, language models, or generative design), with published or otherwise demonstrable results
- Strong familiarity with AlphaFold architecture and training methodology
- 2 years of hands-on experience with PyTorch and/or JAX for deep learning
- Experience with large-scale model training: distributed training, multi-GPU/multi-node setups, mixed precision, gradient checkpointing
- Solid understanding of deep learning architectures (transformers, attention mechanisms, diffusion/flow matching) and optimization techniques
- Experience working with protein structure data (PDB, mmCIF) and/or protein sequence datasets
- Strong statistical analysis and experimental design skills
- Proficiency in the Python scientific computing stack (NumPy, Pandas, scikit-learn)
- Self-directed researcher who can balance guidance with independence
- Excellent written and verbal communication skills for cross-functional collaboration
Preferred Qualifications
- Experience with protein generative design methods (e.g., RFdiffusion, ProteinMPNN, flow matching approaches)
- Experience with protein language models (e.g., the ESM family)
- Published research in computational biology, protein design, or structural biology
- Experience training on proprietary or domain-specific biological datasets
- Familiarity with Ray for distributed computing
- Experience with Kubernetes (EKS) and cloud computing platforms (AWS)
- Knowledge of protein engineering, directed evolution, or structural biology wet lab techniques
- Experience working with agentic AI coding tools for fast parallelized execution of modeling experiments
- Previous biotech/pharma industry experience
This Role Might Be Perfect For You If
- You have deep experience training protein foundation models and want to apply that expertise to some of the richest proprietary experimental datasets in the field
- You're excited about pushing beyond public model performance by leveraging unique large-scale in vivo screening data
- You thrive in high-ownership roles where you can drive research direction while collaborating with a tight-knit, world-class team
- You want your models to directly impact real drug discovery programs
If you're excited to train the next generation of protein foundation models on uniquely powerful experimental data, please reach out to
We value different experiences and ways of thinking, and believe the most talented teams are built by bringing together people of diverse cultures, genders, and backgrounds.
Required Experience:
IC