# BioMamba

BioMamba is a pre-trained Mamba architecture model specifically designed for the biomedical domain.

## Resources

## Project Status

For this draft version of the project, please follow the instructions in `pretrain/readme.md` to run the pretraining script step by step.

This project is currently under development. The pretraining code, larger model versions, and more detailed experiments will be released as soon as possible.

Stay tuned for updates!

## How to Cite

If you use BioMamba in your research, please cite the following paper:

```bibtex
@article{yue2024biomamba,
  title={BioMamba: A pre-trained biomedical language representation model leveraging Mamba},
  author={Yue, Ling and Xing, Sixue and Lu, Yingzhou and Fu, Tianfan},
  journal={arXiv preprint arXiv:2408.02600},
  year={2024}
}
```