BioBridge: Bridging biomedical foundation models via knowledge graphs
2024
Foundation models (FMs) learn from large volumes of unlabeled data and demonstrate superior performance across a wide range of tasks. However, FMs developed for biomedical domains have largely remained unimodal, i.e., independently trained and used for tasks on protein sequences alone, small-molecule structures alone, or clinical data alone. To overcome this limitation, we present BioBridge, a parameter-efficient learning framework that bridges independently trained unimodal FMs to establish multimodal behavior. BioBridge achieves this by utilizing knowledge graphs (KGs) to learn transformations between the embedding spaces of unimodal FMs, without fine-tuning any of the underlying models. Our results demonstrate that BioBridge beats the best baseline KG embedding methods (on average by ∼76.3%) in cross-modal retrieval tasks. BioBridge also demonstrates out-of-domain generalization by extrapolating to unseen modalities or relations. Additionally, we show that BioBridge can serve as a general-purpose retriever that aids biomedical multimodal question answering and enhances the guided generation of novel drugs.
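To make the bridging idea concrete, the sketch below shows one plausible way to connect two frozen unimodal FMs through a small trainable module supervised by KG triples. This is a minimal illustration, not the paper's implementation: the relation-conditioned MLP bridge, the InfoNCE-style contrastive loss, the embedding dimensions, and the placeholder random embeddings standing in for frozen FM outputs are all assumptions made for readability.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class BridgeModule(nn.Module):
    """Hypothetical bridge: maps frozen source-FM embeddings into the target FM's
    embedding space, conditioned on the KG relation type."""

    def __init__(self, src_dim: int, tgt_dim: int, num_relations: int):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, src_dim)
        self.proj = nn.Sequential(
            nn.Linear(src_dim * 2, tgt_dim),
            nn.ReLU(),
            nn.Linear(tgt_dim, tgt_dim),
        )

    def forward(self, src_emb: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # Concatenate the source embedding with a learned relation embedding,
        # then project into the target modality's embedding space.
        rel = self.rel_emb(rel_ids)
        return self.proj(torch.cat([src_emb, rel], dim=-1))


def info_nce_loss(pred: torch.Tensor, tgt: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """Contrastive loss: each projected head embedding should be closest to its own tail."""
    pred = F.normalize(pred, dim=-1)
    tgt = F.normalize(tgt, dim=-1)
    logits = pred @ tgt.t() / temperature
    labels = torch.arange(pred.size(0))
    return F.cross_entropy(logits, labels)


# Toy training step on placeholder embeddings standing in for frozen FM outputs
# (e.g., a protein-sequence FM with 640-dim outputs and a small-molecule FM with 512-dim).
protein_emb = torch.randn(32, 640)       # frozen protein FM embeddings (head entities)
molecule_emb = torch.randn(32, 512)      # frozen molecule FM embeddings (tail entities)
relations = torch.randint(0, 8, (32,))   # KG relation ids for each (head, relation, tail) triple

bridge = BridgeModule(src_dim=640, tgt_dim=512, num_relations=8)
optimizer = torch.optim.Adam(bridge.parameters(), lr=1e-4)  # only the bridge is trained

projected = bridge(protein_emb, relations)
loss = info_nce_loss(projected, molecule_emb)
loss.backward()
optimizer.step()
```

At inference time, such a bridge would support cross-modal retrieval by projecting a query entity's embedding into the target modality's space and ranking candidates by nearest-neighbor similarity, with the unimodal FMs kept frozen throughout.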