
update arxiv link
CiaoHe committed Nov 29, 2023
1 parent 70c61bf commit 454b761
Showing 2 changed files with 20 additions and 16 deletions.
18 changes: 10 additions & 8 deletions README.md
@@ -1,9 +1,9 @@
# InstructMol: Multi-Modal Integration for Building a Versatile and Reliable Molecular Assistant in Drug Discovery
Codes for our paper
Codes for our paper *InstructMol: Multi-Modal Integration for Building a Versatile and Reliable Molecular Assistant in Drug Discovery*

<!-- *Visual instruction tuning towards large language and vision models with GPT-4 level capabilities.*-->

<!--[[Project Page](https://llava-vl.github.io/)] [[Paper](https://arxiv.org/abs/2304.08485)] [[Demo](https://llava.hliu.cc/)] [[Data](https://github.com/haotian-liu/LLaVA/blob/main/docs/Data.md)] [[Model Zoo](https://github.com/haotian-liu/LLaVA/blob/main/docs/MODEL_ZOO.md)] -->
[[Project Page](https://idea-xl.github.io/InstructMol/)] [[Paper](https://arxiv.org/pdf/2311.16208.pdf)]

## Overview
<p align="center">
@@ -120,14 +120,16 @@ See [Evaluation.md](Evaluation.md) for detailed instructions on how to evaluate

## Citation
If you find InstructMol useful for your research and applications, please cite using this BibTeX:
<!-- ```bibtex
@misc{liu2023llava,
title={Visual Instruction Tuning},
author={Liu, Haotian and Li, Chunyuan and Wu, Qingyang and Lee, Yong Jae},
publisher={arXiv:2304.08485},
```bibtex
@misc{cao2023instructmol,
title={InstructMol: Multi-Modal Integration for Building a Versatile and Reliable Molecular Assistant in Drug Discovery},
author={He Cao and Zijing Liu and Xingyu Lu and Yuan Yao and Yu Li},
year={2023},
eprint={2311.16208},
archivePrefix={arXiv},
primaryClass={q-bio.BM}
}
``` -->
```
## Acknowledgement
18 changes: 10 additions & 8 deletions docs/index.html
@@ -115,7 +115,7 @@ <h1 class="title is-1 publication-title">InstructMol: Multi-Modal Integration fo
<div class="publication-links">
<!-- PDF Link. -->
<span class="link-block">
<a href="https://github.com/IDEA-XL/InstructMol"
<a href="https://arxiv.org/pdf/2311.16208.pdf"
class="external-link button is-normal is-rounded is-dark">
<span class="icon">
<i class="fas fa-file-pdf"></i>
@@ -124,7 +124,7 @@ <h1 class="title is-1 publication-title">InstructMol: Multi-Modal Integration fo
</a>
</span>
<span class="link-block">
<a href="https://github.com/IDEA-XL/InstructMol"
<a href="https://arxiv.org/abs/2311.16208"
class="external-link button is-normal is-rounded is-dark">
<span class="icon">
<i class="ai ai-arxiv"></i>
@@ -341,11 +341,13 @@ <h2 class="title is-3">Related Links</h2>
<section class="section" id="BibTeX">
<div class="container is-max-desktop content">
<h2 class="title">BibTeX</h2>
<pre><code>@article{park2021nerfies,
author = {Park, Keunhong and Sinha, Utkarsh and Barron, Jonathan T. and Bouaziz, Sofien and Goldman, Dan B and Seitz, Steven M. and Martin-Brualla, Ricardo},
title = {Nerfies: Deformable Neural Radiance Fields},
journal = {ICCV},
year = {2021},
<pre><code>@misc{cao2023instructmol,
title={InstructMol: Multi-Modal Integration for Building a Versatile and Reliable Molecular Assistant in Drug Discovery},
author={He Cao and Zijing Liu and Xingyu Lu and Yuan Yao and Yu Li},
year={2023},
eprint={2311.16208},
archivePrefix={arXiv},
primaryClass={q-bio.BM}
}</code></pre>
</div>
</section>
@@ -355,7 +357,7 @@ <h2 class="title">BibTeX</h2>
<div class="container">
<div class="content has-text-centered">
<a class="icon-link"
href="./static/videos/nerfies_paper.pdf">
href="https://arxiv.org/pdf/2311.16208.pdf">
<i class="fas fa-file-pdf"></i>
</a>
<a class="icon-link" href="https://github.com/CiaoHe" class="external-link" disabled>
