- I am currently a postdoctoral researcher at the Gaoling School of Artificial Intelligence, Renmin University of China. I earned my PhD from the University of Montreal, where I was advised by Prof. Jian-Yun Nie.
- I completed my master's (2019) and bachelor's (2016) degrees at Renmin University of China, under the guidance of Prof. Zhicheng Dou and Prof. Ji-Rong Wen, working on a range of NLP problems.
- Research interests: Retrieval-augmented generation, large language models for information retrieval, session-based document ranking
- Personal page: https://daod.github.io/
- Google Scholar: https://scholar.google.com/citations?user=tBqVOWsAAAAJ
- DBLP: https://dblp.org/pid/71/9704-1.html
- 2024.5: We released a new toolkit, ⚡FlashRAG, which helps implement RAG methods quickly! See more details.
- 2024.5: Congrats! Three of our papers have been accepted by ACL 2024!
- 2024.4: We released a new survey on generative information retrieval. See more details.
- 2024.1: We proposed INTERS, a new instruction-tuning dataset for unlocking the power of LLMs on search tasks. See more details.
- 2023.11: We analyzed the risk of data leakage in LLM pre-training and released a new paper to raise awareness of this problem. See more details.
- 2023.8: We released a new survey on applying large language models to information retrieval. See more details.
- 2023.8: We released a new version of YuLan-Chat. It outperforms the official LLaMA-2 and LLaMA-2-Chat on the MMLU, C-Eval, and AGI-Gaokao benchmarks!