Detecting and Mitigating Self Contradictory Hallucinations in LLMs using a Multi-Agent System and Stepback Prompting
Updated May 30, 2024 · Jupyter Notebook
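The repository's title combines two ideas: stepback prompting (asking the model a more abstract question first, then answering the concrete one grounded in that abstraction) and a multi-agent check for self-contradictions. As a rough illustration of how those pieces fit together, here is a minimal sketch with a stubbed, deterministic model call standing in for a real LLM API; `call_model`, the prompt wording, and the canned responses are all hypothetical assumptions, not the repository's actual implementation.

```python
def call_model(prompt: str) -> str:
    # Hypothetical stub: a real system would call an LLM API here.
    if "general principle" in prompt:
        return "Principle: check claims against known physical constants."
    return "The speed of light is about 299,792 km/s."

def stepback_answer(question: str) -> str:
    # Step 1: ask the more abstract "step-back" question first.
    principle = call_model(f"What general principle underlies: {question}?")
    # Step 2: answer the original question grounded in that principle.
    return call_model(f"Using the principle '{principle}', answer: {question}")

def contradicts(answer_a: str, answer_b: str) -> bool:
    # Toy "checker agent": flags a self-contradiction when two answers to
    # the same question disagree (a real checker would be another LLM call).
    return answer_a.strip() != answer_b.strip()

question = "How fast does light travel in a vacuum?"
first, second = stepback_answer(question), stepback_answer(question)
print("contradiction detected:", contradicts(first, second))
```

With the deterministic stub, the two sampled answers agree, so no contradiction is flagged; with a real model, disagreement between samples is what would trigger mitigation.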