Local LLM Chat UI

Basic Usage

This repo uses Ollama, which should be downloaded and installed for your OS. Install the Python requirements:

pip install -r requirements.txt

Then run the app:

streamlit run app.py

Contents

Using a local LLM of your choice, you can carry out the usual chat, get help with code, and so on. You can also use the RAG page to generate responses grounded in the documents you provide.
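
The chat page amounts to a Streamlit chat loop in front of a local Ollama model. Below is a minimal, hypothetical sketch of that pattern, not the repo's actual code; the model name "llama3" and the single-turn complete() call are assumptions.

# Hypothetical sketch of a Streamlit chat loop against a local Ollama model.
import streamlit as st
from llama_index.llms.ollama import Ollama

llm = Ollama(model="llama3", request_timeout=120.0)  # assumed model name

if "history" not in st.session_state:
    st.session_state.history = []

# Replay the conversation so far.
for role, text in st.session_state.history:
    st.chat_message(role).write(text)

if prompt := st.chat_input("Ask the local LLM"):
    st.chat_message("user").write(prompt)
    reply = llm.complete(prompt).text  # single-turn completion, no chat memory
    st.chat_message("assistant").write(reply)
    st.session_state.history += [("user", prompt), ("assistant", reply)]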

Details

  • LLM is made available with Ollama
  • LLM integration uses llama-index
  • Vector store for document embeddings with chromadb (see the sketch below for how these fit together)
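
A rough sketch of how these three pieces can be wired together for the RAG page is shown below. It is illustrative only, not the repo's actual code, and assumes llama-index 0.10+ with the llama-index-llms-ollama, llama-index-embeddings-ollama and llama-index-vector-stores-chroma integration packages, a running Ollama server, and hypothetical model names and paths ("llama3", "nomic-embed-text", ./data, ./chroma_db).

# Hypothetical RAG wiring: Ollama for generation/embeddings, chromadb for storage.
import chromadb
from llama_index.core import (
    Settings,
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
)
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.chroma import ChromaVectorStore

# Point llama-index at the local Ollama server for both generation and embeddings.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Persist document embeddings in a local chromadb collection.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("docs")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# Index the documents and answer a question grounded in them.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
print(index.as_query_engine().query("What do these documents cover?"))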
