token_visualizer is a token-level visualization tool for large language models (LLMs).
Run the following commands to install the package:

```shell
git clone git@github.com:FateScript/token_visualizer.git
cd token_visualizer
pip install -v -e .  # or python3 setup.py develop
```
If you can see the path of token_visualizer by running the following command, the installation succeeded:

```shell
python3 -c "import token_visualizer; print(token_visualizer.__file__)"
```
Run the following command to start the inference visualizer:

```shell
python3 visualizer.py
```
The command starts an `OpenAIProxy` model by default. To use it without exceptions, fill in the values of `BASE_URL` and `OPENAI_KEY`.
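How these two values are consumed depends on `models.py`; as a hedged sketch, assuming the proxy can pick them up from environment variables, one way to supply them before launching is (the endpoint and key below are placeholders):

```python
# Hypothetical sketch: supply the proxy settings via environment variables.
# Check models.py for how BASE_URL and OPENAI_KEY are actually read.
import os

os.environ["BASE_URL"] = "https://your-proxy.example.com/v1"  # placeholder endpoint
os.environ["OPENAI_KEY"] = "your-api-key"                     # placeholder key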
token_visualizer also supports `OpenAIModel` and the HuggingFace `TransformerModel` in `models.py`; feel free to modify the code.
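As a rough illustration of what a HuggingFace-backed model does underneath (the wrapper in `models.py` may differ; the model id and arguments below are placeholders, not the repo's defaults):

```python
# Illustrative only: a minimal HuggingFace generation path.
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "gpt2"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name)

inputs = tokenizer("Hello, world", return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=20,
    return_dict_in_generate=True,
    output_scores=True,  # per-step logits, from which token probabilities can be derived
)
print(tokenizer.decode(out.sequences[0], skip_special_tokens=True))
```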
After entering your prompt, you will see the large language model's answer and the visualization of that answer.
The redder a token, the lower its probability; the greener a token, the higher its probability.
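To illustrate the idea of the color mapping (not necessarily the exact algorithm the project uses), a token probability can be mapped onto a red-to-green gradient like this:

```python
# Illustrative sketch: low probability -> red, high probability -> green.
def prob_to_rgb(prob: float) -> str:
    """Linearly interpolate between red (p=0) and green (p=1)."""
    p = max(0.0, min(1.0, prob))
    red = int(255 * (1.0 - p))
    green = int(255 * p)
    return f"#{red:02x}{green:02x}00"

print(prob_to_rgb(0.05))  # reddish: low-probability token
print(prob_to_rgb(0.95))  # greenish: high-probability token
```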
Run the following command to start the perplexity visualizer, then click the `ppl` tab:

```shell
python3 visualizer.py
```
After entering your text, you will see its perplexity and the visualization result.
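For reference, perplexity is the exponential of the negative mean per-token log-probability; a small sketch with made-up numbers (the visualizer obtains the actual log-probs from the selected backend):

```python
# ppl = exp(-mean(log p(token_i | context)))
import math

logprobs = [-0.3, -1.2, -0.05, -2.4]  # hypothetical per-token log-probabilities
ppl = math.exp(-sum(logprobs) / len(logprobs))
print(f"perplexity: {ppl:.2f}")
```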
Run the following command to start the interactive tokenizer encoding web demo:

```shell
python3 visual_tokenizer.py
```
Users can select a tokenizer to interact with and the text to encode. For special strings
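Under the hood, the demo amounts to encoding text with the chosen tokenizer and displaying the resulting tokens; a minimal sketch using a HuggingFace tokenizer (the tokenizer name is an arbitrary example, not the demo's default):

```python
# Encode text and inspect the resulting token ids and token strings.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
text = "token_visualizer makes tokens visible."
ids = tokenizer.encode(text)
print(ids)                                   # token ids
print(tokenizer.convert_ids_to_tokens(ids))  # the corresponding tokens
```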
- Support ppl visualization.
- Select transformers/openai/TGI backends via the CLI.
- Support OpenAI tokenizer visualization.
- Support TGI inference visualization.
- Support multi-turn chat visualization.
- Support dark mode.
- Front-end settings adapted from https://perplexity.vercel.app/
- Color algorithm from a post by thesephist.