
LLM_deploy

  • How to deploy an LLM on the Computing Power Platform

In Terminal 1

  • This process monitors the CUDA GPU memory-usage ratio, refreshing every second
nvidia-smi -l 1
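The memory-usage ratio can also be computed programmatically from nvidia-smi's CSV query output. A minimal sketch, assuming the standard `--query-gpu` flags; the parsing helper is our own:

```python
def gpu_memory_ratio(csv_line):
    """Parse one line of:
        nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader
    e.g. "1024 MiB, 8192 MiB" -> 0.125
    """
    used_str, total_str = csv_line.split(",")
    used = float(used_str.strip().split()[0])    # drop the "MiB" unit
    total = float(total_str.strip().split()[0])
    return used / total

print(gpu_memory_ratio("1024 MiB, 8192 MiB"))  # -> 0.125
```

This is handy for scripting an alert when GPU memory runs low while the model is loaded.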

In Terminal 2

  • This process runs the Ollama server
ollama serve
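Once `ollama serve` is running it exposes an HTTP API on port 11434 by default; models can then be queried over `/api/generate`. A sketch of building such a request with the standard library (the model name `llama3` is just an example and must be pulled first):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_request(model, prompt):
    """Build a POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )

# Usage (requires `ollama serve` running and the model pulled):
# with urllib.request.urlopen(build_request("llama3", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```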

In Terminal 3

  • This process runs Open WebUI (it listens on port 8080 by default)
open-webui serve
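Before exposing the UI with ngrok in the next step, it is worth confirming that something is actually listening on port 8080. A quick TCP probe, sketched with the standard library (the helper name is ours):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check whether Open WebUI is up locally
if port_open("127.0.0.1", 8080):
    print("Open WebUI is listening on 8080")
```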

In Terminal 4

  • This process runs ngrok for intranet penetration, i.e. tunneling the local Open WebUI port (8080) to a public URL
ngrok http 8080

Be sure to turn off academic acceleration (VPN) first
