Issues: ollama/ollama
- Ollama: torch.OutOfMemoryError: CUDA out of memory · bug (Something isn't working) · #8616 opened Jan 27, 2025 by kennethwork101
- Problems with deepseek-r1:671b, ollama keeps crashing on long answers · bug · #8614 opened Jan 27, 2025 by fabiounixpi
- [v0.5.4] Download timeouts cause download cache corruption. Any download that needs to be retried by re-running ollama ends up corrupted at 100% download (file sha256-sha256hash-partial-0 not found). · bug · #8613 opened Jan 27, 2025 by esperanza-esperanza
- Error fetching ANY model locally · bug · #8605 opened Jan 27, 2025 by devroopsaha744
- Deepseek-R1 671B - Segmentation Fault Bug · bug · #8602 opened Jan 27, 2025 by Notbici
- Error: an error was encountered while running the model: unexpected EOF (8x H100, deepseek-r1:671b) · bug · #8599 opened Jan 27, 2025 by jwatte
- Error Running Mistral Nemo Imported from .safetensors · bug · #8598 opened Jan 26, 2025 by aallgeier
- Error: llama runner process has terminated: error loading model: unable to allocate CUDA0 buffer (4x L40S, 384GB system RAM, Deepseek-R1) · bug · #8597 opened Jan 26, 2025 by orlyandico
- Ollama on WSL2 detects GPU but times out when running inference · bug · #8596 opened Jan 26, 2025 by rz1027
- High idle power consumption due to persistent CUDA initialization · bug · #8591 opened Jan 26, 2025 by SvenMeyer
- Deepseek R1 throwing weird generation DDDDDDDDDDDDDDDDDDDDDDDDDDDDDDD · bug · #8583 opened Jan 25, 2025 by amrrs
- ollama version is 0.5.7-0-ga420a45-dirty · bug · #8582 opened Jan 25, 2025 by Mario4272
- Ollama cannot start because it tries to create an existing directory · bug · #8572 opened Jan 25, 2025 by brianhuster
- running deepseek r1 671b on 64GB / 128GB ram mac gives Error: llama runner process has terminated: signal: killed · bug · #8571 opened Jan 25, 2025 by duttaoindril
- can not run this on intel Xe gpu · bug · #8570 opened Jan 24, 2025 by 1009058470
- Error when trying to download deepseek-r1:7b · bug · #8565 opened Jan 24, 2025 by makhlwf
- Error: server metal not listed in available servers map · bug · #8564 opened Jan 24, 2025 by felix021
- When LLM generates empty string response, eval_duration is missing. · bug · #8553 opened Jan 23, 2025 by wch
- deepseek-r1 qwen variants use a new pre-tokenizer, which is not implemented in the llama.cpp version used · bug · #8547 opened Jan 23, 2025 by sealad886