Issues: ollama/ollama
Issues list
- #8384 Unable to access ollama model hosted on a Raspberry Pi 5 from another device (bug, needs more info) · opened Jan 11, 2025 by Simonko-912
- #8371 ollama not working (bug, needs more info) · opened Jan 10, 2025 by Rachit199
- #8349 can't use gpu after update (bug, gpu, needs more info, nvidia) · opened Jan 8, 2025 by Heart-eartH
- #8327 I/O error on POST request for "http://localhost:11434/v1/chat/completions" (bug, needs more info) · opened Jan 7, 2025 by OnceCrazyer
- #8290 pull model manifest: open /usr/local/bin/ollama/.ollama/xxx: not a directory (bug, needs more info) · opened Jan 3, 2025 by 18279811184
- #8154 I cannot connect to port 11434 (bug, needs more info) · opened Dec 18, 2024 by 1760842797
- #8144 When models don't fit in VRAM, issue an alert/confirmation instead of running and freezing the computer for hours (bug, needs more info) · opened Dec 17, 2024 by Mugane
- #8108 Error: llama runner process has terminated: error: /opt/rocm/lib/libhipblas.so.2: undefined symbol: rocblas_sgbmv_64 (amd, bug, linux, needs more info) · opened Dec 15, 2024 by dernikolas
- #8095 Using structured output with tools always produces empty tool_calls array (bug, needs more info) · opened Dec 14, 2024 by grabbou
- #7879 goroutine 7 [running] (bug, needs more info) · opened Nov 29, 2024 by yimdonghyun
- #7867 Deepseek (various) 236b crashes on run (bug, needs more info) · opened Nov 27, 2024 by Maltz42
- #7820 Instant closure when using shell input with piped output (bug, needs more info) · opened Nov 24, 2024 by WyvernDotRed
- #7729 Radeon GPU not used (amd, bug, install, needs more info) · opened Nov 19, 2024 by alphaonex86
- #7679 The fine-tuned codegemma model exhibits abnormal performance (bug, needs more info) · opened Nov 15, 2024 by TheSongg
- #7606 VRAM usage does not go back down after model unloads (amd, bug, needs more info) · opened Nov 11, 2024 by CraftMaster163
- #7550 ollama runner process has terminated: exit status 127 (bug, linux, needs more info) · opened Nov 7, 2024 by SimpleYj
- #7524 Error: could not connect to ollama app, is it running? (bug, needs more info, nvidia, windows) · opened Nov 6, 2024 by BongozGoBOOM
- #7480 HIP_VISIBLE_DEVICES vs ROCR_VISIBLE_DEVICES (amd, bug, needs more info) · opened Nov 3, 2024 by nathan-skynet
- #7392 Fails to build on macOS with "fatal error: {'string','cstdint'} file not found" (bug, build, macos, needs more info) · opened Oct 28, 2024 by efd6
- #7344 After some idle time / phone standby, returning to the termux ollama run command restarts the download from 0 (bug, needs more info) · opened Oct 24, 2024 by fxmbsw7
- #7256 Last character being truncated by stop sequence (bug, needs more info) · opened Oct 18, 2024 by someone13574
- #7123 Long responses can corrupt the model until unloaded (bug, needs more info) · opened Oct 7, 2024 by ragibson
- #7109 Downloading models too slow (bug, needs more info, networking, windows) · opened Oct 6, 2024 by rubenmejiac
- #7061 GPU Tesla at 100% and ollama doesn't work, it hangs (bug, needs more info, nvidia) · opened Oct 1, 2024 by Domi31tls
- #7049 ollama does not detect Quadro RTX 4000 - cuda driver library failed to get device context 801 (bug, linux, needs more info, nvidia) · opened Sep 30, 2024 by mfzhsn