Issues: ollama/ollama


Issues list

[Feature] Support Intel GPUs (feature request)
#8414 opened Jan 14, 2025 by NeoZhangJianyu

Running the same model on all GPUs (feature request, gpu)
#8404 opened Jan 13, 2025 by ZanMax

Support Code Actions (feature request)
#8368 opened Jan 9, 2025 by asmith26

Support Video-LLaVA in Ollama (feature request)
#8355 opened Jan 9, 2025 by ixn3rd3mxn

Dynamic context size in OpenAI API compatibility (feature request)
#8354 opened Jan 9, 2025 by x0wllaar

[Feature] Start Ollama automatically on startup (feature request)
#8341 opened Jan 7, 2025 by remco-pc

Allow setting the K and V cache types separately (feature request)
#8332 opened Jan 7, 2025 by ag2s20150909

Add a CUDA+AVX2(VNNI) runner to the Docker image (feature request)
#8324 opened Jan 6, 2025 by x0wllaar

Improve CPU-only speed (feature request)
#8306 opened Jan 4, 2025 by ErfolgreichCharismatisch

Ollama: Gentoo Linux support (feature request)
#8293 opened Jan 3, 2025 by jaypeche

Disable CPU offload when running LLMs (feature request)
#8291 opened Jan 3, 2025 by verigle

Allow use of locally installed CUDA or ROCm (feature request)
#8286 opened Jan 2, 2025 by erkinalp

Version-aware Linux upgrade (feature request, install, linux)
#8233 opened Dec 24, 2024 by lamyergeier

Support llama.cpp's Control Vector Functionality (feature request)
#8110 opened Dec 16, 2024 by amyb-asu