allow for a configurable ollama model storage directory #897
Conversation
This is looking great. Given the main use case here is offloading model storage, we should update it to:

- set OLLAMA_MODELS in the environment that ollama is running in to change where model files are stored
- update docs

Co-Authored-By: Jeffrey Morgan <jmorganca@gmail.com>
Co-Authored-By: Jay Nakrani <dhananjaynakrani@gmail.com>
Co-Authored-By: Akhil Acharya <akhilcacharya@gmail.com>
Co-Authored-By: Sasha Devol <sasha.devol@protonmail.com>
- caller handles models dir validation
@@ -18,10 +18,6 @@ import (
	"github.com/jmorganca/ollama/version"
)

const DefaultHost = "127.0.0.1:11434"
unused
Hello, is the change merged and available in the latest release? It is not working even after having the environment variable set on Linux. Any ideas? Note: I am trying to save the models to an external drive which is formatted as NTFS. I don't think it should matter, but it gave me an error when I tried to move the blobs as it contained
@jikkuatwork are you running Ollama as a system service (i.e. how it is installed by default by the Linux install script)? If so, you'll need to add the environment variable to the system service and restart it. Here's what that looks like:
sudo nano /etc/systemd/system/ollama.service
[Service]
...
Environment="PATH=$PATH"
Environment="OLLAMA_MODELS=/path/to/models"
...
sudo systemctl daemon-reload
sudo systemctl restart ollama
I am just copying the executable to
Hmm, that should work. Try running ollama serve with the environment variable set directly.
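For anyone unsure what "directly" means here: a `VAR=value` prefix sets the variable for that one command only. A minimal sketch (the path is an example, not anything prescribed by ollama):

```shell
# The VAR=value prefix scopes the variable to a single command.
# Demonstrated with a subshell; the equivalent for ollama would be:
#   OLLAMA_MODELS=/mnt/models ollama serve
OLLAMA_MODELS=/mnt/models sh -c 'echo "$OLLAMA_MODELS"'   # prints /mnt/models
```

This avoids any interaction with systemd or shell startup files, which makes it a useful first test.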
I'm encountering the same problem as @jikkuatwork on v0.1.8. Explicitly setting the variable doesn't seem to be working.
Update: it seems like an ollama folder is correctly being created and used to store blobs at OLLAMA_MODELS. The home directory .ollama folder is only being used to store the ssh key pair.
Ah, that is it, thanks @yatinlala. The part about public keys is a typo from the original behavior in this issue. I'll edit that and leave it for #228.
Does this workaround work when running the macOS ollama app? I tried setting the environment variable in my shell and on the command line.

However, the model is still getting downloaded to ~/.ollama/models. How do I get this working on macOS?
Great addition. This should probably be added to the README.md.
Sadly, it's still giving me an error. This is how I am running it:

But when I try to pull a model, it gives me an error:
I am also having exactly the same problem as @jikkuatwork.
I'm running Arch Linux and setting
I'm using ollama version 0.1.17 on Ubuntu 20.04. It was not working using an export
Any idea why this works and not the variable approach?
It might be because systemd does not run services in a shell with your user environment, so if you export the variable in your .bashrc, the service has no knowledge of it. We could use EnvironmentFile= to specify the variable in a file and simply edit that file. It's well described here.
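A sketch of that EnvironmentFile= approach, assuming the default systemd install (the file names and paths below are examples, not anything from this thread):

```
# /etc/ollama.env (example file)
OLLAMA_MODELS=/path/to/models

# /etc/systemd/system/ollama.service.d/override.conf (example drop-in)
[Service]
EnvironmentFile=/etc/ollama.env
```

After editing /etc/ollama.env, running `sudo systemctl daemon-reload` and `sudo systemctl restart ollama` picks up the change without touching the unit file itself.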
It could be helpful to have a cache (with a configurable size, in number of models or bytes on disk) for commonly used models. It is nice to store models on external storage, but it is better to avoid downloading them every time I call the ollama server.
On my system, adding the environment line did not work; Ollama simply did not start with that line added. But this did work:

sudo nano /etc/systemd/system/ollama.service.d

Add the line:

And now it works.
@sagaholdennoren What's your /etc/systemd/system/ollama.service right now? How do you set it?
@nps798: how to set it on Linux is in the post above. Please note the additional .d in /etc/systemd/system/ollama.service.d; the actual ollama.service file is unchanged.

Open a terminal:

Now add the line (the file is most probably empty):

OLLAMA_MODELS=/path/to/models ollama serve

I'm unaware how to do this on other systems like macOS or Windows, but I'm sure you can ask ChatGPT 3.5.
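To make the drop-in approach concrete: ollama.service.d is a directory, and systemd reads any .conf file inside it (running `sudo systemctl edit ollama` creates one for you). A sketch, assuming the default service name; override.conf is an example file name:

```
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_MODELS=/path/to/models"
```

Then `sudo systemctl daemon-reload` and `sudo systemctl restart ollama`. In my understanding systemd expects the [Service] section header even in a drop-in; a bare OLLAMA_MODELS=... line would be reported as an assignment outside of a section.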
Is it necessary to install the executable via the script? Won't copying the binary work? I am using the latest version (0.1.28) and I have set

I have a hunch that this is somehow related to the medium; the HDD is formatted as NTFS. I suspect this because it had failed when I copied the

It would be super helpful if anyone can help. I can't test models because my home folder doesn't have the space for it.
I am trying to set OLLAMA_MODELS to a directory on an external disk where I have space. When I point OLLAMA_MODELS to a directory on the external disk, ollama fails to start.

sudo mkdir -p /media/kenneth/T9/myollama/models

I put this line under [Service] in /etc/systemd/system/ollama.service and ran the following commands:

ollama failed to start with the following error:

But using a directory under /tmp works.

sudo mkdir -p /tmp/myollama/models

How can I get it to work on the external disk?
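One possible cause (an assumption, not confirmed in this thread) is that the service starts before the external disk is mounted, or that the ollama service user cannot write to the mount. For the mount-ordering case, a systemd drop-in sketch reusing the path from the comment above (override.conf is an example file name):

```
# /etc/systemd/system/ollama.service.d/override.conf (example)
[Unit]
# Do not start the service until the external disk is mounted:
RequiresMountsFor=/media/kenneth/T9

[Service]
Environment="OLLAMA_MODELS=/media/kenneth/T9/myollama/models"
```

Followed by `sudo systemctl daemon-reload` and `sudo systemctl restart ollama`. If it still fails, checking that the service user can write to the directory is the other thing to rule out.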
Same issue here: trying to use a folder on a mounted USB disk fails; ollama tries to mkdir my mountpoint.
Ollama does not pick up the environment variable when run via systemctl as the default service. It tries to create the folder. However, if I use Ollama with
Ran into the same problem; this just seems to be a Linux permission error, though. When running from the CLI, you run ollama under your own user account, so you probably have no permission issues because the folders are owned by your own user account. In my case, I want the models in

Added the model path to the service file:

Environment="OLLAMA_MODELS=/home/ollama/Ollama"

And restarted the service. As the
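The permission fix described here would look something like the following, assuming the service runs as the ollama user created by the install script and using the path from the comment above (both are assumptions about this particular setup):

```
sudo chown -R ollama:ollama /home/ollama/Ollama
sudo systemctl restart ollama
```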
I just created a symbolic link
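The symlink approach keeps the default lookup path but points it at external storage. A sketch, demonstrated in a scratch directory so it is safe to run as-is; the commented lines show what real usage would look like (all paths are examples):

```shell
# Real usage (example paths):
#   mv ~/.ollama/models /mnt/external/ollama_models
#   ln -s /mnt/external/ollama_models ~/.ollama/models
# Demonstrated below in a temporary directory:
base=$(mktemp -d)
mkdir -p "$base/external/ollama_models"    # stand-in for the external drive
ln -s "$base/external/ollama_models" "$base/models"
readlink "$base/models"                    # prints the external path
```

The appeal of this approach is that no environment variable or service configuration is needed at all; the trade-off is that the link silently dangles if the external drive is not mounted.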
My ollama broke today after happily using it for 1 month, for this reason.

$ ollama create -f modelfile my/model
transferring model data
Error: write /tmp/ollama-tf3716824791: no space left on device

And I tried everything I can find to fix it... nope!
Followup to my problem:
Set OLLAMA_MODELS in the environment that ollama is running in to change where models are stored:

$ OLLAMA_MODELS=/Users/bruce/ollama_models ollama serve # store models in /Users/bruce/ollama_models

Resolves #228, #153

I'll hold off on merging this until #847 is in to avoid causing that PR pain.