Closed
Description
Problem
Integrating Ollama with Jan through a single OpenAI-compatible endpoint is cumbersome. It is also a hassle to 'download' the model manually before it can be used.
Success Criteria
- Make it easier to add Ollama endpoints.
- Automatically discover available Ollama models and their settings.
- Allow multiple Ollama instances (e.g., a local one for small models, a server/cloud one for larger models).
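For the discovery criterion, Ollama already exposes a `GET /api/tags` endpoint that lists locally available models, so Jan could query each configured instance instead of asking the user to type model names. A minimal sketch (the instance URLs and the registration flow around it are assumptions, not Jan's actual implementation):

```python
import json
from urllib.request import urlopen

# Hypothetical list of configured Ollama instances: a local one for
# small models and a remote server for larger ones (success criterion 3).
INSTANCES = ["http://localhost:11434", "http://gpu-server:11434"]

def parse_models(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def discover_models(base_url: str) -> list[str]:
    """Query one Ollama instance for its locally available models."""
    with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
        return parse_models(json.load(resp))
```

Jan could then register each discovered model under the instance it came from, so the user never has to 'download' or name models by hand.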
Additional context
Related Reddit comment to be updated: https://www.reddit.com/r/LocalLLaMA/comments/1d8n9wr/comment/l77ifd1/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
Metadata
Status: Completed