
Support Ollama AI Provider #215

Closed
miurla opened this issue Jun 9, 2024 · 6 comments

Comments


miurla commented Jun 9, 2024

Update (9/30): #215 (comment)

Overview

Currently, Ollama support is an unstable, experimental feature, implemented via the Ollama AI Provider. The provider explicitly documents Object generation and Tool usage as unstable, and Tool streaming is not supported at all. Because Morphic depends on these capabilities, it is very unstable when used with Ollama. Please use it with these limitations in mind.

Environment Variables

  • OLLAMA_MODEL=[YOUR_OLLAMA_MODEL]
    • The main model to use. Recommended: mistral or openhermes
    • Object generation, Tool usage
  • OLLAMA_SUB_MODEL=[YOUR_OLLAMA_SUB_MODEL]
    • The sub model to use. Recommended: phi3 or llama3
    • Text generation
  • OLLAMA_BASE_URL=[YOUR_OLLAMA_URL]
    • The base URL to use. e.g. http://localhost:11434
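Putting the variables above together, a local `.env` for Morphic's Ollama mode might look like this (the model names are just the examples recommended above; use whatever models you have pulled):

```shell
# Main model: used for object generation and tool usage
OLLAMA_MODEL=mistral
# Sub model: used for plain text generation
OLLAMA_SUB_MODEL=phi3
# Base URL of the Ollama server (11434 is Ollama's default port)
OLLAMA_BASE_URL=http://localhost:11434
```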

PR


arsaboo commented Jul 26, 2024

@miurla Ollama now supports tool call - https://ollama.com/blog/tool-support


miurla commented Jul 27, 2024

Ollama AI Provider needs to support tool calls: sgomez/ollama-ai-provider#11


arsaboo commented Jul 27, 2024

Looks like we may get that soon... fingers crossed 🤞

peperunas (Contributor) commented:

It seems that v0.11 has been released and it has tool support. I haven't tested it yet but I guess I'll try it soon.

https://github.com/sgomez/ollama-ai-provider/releases/tag/ollama-ai-provider%400.11.0


miurla commented Aug 5, 2024

Ollama's tools do not support streaming yet. As the ollama-ai-provider README notes: "Ollama tooling does not support it in streams, but this provider can detect tool responses."
https://github.com/sgomez/ollama-ai-provider?tab=readme-ov-file#tool-streaming

I tested it with the llama3.1 model, but the Researcher did not work. We need to either not support streaming or manually detect tool-calls and respond.

Test code: #294
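The "manually detect tool-calls" fallback mentioned above can be sketched as a small helper. This is a hypothetical illustration, not Morphic's actual code: it assumes the model, when not streaming, emits a tool call as a JSON object of the shape `{"name": ..., "arguments": {...}}` in its text output, which we try to parse before treating the output as a plain answer.

```typescript
// Hypothetical sketch of manual tool-call detection in a completed
// (non-streamed) model response. The JSON shape is an assumption;
// adjust it to whatever your model actually emits.
interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

function extractToolCall(output: string): ToolCall | null {
  try {
    const parsed = JSON.parse(output.trim());
    if (
      parsed !== null &&
      typeof parsed === "object" &&
      typeof parsed.name === "string" &&
      typeof parsed.arguments === "object" &&
      parsed.arguments !== null
    ) {
      return parsed as ToolCall;
    }
  } catch {
    // Not JSON at all: treat the output as a normal text answer.
  }
  return null;
}
```

A caller would run `extractToolCall` on the completed response: a non-null result gets dispatched to the matching tool, otherwise the text is shown to the user as-is.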


@miurla miurla changed the title [Unstable] Support Ollama AI Provider Support Ollama AI Provider Sep 30, 2024

miurla commented Sep 30, 2024

Update

Object generation and Tool usage with Ollama are now working stably, so we have implemented support for them. Ollama still does not support tool streaming, so there is a wait until the full answer has been generated.
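In practice, no tool streaming means making a single non-streaming request and waiting for the complete response. As an illustration only (not Morphic's actual code), a helper that builds such a request body for Ollama's /api/chat endpoint could look like this; the endpoint and the `stream`/`tools` fields follow Ollama's documented REST API, while the function name is made up:

```typescript
// Hypothetical helper: build a non-streaming request body for Ollama's
// /api/chat endpoint. With stream: false, the server returns the whole
// answer (including any tool calls) in one response — hence the wait
// before anything is shown.
interface ChatMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  tools: object[];
  stream: false;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  tools: object[]
): ChatRequest {
  return { model, messages, tools, stream: false };
}

// Usage sketch: POST the body as JSON to
// `${process.env.OLLAMA_BASE_URL}/api/chat` with fetch and read the
// complete reply (message content and tool_calls) from the response.
```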

Model

  • qwen2.5

Currently, the only supported model is qwen2.5.

PR

#352

@miurla miurla closed this as completed Oct 4, 2024