[BUG] Trying to use streaming with tools with Ollama via OpenAI is broken #2289
Comments
I followed the same steps to reproduce it, but couldn't reproduce it. What am I missing? Logs:
@edeandrea thank you for reporting! I was able to reproduce the issue (in a different way, but I believe it is the same underlying problem) and fix it in d2fe663.
I made a mistake; the fix is also in fef358a.
This was reported as quarkiverse/quarkus-langchain4j#1164 but was determined to be an issue in the core LangChain4j. I'll copy/paste the issue content from there:
When trying to use tools with Ollama in streaming mode via the OpenAI API, it doesn't work correctly. It works fine when using the OpenAI extension to talk to ChatGPT, or the Ollama extension to talk to Ollama; the problem only appears when using the OpenAI endpoint to talk to Ollama.
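For context, a request that combines streaming with tools against an OpenAI-compatible endpoint looks roughly like this. This is a minimal sketch of the wire format only; the local Ollama URL and the `update_claim_status` tool definition are illustrative assumptions, not taken from the project's code.

```python
import json

# Assumed local endpoint: Ollama exposes an OpenAI-compatible API under /v1.
BASE_URL = "http://localhost:11434/v1/chat/completions"

# A chat-completions payload combining streaming with a tool definition,
# following the OpenAI API shape that OpenAI-compatible servers accept.
payload = {
    "model": "llama3.2",
    "stream": True,  # response arrives as incremental chunks
    "messages": [
        {"role": "user", "content": "Update claim status to denied"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "update_claim_status",  # hypothetical tool for illustration
                "description": "Update the status of an insurance claim",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "claim_number": {"type": "string"},
                        "status": {"type": "string"},
                    },
                    "required": ["claim_number", "status"],
                },
            },
        }
    ],
}

body = json.dumps(payload)
```

The combination of `"stream": True` and a non-empty `tools` array is exactly the code path this issue exercises: each works on its own, but together the streamed tool-call response is mishandled.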
To reproduce (using the rag-ollama-openai-api branch from https://github.com/edeandrea/parasol-insurance):
1. cd app
2. ollama pull llama3.2 && ollama pull snowflake-arctic-embed
3. ./mvnw clean quarkus:dev
4. Press the w key to open the app
5. Open claim CLM195501
6. Ask "Update claim status to denied"