[Bug] OpenWebUI fails to communicate with OpenAI-like local server when in chat #2447
Description
Hey there! I already love your tool; I just ran into a small problem that I'll try to work on fixing (though I don't know if I'll be able to). P.S. I can't find the issue template anywhere, so I hope I've provided enough information.
Simply put, whenever the WebUI tries to get a completion from the local OpenAI API server (hosted through LM Studio, FYI), it sends over an empty assistant message. I get the following error in the UI:
And on my local server, I get this:
Expected behavior: only the user message is sent to the API, and when the completion is generated, it is added to the history.
Actual behavior: the API receives an empty assistant message and cannot generate a completion from it.
I confirm that I am on the latest version of OpenWebUI, and that LM Studio follows the exact same API format as OpenAI.
After digging a bit into your code, I understand why you create this empty assistant message and add it to the history before requesting the completion: I think it is so the message ID is sent along and the message can be updated in place once the completion is done.
My initial thinking is that we could prevent it from being sent to the API with a simple check (while keeping it in the messages array so it still gets updated). I'll try to work on this and keep you updated. Thank you for your helpful tool!!
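As a rough illustration of what I mean, here is a minimal sketch (not the actual code in `+page.svelte`; the message shape, function name, and variable names are all hypothetical) of filtering the empty placeholder assistant message out of the request payload while keeping it in the local history:

```ts
// Hypothetical message shape; the real message store in +page.svelte differs.
interface ChatMessage {
	role: 'system' | 'user' | 'assistant';
	content: string;
}

// Build the request body for the OpenAI-compatible endpoint, dropping the
// placeholder assistant message (empty content) that only exists locally so
// it can be updated in place once the completion arrives.
function buildCompletionPayload(messages: ChatMessage[], model: string) {
	return {
		model,
		messages: messages.filter(
			(m) => !(m.role === 'assistant' && m.content.trim() === '')
		)
	};
}

// Example: the placeholder stays in the local history, but is not sent.
const history: ChatMessage[] = [
	{ role: 'user', content: 'Hello!' },
	{ role: 'assistant', content: '' } // placeholder awaiting the completion
];
console.log(buildCompletionPayload(history, 'local-model').messages.length); // 1
```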
References:
open-webui/src/routes/(app)/c/[id]/+page.svelte
Lines 272 to 286 in 3355577
Changes could maybe be made here to solve this issue:
open-webui/src/routes/(app)/c/[id]/+page.svelte
Lines 609 to 621 in 3355577