Currently, the llm object in ai/api/llms.py is created with LangChain's ChatOpenAI using a fixed OpenAI endpoint.
To make AI usage in Briefier OSS more flexible and let users plug in other OpenAI-compatible LLM APIs (such as hosted or self-hosted models that expose an OpenAI-compatible interface), we could allow specifying a custom base_url in the ChatOpenAI options.
Concretely: accept a base_url parameter, let users point it at their own OpenAI-compatible endpoint, and map the variable to .env. This would look something like:
llm = ChatOpenAI(
    temperature=0,
    verbose=False,
    openai_api_key=openai_api_key,
    model_name=model_id if model_id else config("OPENAI_DEFAULT_MODEL_NAME"),
    # ChatOpenAI accepts base_url directly; None falls back to the OpenAI default endpoint
    base_url=custom_base_url if custom_base_url else None,
)
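As a rough sketch, the conditional keyword handling could live in a small helper so the call site stays a plain ChatOpenAI(**kwargs). The OPENAI_BASE_URL variable name and the fallback model string are assumptions for illustration, and os.getenv stands in for the project's config() call:

```python
import os

def build_chat_openai_kwargs(openai_api_key, model_id=None):
    """Build keyword arguments for ChatOpenAI, adding base_url only when set.

    OPENAI_BASE_URL and OPENAI_DEFAULT_MODEL_NAME are hypothetical .env
    variable names; os.getenv stands in for the project's config() helper.
    """
    kwargs = {
        "temperature": 0,
        "verbose": False,
        "openai_api_key": openai_api_key,
        # Fall back to the configured default model; "gpt-4o-mini" is a placeholder
        "model_name": model_id or os.getenv("OPENAI_DEFAULT_MODEL_NAME", "gpt-4o-mini"),
    }
    custom_base_url = os.getenv("OPENAI_BASE_URL")
    if custom_base_url:
        # Point LangChain's OpenAI client at a self-hosted compatible server
        kwargs["base_url"] = custom_base_url
    return kwargs
```

With this shape, an unset OPENAI_BASE_URL simply omits the key, so existing behavior against the default OpenAI endpoint is unchanged.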
d8rt8v changed the title from "Feature Request: Allow Using OpenAI-Compatible LLM API" to "[feature] Allow Using OpenAI-Compatible LLM API" on Nov 5, 2024.