defaultrole
Currently, the LLM pipeline assumes that string prompts already have all chat tokens applied.
This change will add an option to set the defaultrole on inference.
Options for defaultrole:
prompt (default): applies no chat formatting to the input and passes the raw string to the model
user: creates chat messages with the user role
See this discussion for more: 8bd4d78#r150476159
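A minimal sketch of how such an option could behave. The function name `apply_default_role` and its exact wrapping logic are assumptions for illustration, not txtai's actual implementation: with `prompt`, the raw string passes through unchanged; with `user`, the string is wrapped as a chat message with the user role.

```python
def apply_default_role(text, defaultrole="prompt"):
    """Format a string prompt according to defaultrole (illustrative sketch).

    - "prompt": no chat formatting, raw string is passed to the model
    - "user": wrap the string as a chat message with the user role
    """
    if defaultrole == "prompt":
        return text
    if defaultrole == "user":
        return [{"role": "user", "content": text}]
    raise ValueError(f"Unknown defaultrole: {defaultrole}")

# "prompt" leaves the input untouched
print(apply_default_role("Tell me a joke"))

# "user" produces a chat-style message list
print(apply_default_role("Tell me a joke", defaultrole="user"))
```

With `defaultrole="user"`, the pipeline could then hand the message list to the model's chat template instead of assuming the caller already applied all chat tokens.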
Add support for chat messages in LLM pipeline, closes #718 (commits 8bd4d78, 952a757)
davidmezzetti