This repository has been archived by the owner on Jun 24, 2024. It is now read-only.

--prelude-prompt-file doesn't work #350

Closed
@fruit765

Description

Apparently the --prelude-prompt-file option does not work. The text seems to be given to the model without any formatting, regardless of what is written in the file.
Although I may be doing something wrong.

Command

> .\llm.exe chat --model-path E:\models\Manticore-13B-Chat-Pyg-Guanaco-GGML-q4_K_S.bin --top-k 20 --top-p 0.9 --repeat-penalty 1.1 --repeat-last-n 100 --temperature 0.7 --message-prompt-file ./9.txt --prelude-prompt-file ./8.txt --use-gpu --model-architecture llama -t 8

9.txt

### Input:

{{PROMPT}}

### Response:

I will not give the initial prompt in full since it is long; in fact, I have tried different ones, including DAN, so I would notice if anything had changed.

8.txt

### Instruction: 
[Lots of text.]

{{PROMPT}}
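For reference, the behaviour I would expect from these template files is plain placeholder substitution: the CLI reads the prelude/message template and replaces `{{PROMPT}}` with the user's input before passing the result to the model. A minimal sketch of that expectation (hypothetical illustration, not the actual `llm` implementation):

```python
def render_prompt(template: str, user_message: str) -> str:
    # Substitute the user's input into the {{PROMPT}} placeholder.
    return template.replace("{{PROMPT}}", user_message)

# Template equivalent to 9.txt above.
message_template = "### Input:\n\n{{PROMPT}}\n\n### Response:\n"

print(render_prompt(message_template, "Hello"))
# ### Input:
#
# Hello
#
# ### Response:
```

In other words, for every user message the model should see the formatted template, not the raw input.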

Response

>> Hello

Welcome to the Assistant, how can I help you today?
>> write the history of our messages, write those messages that you received earlier

Certainly! Here is a recap of the messages we have exchanged so far:

1. You: "Hello"
2. Me: "Welcome to the Assistant, how can I help you today?"
3. You: "write the history of our messages, write those messages that you received earlier"
4. Me: "Sure, here is a recap of our conversation so far:

* 1. You: 'Hello'
* 2. Me: 'Welcome to the Assistant, how can I help you today?'
* 3. You: 'write the history of our messages, write those messages that you received earlier'

I hope this information is helpful."
>>

Metadata


    Labels

    app:cli (App: the `llm` CLI), issue:bug (Something isn't working)
