
Streaming doesn't work with CustomStreamingResponseHandler: OkHttp3 error occurred #670

Closed
@kevintanhongann

Description

Describe the bug
An error occurred during streaming while waiting for the response.

Log and Stack trace
2024-02-24T01:17:40.732+08:00 ERROR 27155 --- [alhost:1337/...] c.m.m.l.CustomStreamingResponseHandler : error processing stream to save question answer pairs

java.lang.IllegalArgumentException: byteCount < 0: -1
at okio.RealBufferedSource.request(RealBufferedSource.kt:204)
at okio.RealBufferedSource.require(RealBufferedSource.kt:202)
at okio.RealBufferedSource.readFully(RealBufferedSource.kt:276)
at okhttp3.internal.sse.ServerSentEventReader$Companion.readData(ServerSentEventReader.kt:148)
at okhttp3.internal.sse.ServerSentEventReader$Companion.access$readData(ServerSentEventReader.kt:112)
at okhttp3.internal.sse.ServerSentEventReader.processNextEvent(ServerSentEventReader.kt:57)
at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:75)
at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
at java.base/java.lang.Thread.run(Thread.java:1583)

To Reproduce
import static dev.langchain4j.data.message.SystemMessage.systemMessage;
import static dev.langchain4j.data.message.UserMessage.userMessage;

import java.time.Duration;
import java.util.Arrays;
import java.util.List;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.model.localai.LocalAiStreamingChatModel;

LocalAiStreamingChatModel model = LocalAiStreamingChatModel
        .builder()
        .baseUrl("http://localhost:1337/v1")
        .modelName("openhermes-2.5-mistral-7b")
        .maxTokens(2048)
        .timeout(Duration.ofMinutes(2))
        .temperature(0.7)
        .topP(0.95)
        .build();

List<ChatMessage> messages = Arrays.asList(
        systemMessage(
                """

                """
        ),
        userMessage()   // prompt text omitted from the report
);

model.generate(
        messages,
        new CustomStreamingResponseHandler());

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;

public class CustomStreamingResponseHandler implements StreamingResponseHandler<AiMessage> {

    private final Logger log = LoggerFactory.getLogger(CustomStreamingResponseHandler.class);

    @Override
    public void onNext(String token) {
        // Partial tokens are intentionally ignored; only the complete response is used.
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        log.debug("response from AI : {}", response.content().text());
    }

    @Override
    public void onError(Throwable error) {
        log.error("error processing stream to save question answer pairs", error);
    }
}
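
A variant of the handler that logs every partial token and surfaces the outcome through a CompletableFuture can show whether any tokens arrive at all before onError fires. This is only a debugging sketch against the same StreamingResponseHandler API; the class name and the await helper are illustrative and not part of the original report.

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.output.Response;

public class DebugStreamingResponseHandler implements StreamingResponseHandler<AiMessage> {

    private final Logger log = LoggerFactory.getLogger(DebugStreamingResponseHandler.class);
    private final CompletableFuture<Response<AiMessage>> future = new CompletableFuture<>();

    @Override
    public void onNext(String token) {
        // Log each partial token to see whether the stream delivers anything
        // before OkHttp's SSE reader fails.
        log.debug("token: {}", token);
    }

    @Override
    public void onComplete(Response<AiMessage> response) {
        future.complete(response);
    }

    @Override
    public void onError(Throwable error) {
        future.completeExceptionally(error);
    }

    // Blocks the caller until the stream either completes or errors out.
    public Response<AiMessage> await() throws Exception {
        return future.get(2, TimeUnit.MINUTES);
    }
}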

Expected behavior
The stream should complete normally and onComplete should receive the full response, but onError is invoked instead.

Please complete the following information:

  • LangChain4j version: e.g. 0.27.1
  • Java version: e.g. 11
  • Spring Boot version (if applicable): e.g. 3.2

Additional context
I believe the problem occurs at the OkHttp level during streaming.
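
To check that, the SSE stream can be exercised with OkHttp directly, bypassing LangChain4j. The sketch below is assumption-heavy: the /chat/completions path and the request payload follow the OpenAI-compatible API that LocalAI exposes, and the class name is made up. The okhttp-sse artifact is already on the classpath, given the stack trace above. If this probe fails with the same IllegalArgumentException ("byteCount < 0: -1"), the server is terminating the stream in a way OkHttp's ServerSentEventReader cannot parse, independent of the handler.

import java.time.Duration;
import java.util.concurrent.CountDownLatch;

import okhttp3.MediaType;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;
import okhttp3.Response;
import okhttp3.sse.EventSource;
import okhttp3.sse.EventSourceListener;
import okhttp3.sse.EventSources;

public class SseProbe {

    public static void main(String[] args) throws InterruptedException {
        OkHttpClient client = new OkHttpClient.Builder()
                .readTimeout(Duration.ofMinutes(2))
                .build();

        // Minimal OpenAI-style streaming request against the LocalAI endpoint
        // from the report; the payload shape is an assumption.
        String json = """
                {"model": "openhermes-2.5-mistral-7b",
                 "stream": true,
                 "messages": [{"role": "user", "content": "Hello"}]}
                """;
        Request request = new Request.Builder()
                .url("http://localhost:1337/v1/chat/completions")
                .post(RequestBody.create(json, MediaType.get("application/json")))
                .build();

        CountDownLatch done = new CountDownLatch(1);
        EventSources.createFactory(client).newEventSource(request, new EventSourceListener() {
            @Override
            public void onEvent(EventSource eventSource, String id, String type, String data) {
                // Each SSE "data:" payload; a well-formed OpenAI-style stream ends with "[DONE]".
                System.out.println("event: " + data);
            }

            @Override
            public void onClosed(EventSource eventSource) {
                System.out.println("closed");
                done.countDown();
            }

            @Override
            public void onFailure(EventSource eventSource, Throwable t, Response response) {
                // If this reproduces "byteCount < 0: -1", the raw SSE stream itself
                // is what OkHttp is choking on, not the LangChain4j handler.
                System.out.println("failure: " + t);
                done.countDown();
            }
        });

        done.await();
    }
}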
