
Empty response via API #8395

Closed
@gl2007

Description

What is the issue?

I host Ollama on 0.0.0.0 on a server on my LAN, and "curl :11434" returns "Ollama is running". Also, when I run `ollama run` from cmd on that machine, I get proper responses.

However, when I send an API request via Postman, I get this empty response regardless of the model, which seems to indicate the model is not loaded properly. The same thing happens on the server machine itself via Postman using localhost:

{
  "model": "Mistral-Nemo-12B-Instruct-2407-Q8_0:latest",
  "created_at": "2025-01-12T07:39:16.7356243Z",
  "response": "",
  "done": true,
  "done_reason": "load"
}
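For reference, here is a minimal sketch of a request to Ollama's /api/generate endpoint alongside the response shown above. The prompt string and the exact request body are assumptions, since the Postman request itself is not included in the report; only the response JSON is taken verbatim from it.

```python
import json

# Hypothetical request body for /api/generate; the actual Postman
# body is not shown in the report, so the prompt here is illustrative.
payload = {
    "model": "Mistral-Nemo-12B-Instruct-2407-Q8_0:latest",
    "prompt": "Why is the sky blue?",  # assumed prompt
    "stream": False,
}

# The empty response observed above, verbatim:
response = json.loads("""{
  "model": "Mistral-Nemo-12B-Instruct-2407-Q8_0:latest",
  "created_at": "2025-01-12T07:39:16.7356243Z",
  "response": "",
  "done": true,
  "done_reason": "load"
}""")

# Note that "done_reason" is "load" rather than "stop",
# and the "response" field is empty.
print(response["done_reason"], repr(response["response"]))
```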

But the model does seem to be loaded correctly, as I can see it in "ollama ps".

What am I doing wrong?

OS

Windows

GPU

None

CPU

Intel

Ollama version

0.5.4
