[BUG] Chat with ollama running version 2.6 fails with: llama_get_logits_ith: invalid logits id 10, reason: no logits #463

Closed
@zhb-code

Description

Is there an existing issue / discussion for this?

  • I have searched the existing issues / discussions

Is there an existing answer for this in the FAQ?

  • I have searched the FAQ

Current Behavior

I deployed version 2.6 locally on a Mac with ollama and started the ollama service following the documentation. When chatting, it fails with the error: llama_get_logits_ith: invalid logits id 10, reason: no logits. Is there a fix for this issue?

Expected Behavior

No response

Steps To Reproduce

No response

Environment

- OS: macOS
- Python: 3.11
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):

Anything else?

No response
