
gemini-1.5 converted to an OpenAI-compatible API reports an interface error #26

Closed
codegitnoob opened this issue Jul 9, 2024 · 1 comment


@codegitnoob

ERROR - Error code: 400 - {'error': {'message': 'Request contains an invalid argument. (request id: 2024070)', 'type': '', 'param': '', 'code': 400}}
Traceback (most recent call last):
  File "/gptpdf/.venv/lib/python3.12/site-packages/GeneralAgent/skills/llm_inference.py", line 152, in _llm_inference_with_stream
    response = client.chat.completions.create(messages=messages, model=model, stream=True, temperature=temperature)

I have exposed gemini-1.5 through an OpenAI-compatible API endpoint, and it reports the invalid-argument error above.
Using GeneralAgent on its own returns normal results.

Could it be that gemini-1.5 still behaves a bit differently on the image-recognition side?
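
For reference, a minimal sketch of the kind of multimodal request that ends up going through the OpenAI-compatible endpoint when gptpdf parses a page image. The base_url, API key, model name, prompt, and base64 payload below are placeholders, not values from gptpdf itself; if the Gemini proxy does not accept vision-style image_url content (or streaming) for gemini-1.5, a 400 like the one above is the likely result.

from openai import OpenAI

# Hypothetical OpenAI-compatible proxy in front of gemini-1.5 (placeholder values).
client = OpenAI(api_key="YOUR_KEY", base_url="https://your-gemini-proxy/v1")

# Page images are assumed to be sent as base64 data URLs in an OpenAI vision-style message.
messages = [{
    "role": "user",
    "content": [
        {"type": "text", "text": "Transcribe this PDF page to markdown."},
        {"type": "image_url", "image_url": {"url": "data:image/png;base64,<BASE64_PNG>"}},
    ],
}]

# Same call shape that fails in llm_inference.py; a proxy that rejects any part
# of this payload for gemini-1.5 would return the invalid-argument error above.
response = client.chat.completions.create(
    model="gemini-1.5-pro",   # placeholder model name exposed by the proxy
    messages=messages,
    stream=True,
    temperature=0.0,
)
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="")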

@CosmosShadow
Owner

Possibly, yes.
If you need Gemini support, I suggest opening a separate issue: support Gemini.
