```
ERROR - Error code: 400 - {'error': {'message': 'Request contains an invalid argument. (request id: 2024070)', 'type': '', 'param': '', 'code': 400}}
Traceback (most recent call last):
  File "/gptpdf/.venv/lib/python3.12/site-packages/GeneralAgent/skills/llm_inference.py", line 152, in _llm_inference_with_stream
    response = client.chat.completions.create(messages=messages, model=model, stream=True, temperature=temperature)
```
I exposed gemini-1.5 through an OpenAI-compatible API endpoint, and it reports the invalid-argument error above. Calling GeneralAgent on its own returns normal results.
Could it be that gemini-1.5 behaves slightly differently on the image-recognition interface?
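For context, a minimal sketch of the OpenAI-style multimodal message that a vision call sends (the helper name and the sample prompt are hypothetical, not from gptpdf's source). Some OpenAI-to-Gemini adapters reject parts of this payload, such as base64 data URLs, which can surface as a 400 "invalid argument" like the one above:

```python
import base64

# Hypothetical helper: builds the OpenAI chat-completions multimodal
# message format used for image inputs. An OpenAI-to-Gemini proxy that
# does not accept data URLs in image_url may return a 400 for this.
def build_image_message(image_bytes: bytes, prompt: str) -> dict:
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {
                "type": "image_url",
                "image_url": {"url": f"data:image/png;base64,{b64}"},
            },
        ],
    }

msg = build_image_message(b"\x89PNG...", "Extract the text from this page.")
```

Inspecting the exact payload the proxy receives (versus what a text-only GeneralAgent call sends) should narrow down which argument Gemini's adapter considers invalid.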
Possibly. If you need Gemini support, I suggest opening an issue for it: support gemini