[Bug]: Got error but the same prompt and prefix worked in Openai playground #1642

Closed Answered by danny-avila
Niqnil asked this question in Troubleshooting

The Google error is specific to Google's servers (not sure why); it has nothing to do with the app.

Can you add OpenRouter's codellama/codellama-70b-instruct too? When I try it, it currently shows the error: "Something went wrong. Here's the specific error message we encountered: Prompt token count of 30799 exceeds max token count of 4095."

Sure, I'll have a solution for all OpenRouter models in general soon.
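The error quoted above comes from a context-length check: the app compares the prompt's token count against a configured max token count (4095 here) before sending the request. A minimal sketch of that kind of guard, assuming a hypothetical `estimate_tokens` helper and a rough ~4-characters-per-token heuristic (a real client would use the model's actual tokenizer and the model's real context window):

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # A real client would use the model's actual tokenizer instead.
    return max(1, len(text) // 4)

def check_prompt_fits(prompt: str, max_context: int = 4095) -> None:
    # Mirrors the failing check: reject prompts larger than the
    # configured max token count instead of sending them upstream.
    count = estimate_tokens(prompt)
    if count > max_context:
        raise ValueError(
            f"Prompt token count of {count} exceeds "
            f"max token count of {max_context}."
        )
```

With a 30k-token prompt and `max_context` left at the 4095 default, this raises the same kind of error as in the report; raising the configured context window for the model (or truncating the prompt) avoids it.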


Answer selected by Niqnil
Labels: bug (Something isn't working)
This discussion was converted from issue #1641 on January 26, 2024 03:15.