
Error getting OpenAILike models: TypeError: Cannot read properties of undefined (reading 'map') #652

Closed
@dreher-in

Description

Describe the bug

Just pulled the latest version (mine was still oTTodev, now the latest bolt.diy) and rebuilt my Docker stack. Now the OpenAILike provider seems broken and has stopped working. It worked before the upgrade, so I assume the problem is not on my side.

docker compose --profile development --env-file .env.local up --build
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
 => => naming to docker.io/library/bolt-ai:development                                                                                                               0.0s
 => [app-dev] resolving provenance for metadata file                                                                                                                 0.0s
WARN[0003] Found orphan containers ([boltnew-any-llm-bolt-ai-dev-1]) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up. 
[+] Running 1/1
 ✔ Container boltnew-any-llm-app-dev-1  Recreated                                                                                                                    1.0s 
Attaching to app-dev-1
app-dev-1  | 
app-dev-1  | > bolt@ dev /app
app-dev-1  | > remix vite:dev "--host" "0.0.0.0"
app-dev-1  | 
app-dev-1  | [warn] Route discovery/manifest behavior is changing in React Router v7
app-dev-1  | ┃ You can use the `v3_lazyRouteDiscovery` future flag to opt-in early.
app-dev-1  | ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_lazyRouteDiscovery
app-dev-1  | ┗
app-dev-1  | [warn] Data fetching is changing to a single fetch in React Router v7
app-dev-1  | ┃ You can use the `v3_singleFetch` future flag to opt-in early.
app-dev-1  | ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_singleFetch
app-dev-1  | ┗
app-dev-1  |   ➜  Local:   http://localhost:5173/
app-dev-1  |   ➜  Network: http://192.168.0.2:5173/
app-dev-1  | Error getting OpenAILike models: TypeError: Cannot read properties of undefined (reading 'map')
app-dev-1  |     at Object.getOpenAILikeModels [as getDynamicModels] (/app/app/utils/constants.ts:400:21)
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at async Promise.all (index 1)
app-dev-1  |     at Module.initializeModelList (/app/app/utils/constants.ts:457:9)
app-dev-1  |     at handleRequest (/app/app/entry.server.tsx:30:3)
app-dev-1  |     at handleDocumentRequest (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:340:12)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:160:18)
app-dev-1  |     at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run+react@2.15.0_react-dom@18.3.1_react@18.3.1__react@18.3.1_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25
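The stack trace points at a `.map` call inside `getOpenAILikeModels` in `app/utils/constants.ts`. A plausible reading (hypothetical sketch, not the project's actual code): when the upstream `/models` request fails (here litellm rejects the auth, see its log below), the parsed JSON body has no `data` array, so `res.data.map(...)` throws exactly this TypeError. A guard makes the failure non-fatal:

```typescript
// Hypothetical sketch of the failing pattern. The interface names and
// toModelList() are illustrative, not the real constants.ts code.
interface ModelInfo {
  name: string;
  label: string;
}

interface OpenAILikeResponse {
  // An auth/error response from the proxy has no `data` field.
  data?: Array<{ id: string }>;
}

function toModelList(res: OpenAILikeResponse): ModelInfo[] {
  // Guard: if the provider returned an error body instead of
  // { data: [...] }, return an empty list instead of crashing with
  // "Cannot read properties of undefined (reading 'map')".
  if (!Array.isArray(res.data)) {
    return [];
  }
  return res.data.map((m) => ({ name: m.id, label: m.id }));
}
```

With such a guard, a rejected key would leave the OpenAILike model list empty rather than breaking server-side rendering of the page.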

On the litellm side I can see:
litellm-1 | {"message": "litellm.proxy.proxy_server.user_api_key_auth(): Exception occured - Malformed API Key passed in. Ensure Key has Bearer prefix. Passed in: Bearer\nRequester IP Address:192.168.178.8", "level": "ERROR", "timestamp": "2024-12-11T21:03:08.435971", "stacktrace": "Traceback (most recent call last):\n File \"/usr/local/lib/python3.11/site-packages/litellm/proxy/auth/user_api_key_auth.py\", line 569, in user_api_key_auth\n raise Exception(\nException: Malformed API Key passed in. Ensure Key has Bearer prefix. Passed in: Bearer"}
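The litellm message "Passed in: Bearer" (a `Bearer` prefix with no token after it) suggests the client sent an empty API key. A minimal illustration, assuming the header is built by interpolating the key (`authHeader` is a hypothetical helper, not bolt.diy's actual code):

```typescript
// Hypothetical: when OPENAI_LIKE_API_KEY is empty or unset, the
// Authorization header degenerates to "Bearer " with no token,
// which litellm rejects as "Malformed API Key ... Passed in: Bearer".
function authHeader(apiKey: string | undefined): string {
  return `Bearer ${apiKey ?? ""}`;
}
```

This would be consistent with the key from `.env.local` not reaching the request after the upgrade, rather than the key itself being wrong.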

Link to the Bolt URL that caused the error

NA, local

Steps to reproduce

Clone the current repository, configure OPENAI_LIKE_API_BASE_URL and OPENAI_LIKE_API_KEY in .env.local, and start it.
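For reference, a minimal `.env.local` sketch; the URL and key values are placeholders, adjust them to your litellm instance:

```shell
# .env.local — placeholder values, not real credentials
OPENAI_LIKE_API_BASE_URL=http://litellm:4000/v1
OPENAI_LIKE_API_KEY=sk-your-litellm-key
```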

Expected behavior

Working OpenAILike models (here via litellm).

Screen Recording / Screenshot

No response

Platform

  • OS: Debian 12
  • Browser: NA
  • Version: cf21dde

Provider Used

Litellm

Model Used

NA

Additional context

No response

Labels: question (further information is requested)