Commit

Merge branch 'lobehub:main' into main
cookieY authored Dec 29, 2024
2 parents 13c3eae + 53a226c commit 4b3147c
Showing 4 changed files with 34 additions and 2 deletions.
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,31 @@

# Changelog

### [Version 1.42.1](https://github.com/lobehub/lobe-chat/compare/v1.42.0...v1.42.1)

<sup>Released on **2024-12-29**</sup>

#### 🐛 Bug Fixes

- **misc**: Fix custom max_token not saved from customModelCards.

<br/>

<details>
<summary><kbd>Improvements and Fixes</kbd></summary>

#### What's fixed

- **misc**: Fix custom max_token not saved from customModelCards, closes [#5226](https://github.com/lobehub/lobe-chat/issues/5226) ([ab6d17c](https://github.com/lobehub/lobe-chat/commit/ab6d17c))

</details>

<div align="right">

[![](https://img.shields.io/badge/-BACK_TO_TOP-151515?style=flat-square)](#readme-top)

</div>

## [Version 1.42.0](https://github.com/lobehub/lobe-chat/compare/v1.41.0...v1.42.0)

<sup>Released on **2024-12-29**</sup>
7 changes: 7 additions & 0 deletions changelog/v1.json
@@ -1,4 +1,11 @@
[
{
"children": {
"fixes": ["Fix custom max_token not saved from customModelCards."]
},
"date": "2024-12-29",
"version": "1.42.1"
},
{
"children": {
"features": ["Add custom stream handle support for LobeOpenAICompatibleFactory."]
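
For reference, the entry this hunk prepends to changelog/v1.json mirrors the shape of the existing entries. A minimal TypeScript sketch of that shape, inferred only from the diff above (the interface name and the optionality of the fields are assumptions, not the repository's actual types):

```ts
// Hypothetical shape of one changelog/v1.json entry, inferred from this hunk;
// the interface name and which fields are optional are assumptions.
interface ChangelogEntry {
  children: {
    features?: string[];
    fixes?: string[];
  };
  date: string; // e.g. '2024-12-29'
  version: string; // e.g. '1.42.1'
}

// The entry this commit prepends to the array:
export const entry: ChangelogEntry = {
  children: {
    fixes: ['Fix custom max_token not saved from customModelCards.'],
  },
  date: '2024-12-29',
  version: '1.42.1',
};
```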
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "@lobehub/chat",
"version": "1.42.0",
"version": "1.42.1",
"description": "Lobe Chat - an open-source, high-performance chatbot framework that supports speech synthesis, multimodal, and extensible Function Call plugin system. Supports one-click free deployment of your private ChatGPT/LLM web application.",
"keywords": [
"framework",
@@ -66,7 +66,7 @@ const ModelConfigForm = memo<ModelConfigFormProps>(
>
<Input placeholder={t('llm.customModelCards.modelConfig.displayName.placeholder')} />
</Form.Item>
- <Form.Item label={t('llm.customModelCards.modelConfig.tokens.title')} name={'tokens'}>
+ <Form.Item label={t('llm.customModelCards.modelConfig.tokens.title')} name={'contextWindowTokens'}>
<MaxTokenSlider />
</Form.Item>
<Form.Item
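
This one-line rename is the only code change in the commit (the other three files are the changelog and the version bump), and it is the whole fix: the Form.Item previously wrote the slider value to a `tokens` field, while the custom model card is presumably keyed on `contextWindowTokens`, so the configured max token count never made it into customModelCards. A minimal sketch of the corrected field, with the translation call and the real MaxTokenSlider replaced by illustrative stand-ins (only the `contextWindowTokens` name comes from the diff):

```tsx
// Minimal sketch of the corrected form field, not the actual ModelConfigForm.
import { Form, Slider } from 'antd';
import { memo } from 'react';

// Stand-in for the real MaxTokenSlider: a controlled value/onChange component.
const MaxTokenSlider = memo<{ onChange?: (value: number) => void; value?: number }>(
  ({ onChange, value }) => (
    <Slider max={200_000} min={0} onChange={onChange} step={1024} value={value} />
  ),
);

const ContextWindowTokensField = () => (
  // Binding the item to `contextWindowTokens` (instead of the old `tokens`)
  // makes the form persist the slider value under the key the model card reads.
  <Form.Item label={'Context window tokens'} name={'contextWindowTokens'}>
    <MaxTokenSlider />
  </Form.Item>
);

export default ContextWindowTokensField;
```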
