feat: support async function on initialMessages
himself65 committed Nov 20, 2023
1 parent a9eb9f7 commit 08e7456
Showing 9 changed files with 241 additions and 114 deletions.
85 changes: 84 additions & 1 deletion README.md
@@ -68,6 +68,8 @@ function App () {

### Comparison with `useChat`

#### Less headache

`useChat` is a hook provided by the Vercel AI SDK; it wraps `swr` in React, `swrv` in Vue, and `sswr` in Svelte,
and these wrappers actually behave differently across the different frameworks.
@@ -78,7 +80,88 @@ framework-agnostic way.
Also, `chatAtoms` is created outside of the component lifecycle,
so you can easily share the state between different components (see the sketch below).

Even if you are using `useChat` in React, it requires React 18.0.0+.
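
To illustrate the point about sharing state, here is a minimal sketch of two components reading and writing the same chat state through atoms created once at module level. The component names (`MessageList`, `MessageInput`) are illustrative, not part of the library:

```js
import { useAtom, useAtomValue } from 'jotai'
import { chatAtoms } from 'jotai-ai'

// created once at module level, outside of any component lifecycle
const { messagesAtom, inputAtom } = chatAtoms()

// both components share the same chat state,
// with no prop drilling or context provider required
const MessageList = () => {
  const messages = useAtomValue(messagesAtom)
  return (
    <>
      {messages.map(m => (
        <div key={m.id}>
          {m.role === 'user' ? 'User: ' : 'AI: '}
          {m.content}
        </div>
      ))}
    </>
  )
}

const MessageInput = () => {
  const [input, handleInputChange] = useAtom(inputAtom)
  return <input value={input} onChange={handleInputChange}/>
}
```
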
#### Load messages on demand

`chatAtoms` also allows you to pass an async function to the `initialMessages` option, which is not supported by `useChat`.

```js
const {
  messagesAtom,
  inputAtom,
  submitAtom
} = chatAtoms({
  initialMessages: async () => {
    // fetch messages from anywhere
    const messages = await fetchMessages()
    return messages
  }
})
```

In combination with `jotai-effect`, you can create a chatbot that persists its messages locally; the example below stores them in IndexedDB via `idb-keyval`.
Because the initial messages load asynchronously, the component that reads `messagesAtom` is wrapped in `Suspense`.

```js
import { Suspense } from 'react'
import { useAtomValue } from 'jotai'
import { chatAtoms } from 'jotai-ai'
import { atomEffect } from 'jotai-effect'

const {
  messagesAtom
} = chatAtoms({
  initialMessages: async () => {
    /**
     * Call the `noSSR` function if you are using Next.js.
     * @link https://foxact.skk.moe/no-ssr
     */
    // noSSR()
    const idb = await import('idb-keyval')
    return (await idb.get('messages')) ?? []
  }
})

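// persist messages to IndexedDB whenever they change; the cleanup aborts
// a pending write if the effect re-runs before the dynamic import resolves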
const saveMessagesEffectAtom = atomEffect((get, set) => {
  const messages = get(messagesAtom)
  const idbPromise = import('idb-keyval')
  const abortController = new AbortController()
  idbPromise.then(async idb => {
    if (abortController.signal.aborted) {
      return
    }
    await idb.set('messages', await messages)
  })
  return () => {
    abortController.abort()
  }
})

const Messages = () => {
  const messages = useAtomValue(messagesAtom)
  return (
    <>
      {messages.length > 0
        ? messages.map(m => (
          <div key={m.id} className="whitespace-pre-wrap">
            {m.role === 'user' ? 'User: ' : 'AI: '}
            {m.content}
          </div>
        ))
        : null}
    </>
  )
}

const App = () => {
  useAtomValue(saveMessagesEffectAtom)
  return (
    <main>
      <Suspense fallback="loading messages...">
        <Messages/>
      </Suspense>
    </main>
  )
}
```
## LICENSE
70 changes: 60 additions & 10 deletions examples/llamaindex-straming/app/components/chat-section.tsx
@@ -1,8 +1,11 @@
'use client'

import { noSSR } from 'foxact/no-ssr'
import { chatAtoms } from 'jotai-ai'
import { atomEffect } from 'jotai-effect'
import { ChatInput, ChatMessages } from './ui/chat'
import { useAtom, useAtomValue, useSetAtom } from 'jotai/react'
import { Suspense } from 'react'

const {
  messagesAtom,
@@ -11,24 +14,71 @@ const {
  isLoadingAtom,
  reloadAtom,
  stopAtom
} = chatAtoms()
} = chatAtoms({
  initialMessages: async () => {
    noSSR()
    const idb = await import('idb-keyval')
    return (await idb.get('messages')) ?? []
  }
})

export default function ChatSection () {
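// persist chat messages to IndexedDB whenever they change;
// the cleanup aborts a pending write if the effect re-runs first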
const saveMessagesEffectAtom = atomEffect((get, set) => {
  const messages = get(messagesAtom)
  const idbPromise = import('idb-keyval')
  const abortController = new AbortController()
  idbPromise.then(async idb => {
    if (abortController.signal.aborted) {
      return
    }
    await idb.set('messages', await messages)
  })
  return () => {
    abortController.abort()
  }
})

const Messages = () => {
  const messages = useAtomValue(messagesAtom)
  const [input, handleInputChange] = useAtom(inputAtom)
  const handleSubmit = useSetAtom(submitAtom)
  const isLoading = useAtomValue(isLoadingAtom)
  const reload = useSetAtom(reloadAtom)
  const stop = useSetAtom(stopAtom)
  return (
    <ChatMessages
      messages={messages}
      isLoading={isLoading}
      reload={reload}
      stop={stop}
    />
  )
}

export default function ChatSection () {
  useAtomValue(saveMessagesEffectAtom)
  const [input, handleInputChange] = useAtom(inputAtom)
  const handleSubmit = useSetAtom(submitAtom)
  const isLoading = useAtomValue(isLoadingAtom)

  return (
    <div className="space-y-4 max-w-5xl w-full">
      <ChatMessages
        messages={messages}
        isLoading={isLoading}
        reload={reload}
        stop={stop}
      />
      <Suspense fallback={
        <div className="w-full rounded-xl bg-white p-4 shadow-xl pb-0">
          <div
            className="flex h-[50vh] flex-col gap-5 divide-y overflow-y-auto pb-4">
            <div className="animate-pulse flex space-x-4">
              <div className="flex-1 space-y-4 py-1">
                <div className="space-y-2">
                  <div className="h-4 bg-gray-200 rounded w-3/4"></div>
                  <div className="h-4 bg-gray-200 rounded w-5/6"></div>
                  <div className="h-4 bg-gray-200 rounded w-4/6"></div>
                  <div className="h-4 bg-gray-200 rounded w-3/4"></div>
                </div>
              </div>
            </div>
          </div>
        </div>
      }>
        <Messages/>
      </Suspense>
      <ChatInput
        input={input}
        handleSubmit={handleSubmit}
3 changes: 3 additions & 0 deletions examples/llamaindex-straming/package.json
@@ -11,8 +11,11 @@
"@radix-ui/react-slot": "^1.0.2",
"ai": "^2.2.24",
"class-variance-authority": "^0.7.0",
"foxact": "^0.2.26",
"idb-keyval": "^6.2.1",
"jotai": "^2.5.1",
"jotai-ai": "workspace:*",
"jotai-effect": "^0.2.3",
"llamaindex": "0.0.35",
"lucide-react": "^0.292.0",
"next": "^14.0.3",
6 changes: 4 additions & 2 deletions package.json
@@ -16,6 +16,7 @@
  ],
  "scripts": {
    "build": "bunchee",
    "dev": "bunchee --watch",
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [
@@ -33,7 +34,8 @@
"react": "^18.2.0",
"typescript": "^5.2.2"
},
"dependencies": {
"nanoid": "^5.0.3"
"peerDependencies": {
"ai": ">=2.2.24",
"jotai": ">=2.5.1"
}
}
48 changes: 38 additions & 10 deletions pnpm-lock.yaml

Some generated files are not rendered by default.