Quick try with Anyscale Endpoints
Kundu, Bishwendu authored and committed on Dec 3, 2023
1 parent 73ce66d commit abbe696
Showing 14 changed files with 146 additions and 100 deletions.
2 changes: 1 addition & 1 deletion .env.example
@@ -1,6 +1,6 @@
# You must first activate a Billing Account here: https://platform.openai.com/account/billing/overview
# Then get your OpenAI API Key here: https://platform.openai.com/account/api-keys
OPENAI_API_KEY=XXXXXXXX
ANYSCALE_API_KEY=XXXXXXXX

# Generate a random secret: https://generate-secret.vercel.app/32 or `openssl rand -base64 32`
AUTH_SECRET=XXXXXXXX
36 changes: 13 additions & 23 deletions README.md
@@ -1,23 +1,24 @@
<a href="https://chat.vercel.ai/">
<img alt="Next.js 14 and App Router-ready AI chatbot." src="https://chat.vercel.ai/opengraph-image.png">
<h1 align="center">Next.js AI Chatbot</h1>
![Alt text](image.png)
<h1 align="center">Llama models powered AI Chatbot</h1>
</a>

<p align="center">
An open-source AI chatbot app template built with Next.js, the Vercel AI SDK, OpenAI, and Vercel KV.
An open-source AI chatbot app experiment built with Next.js, the Vercel AI SDK and Anyscale Endpoints.
</p>

<p align="center">
<a href="#features"><strong>Features</strong></a> ·
<a href="#model-providers"><strong>Model Providers</strong></a> ·
<a href="#deploy-your-own"><strong>Deploy Your Own</strong></a> ·
<a href="#running-locally"><strong>Running locally</strong></a> ·
<a href="#authors"><strong>Authors</strong></a>
</p>
<br/>

## Features

- Try different Llama model variants in this chatbot
- [Anyscale Endpoints](https://www.anyscale.com/endpoints) for various Llama models to try out.
- [Next.js](https://nextjs.org) App Router
- React Server Components (RSCs), Suspense, and Server Actions
- [Vercel AI SDK](https://sdk.vercel.ai/docs) for streaming chat UI
@@ -26,24 +27,14 @@
- Styling with [Tailwind CSS](https://tailwindcss.com)
- [Radix UI](https://radix-ui.com) for headless component primitives
- Icons from [Phosphor Icons](https://phosphoricons.com)
- Chat History, rate limiting, and session storage with [Vercel KV](https://vercel.com/storage/kv)
- [NextAuth.js](https://github.com/nextauthjs/next-auth) for authentication

## Model Providers

This template ships with OpenAI `gpt-3.5-turbo` as the default. However, thanks to the [Vercel AI SDK](https://sdk.vercel.ai/docs), you can switch LLM providers to [Anthropic](https://anthropic.com), [Cohere](https://cohere.com/), [Hugging Face](https://huggingface.co), or using [LangChain](https://js.langchain.com) with just a few lines of code.
This experiment ships with Llama models. Thanks to [Anyscale Endpoints](https://www.anyscale.com/endpoints), you can choose any of the available [Llama models](https://docs.endpoints.anyscale.com/category/supported-models).

## Deploy Your Own
## Vercel AI SDK

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?demo-title=Next.js+Chat&demo-description=A+full-featured%2C+hackable+Next.js+AI+chatbot+built+by+Vercel+Labs&demo-url=https%3A%2F%2Fchat.vercel.ai%2F&demo-image=%2F%2Fimages.ctfassets.net%2Fe5382hct74si%2F4aVPvWuTmBvzM5cEdRdqeW%2F4234f9baf160f68ffb385a43c3527645%2FCleanShot_2023-06-16_at_17.09.21.png&project-name=Next.js+Chat&repository-name=nextjs-chat&repository-url=https%3A%2F%2Fgithub.com%2Fvercel-labs%2Fai-chatbot&from=templates&skippable-integrations=1&env=OPENAI_API_KEY%2CAUTH_GITHUB_ID%2CAUTH_GITHUB_SECRET%2CAUTH_SECRET&envDescription=How+to+get+these+env+vars&envLink=https%3A%2F%2Fgithub.com%2Fvercel-labs%2Fai-chatbot%2Fblob%2Fmain%2F.env.example&teamCreateStatus=hidden&stores=[{"type":"kv"}])

## Creating a KV Database Instance

Follow the steps outlined in the [quick start guide](https://vercel.com/docs/storage/vercel-kv/quickstart#create-a-kv-database) provided by Vercel. This guide will assist you in creating and configuring your KV database instance on Vercel, enabling your application to interact with it.

Remember to update your environment variables (`KV_URL`, `KV_REST_API_URL`, `KV_REST_API_TOKEN`, `KV_REST_API_READ_ONLY_TOKEN`) in the `.env` file with the appropriate credentials provided during the KV database setup.
This whole project is based on the [Next.js AI chatbot template](https://vercel.com/templates/next.js/nextjs-ai-chatbot).

## Running locally

@@ -62,10 +53,9 @@ pnpm dev

Your app template should now be running on [localhost:3000](http://localhost:3000/).

## Authors

This library is created by [Vercel](https://vercel.com) and [Next.js](https://nextjs.org) team members, with contributions from:
## Standing on the shoulders of giants

- Jared Palmer ([@jaredpalmer](https://twitter.com/jaredpalmer)) - [Vercel](https://vercel.com)
- Shu Ding ([@shuding\_](https://twitter.com/shuding_)) - [Vercel](https://vercel.com)
- shadcn ([@shadcn](https://twitter.com/shadcn)) - [Vercel](https://vercel.com)
- [Vercel](https://vercel.com)
- [Next.js](https://nextjs.org)
- [Anyscale Endpoints](https://www.anyscale.com/endpoints)
- [Next.js AI Chatbot](https://vercel.com/templates/next.js/nextjs-ai-chatbot)
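
Aside (not part of the commit): the Model Providers note above boils down to pointing the OpenAI SDK at Anyscale's OpenAI-compatible base URL and picking a hosted model id. The real change is in `app/api/chat/route.ts` below; the following is a minimal standalone sketch of the same pattern, with an arbitrarily chosen model id.

```ts
import OpenAI from 'openai'

// Sketch only: Anyscale Endpoints speaks the OpenAI chat-completions API,
// so the existing OpenAI client is reused with a different base URL and key.
const anyscaleAI = new OpenAI({
  baseURL: 'https://api.endpoints.anyscale.com/v1',
  apiKey: process.env.ANYSCALE_API_KEY
})

async function main() {
  // Any of the hosted Llama-family models can be requested by id.
  const completion = await anyscaleAI.chat.completions.create({
    model: 'meta-llama/Llama-2-70b-chat-hf',
    messages: [{ role: 'user', content: 'Say hello in one sentence.' }]
  })
  console.log(completion.choices[0].message.content)
}

main()
```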
44 changes: 27 additions & 17 deletions app/api/chat/route.ts
@@ -7,27 +7,38 @@ import { nanoid } from '@/lib/utils'

export const runtime = 'edge'

const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY
const anyscaleAI = new OpenAI({
baseURL: "https://api.endpoints.anyscale.com/v1",
apiKey: process.env.ANYSCALE_API_KEY
})

export async function POST(req: Request) {
const json = await req.json()
const { messages, previewToken } = json
const userId = (await auth())?.user.id
const { messages, previewToken, selectedModel } = json
// const userId = (await auth())?.user.id

if (!userId) {
return new Response('Unauthorized', {
status: 401
// if (!userId) {
// return new Response('Unauthorized', {
// status: 401
// })
// }

if (previewToken) {
anyscaleAI.apiKey = previewToken
} else {
return new Response('Anyscale Endpoint Token missing', {
status: 400
})
}

if (previewToken) {
openai.apiKey = previewToken
if(!selectedModel) {
return new Response('Model not selected', {
status: 400
})
}

const res = await openai.chat.completions.create({
model: 'gpt-3.5-turbo',
const res = await anyscaleAI.chat.completions.create({
model: selectedModel,
messages,
temperature: 0.7,
stream: true
@@ -42,7 +53,6 @@ export async function POST(req: Request) {
const payload = {
id,
title,
userId,
createdAt,
path,
messages: [
@@ -53,11 +63,11 @@
}
]
}
await kv.hmset(`chat:${id}`, payload)
await kv.zadd(`user:chat:${userId}`, {
score: createdAt,
member: `chat:${id}`
})
// await kv.hmset(`chat:${id}`, payload)
// await kv.zadd(`user:chat:${userId}`, {
// score: createdAt,
// member: `chat:${id}`
// })
}
})

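Aside (not part of the commit): the updated handler now expects `messages`, `previewToken`, and `selectedModel` in the request body and returns 400 when the token or model id is missing. Below is a hedged sketch of a call against that contract; the helper name and the way the Anyscale key is supplied are illustrative assumptions, not code from this repository.

```ts
// Illustrative client-side call against the updated /api/chat contract.
// `anyscaleKey` stands in for however the app supplies the previewToken
// (the handler rejects the request with 400 if it is missing).
async function askLlama(prompt: string, anyscaleKey: string): Promise<string> {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      previewToken: anyscaleKey,
      selectedModel: 'meta-llama/Llama-2-7b-chat-hf', // 400 if omitted
      messages: [{ role: 'user', content: prompt }]
    })
  })
  if (!res.ok) {
    throw new Error(`chat route returned ${res.status}`)
  }
  // The route streams tokens; text() simply buffers the whole stream here.
  return res.text()
}
```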
48 changes: 24 additions & 24 deletions app/chat/[id]/page.tsx
@@ -11,37 +11,37 @@ export interface ChatPageProps {
}
}

export async function generateMetadata({
params
}: ChatPageProps): Promise<Metadata> {
const session = await auth()
// export async function generateMetadata({
// params
// }: ChatPageProps): Promise<Metadata> {
// const session = await auth()

if (!session?.user) {
return {}
}
// if (!session?.user) {
// return {}
// }

const chat = await getChat(params.id, session.user.id)
return {
title: chat?.title.toString().slice(0, 50) ?? 'Chat'
}
}
// const chat = await getChat(params.id, session.user.id)
// return {
// title: chat?.title.toString().slice(0, 50) ?? 'Chat'
// }
// }

export default async function ChatPage({ params }: ChatPageProps) {
const session = await auth()
// const session = await auth()

if (!session?.user) {
redirect(`/sign-in?next=/chat/${params.id}`)
}
// if (!session?.user) {
// redirect(`/sign-in?next=/chat/${params.id}`)
// }

const chat = await getChat(params.id, session.user.id)
// const chat = await getChat(params.id, session.user.id)

if (!chat) {
notFound()
}
// if (!chat) {
// notFound()
// }

if (chat?.userId !== session?.user?.id) {
notFound()
}
// if (chat?.userId !== session?.user?.id) {
// notFound()
// }

return <Chat id={chat.id} initialMessages={chat.messages} />
return <Chat id={params.id} initialMessages={[]} />
}
6 changes: 3 additions & 3 deletions app/layout.tsx
@@ -11,10 +11,10 @@ import { Header } from '@/components/header'
export const metadata = {
metadataBase: new URL(`https://${process.env.VERCEL_URL}`),
title: {
default: 'Next.js AI Chatbot',
template: `%s - Next.js AI Chatbot`
default: 'Llama Models AI Chatbot',
template: `%s - Llama Models AI Chatbot`
},
description: 'An AI-powered chatbot template built with Next.js and Vercel.',
description: 'An AI-powered chatbot with Next.js and Llama models.',
icons: {
icon: '/favicon.ico',
shortcut: '/favicon-16x16.png',
Binary file modified app/opengraph-image.png
5 changes: 1 addition & 4 deletions auth.ts
@@ -28,12 +28,9 @@ export const {
session.user.id = String(token.id)
}
return session
},
authorized({ auth }) {
return !!auth?.user // this ensures there is a logged in user for -every- request
}
},
pages: {
signIn: '/sign-in' // overrides the next-auth default signin page https://authjs.dev/guides/basics/pages
signIn: '/sign-in', // overrides the next-auth default signin page https://authjs.dev/guides/basics/pages
}
})
50 changes: 48 additions & 2 deletions components/chat-panel.tsx
@@ -5,6 +5,13 @@ import { PromptForm } from '@/components/prompt-form'
import { ButtonScrollToBottom } from '@/components/button-scroll-to-bottom'
import { IconRefresh, IconStop } from '@/components/ui/icons'
import { FooterText } from '@/components/footer'
import {
Select,
SelectContent,
SelectItem,
SelectTrigger,
SelectValue
} from './ui/select'

export interface ChatPanelProps
extends Pick<
@@ -18,6 +25,11 @@
| 'setInput'
> {
id?: string
handleModelChange?: any
}

interface LlamaModelsProps {
handleModelChange: (selectedModel: string) => void
}

export function ChatPanel({
@@ -28,7 +40,8 @@ export function ChatPanel({
reload,
input,
setInput,
messages
messages,
handleModelChange
}: ChatPanelProps) {
return (
<div className="fixed inset-x-0 bottom-0 bg-gradient-to-b from-muted/10 from-10% to-muted/30 to-50%">
@@ -57,7 +70,8 @@
)
)}
</div>
<div className="space-y-4 border-t bg-background px-4 py-2 shadow-lg sm:rounded-t-xl sm:border md:py-4">
<div className="flex flex-col justify-end space-y-4 border-t bg-background px-4 py-2 shadow-lg sm:rounded-t-xl sm:border md:py-4">
<LlamaModels handleModelChange={handleModelChange} />
<PromptForm
onSubmit={async value => {
await append({
Expand All @@ -76,3 +90,35 @@ export function ChatPanel({
</div>
)
}

const LlamaModels: React.FC<LlamaModelsProps> = ({ handleModelChange }) => {
return (
<span className="w-1/2">
<Select onValueChange={handleModelChange}>
<SelectTrigger>
<SelectValue placeholder="Select a Llama" />
</SelectTrigger>
<SelectContent>
<SelectItem value="meta-llama/Llama-2-7b-chat-hf">
Llama-2-7b-chat-hf
</SelectItem>
<SelectItem value="meta-llama/Llama-2-13b-chat-hf">
Llama-2-13b-chat-hf
</SelectItem>
<SelectItem value="codellama/CodeLlama-34b-Instruct-hf">
CodeLlama-34b-Instruct
</SelectItem>
<SelectItem value="HuggingFaceH4/zephyr-7b-beta">
Zephyr-7b-beta
</SelectItem>
<SelectItem value="mistralai/Mistral-7B-Instruct-v0.1">
Mistral-7B-Instruct-v0.1
</SelectItem>
<SelectItem value="meta-llama/Llama-2-70b-chat-hf">
Llama-2-70b-chat-hf
</SelectItem>
</SelectContent>
</Select>
</span>
)
}
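
Aside (not part of the diff shown above): ChatPanel only surfaces `handleModelChange`, so the parent `<Chat />` component (also changed in this commit but not displayed in this excerpt) has to hold the selection and forward it to the API route. Assuming the Vercel AI SDK's `useChat` hook and its `body` option, the wiring could look roughly like this; the component name and default model are assumptions, not the committed code.

```tsx
'use client'

import { useState } from 'react'
import { useChat } from 'ai/react'
import { ChatPanel } from '@/components/chat-panel'

// Hypothetical parent wiring: keep the chosen model in state and pass it to
// the API route via useChat's `body`, which POST /api/chat reads as
// `selectedModel`. The Anyscale key (previewToken) would travel the same way
// in the real app.
export function ChatSketch({ id }: { id?: string }) {
  const [selectedModel, setSelectedModel] = useState(
    'meta-llama/Llama-2-7b-chat-hf'
  )
  const { messages, append, reload, stop, isLoading, input, setInput } =
    useChat({ id, body: { id, selectedModel } })

  return (
    <ChatPanel
      id={id}
      isLoading={isLoading}
      stop={stop}
      append={append}
      reload={reload}
      messages={messages}
      input={input}
      setInput={setInput}
      handleModelChange={setSelectedModel}
    />
  )
}
```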