Join our community Discord: AI Stack Devs
AI Town is a virtual town where AI characters live, chat and socialize.
This project is a deployable starter kit for easily building and customizing your own version of AI Town. It was inspired by the research paper Generative Agents: Interactive Simulacra of Human Behavior.
The primary goal of this project, beyond just being a lot of fun to work on, is to provide a platform with a strong foundation that is meant to be extended. The back-end natively supports shared global state, transactions, and a simulation engine, and should be suitable for everything from a simple project to play around with to a scalable, multi-player game. A secondary goal is to make a JS/TS framework available, as most simulators in this space (including the original paper above) are written in Python.
- Stack
- Installation
- Customize - run YOUR OWN simulated world
- Deploying
- Credits
- Game engine, database, and vector search: Convex
- Auth (Optional): Clerk
- Default chat model is `llama3` and embeddings use `mxbai-embed-large`.
- Local inference: Ollama
- Configurable for other cloud LLMs: Together.ai or anything that speaks the OpenAI API. PRs welcome to add more cloud provider support.
- Pixel Art Generation: Replicate, Fal.ai
- Background Music Generation: Replicate using MusicGen
1. Clone the repo and install packages:

```bash
git clone https://github.com/a16z-infra/ai-town.git
cd ai-town
npm install
```
2. To develop locally with Convex:
Either download a pre-built binary (recommended), or build it from source and run it.
```bash
# For new Macs:
curl -L -O https://github.com/get-convex/convex-backend/releases/latest/download/convex-local-backend-aarch64-apple-darwin.zip
unzip convex-local-backend-aarch64-apple-darwin.zip

brew install just

# Runs the server
./convex-local-backend
```
This also installs `just` (e.g. `brew install just` or `cargo install just`). We use `just` like `make` to add extra params, so you run `just convex ...` instead of `npx convex ...` for local development.
If you're running the pre-built binary on Mac and there's an Apple warning, go to the folder it's in, right-click it, and select "Open" to bypass it. From then on you can run it from the command line. Or you can compile it from source and run it (see above).
To develop against the cloud-hosted version, change the `package.json` scripts to use `convex ...` instead of `just convex ...`.
3. To run a local LLM, download and run Ollama.
You can leave the app running or run `ollama serve`. `ollama serve` will warn you if the app is already running.
Run `ollama pull llama3` to have it download `llama3`. Test it out with `ollama run llama3`.
If you want to customize which model to use, adjust convex/util/llm.ts or run `just convex env set OLLAMA_MODEL # model`.
Ollama model options can be found here.
You might want to set `NUM_MEMORIES_TO_SEARCH` to `1` in constants.ts to reduce the size of conversation prompts, if you see slowness.
Check out `convex/config.ts` to configure which models to offer to the UI, or to set it up to talk to a cloud-hosted LLM.
For daily background music generation, create a Replicate account and create a token on your Profile's API Token page.

```bash
npx convex env set REPLICATE_API_TOKEN # token
```
Specify `just` instead of `npx` if you're doing local development.
To run both the front end and back end:

```bash
npm run dev
```

You can now visit http://localhost:5173.
If you'd rather run the frontend in a separate terminal from Convex (which syncs your backend functions as they're saved), you can run these two commands:
```bash
npm run dev:frontend
npm run dev:backend
```
See package.json for details, but dev:backend runs `just convex dev`.
Note: The simulation will pause after 5 minutes if the window is idle.
Loading the page will unpause it.
You can also manually freeze & unfreeze the world with a button in the UI.
If you want to run the world without the browser, you can comment out the "stop inactive worlds" cron in `convex/crons.ts`.
To stop the back end, in case of too much activity:
This will stop running the engine and agents. You can still run queries and functions to debug.

```bash
just convex run testing:stop
```
To restart the back end after stopping it:

```bash
just convex run testing:resume
```
To kick the engine in case the game engine or agents aren't running:

```bash
just convex run testing:kick
```
To archive the world:
If you'd like to reset the world and start from scratch, you can archive the current world:

```bash
just convex run testing:archive
```

Then, you can still look at the world's data in the dashboard, but the engine and agents will no longer run. You can then create a fresh world with `init`:

```bash
just convex run init
```
To clear all databases:
You can wipe all tables with the `wipeAllTables` testing function:

```bash
just convex run testing:wipeAllTables
```
To pause your backend deployment:
You can go to your deployment settings in the dashboard to pause and un-pause your deployment. This will stop all functions, whether invoked from the client, scheduled, or run as a cron job. See this as a last resort, as there are gentler ways of stopping above. Once you un-pause the deployment, functions will start running again.
NOTE: every time you change character data, you should re-run `just convex run testing:wipeAllTables` and then `npm run dev` to re-upload everything to Convex. This is because character data is sent to Convex on the initial load. However, beware that `just convex run testing:wipeAllTables` WILL wipe all of your data.
- Create your own characters and stories: All characters and stories, as well as their spritesheet references, are stored in characters.ts. You can start by changing character descriptions.
- Updating spritesheets: in `data/characters.ts`, you will see this code:
```ts
export const characters = [
  {
    name: 'f1',
    textureUrl: '/assets/32x32folk.png',
    spritesheetData: f1SpritesheetData,
    speed: 0.1,
  },
  ...
];
```
You should find a sprite sheet for your character, and define sprite motion / assets in the corresponding file (in the above example, `f1SpritesheetData` was defined in f1.ts).
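For reference, a spritesheet data file is roughly shaped like the sketch below. This is a minimal example following PixiJS's spritesheet data format; the file path, frame names, and pixel coordinates are made up, so mirror the existing f1.ts and your own sprite sheet's layout rather than copying this verbatim.

```ts
// data/spritesheets/f2.ts -- illustrative sketch only; adjust frame names,
// coordinates, and sizes to match your actual sprite sheet.
import { SpritesheetData } from './types'; // assumed shared type, as used by f1.ts

export const f2SpritesheetData: SpritesheetData = {
  frames: {
    // Each frame is a rectangle within the texture referenced by `textureUrl`.
    down: { frame: { x: 0, y: 0, w: 32, h: 32 }, sourceSize: { w: 32, h: 32 }, spriteSourceSize: { x: 0, y: 0 } },
    down2: { frame: { x: 32, y: 0, w: 32, h: 32 }, sourceSize: { w: 32, h: 32 }, spriteSourceSize: { x: 0, y: 0 } },
    left: { frame: { x: 0, y: 32, w: 32, h: 32 }, sourceSize: { w: 32, h: 32 }, spriteSourceSize: { x: 0, y: 0 } },
    left2: { frame: { x: 32, y: 32, w: 32, h: 32 }, sourceSize: { w: 32, h: 32 }, spriteSourceSize: { x: 0, y: 0 } },
  },
  meta: { scale: '1' },
  animations: {
    // Frame names to cycle through while the character walks in each direction.
    down: ['down', 'down2'],
    left: ['left', 'left2'],
  },
};
```

Then register the character in `characters.ts` with a new entry whose `spritesheetData` points at `f2SpritesheetData` and whose `textureUrl` points at your sprite sheet image.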
- Update the Background (Environment): The map gets loaded in `convex/init.ts` from `data/gentle.js`. To update the map, follow these steps:
  - Use Tiled to export tilemaps as a JSON file (2 layers named `bgtiles` and `objmap`)
  - Use the `convertMap.js` script to convert the JSON to a format that the engine can use (there is an example invocation after this list):

    ```bash
    node data/convertMap.js <mapDataPath> <assetPath> <tilesetpxw> <tilesetpxh>
    ```

    - `<mapDataPath>`: Path to the Tiled JSON file.
    - `<assetPath>`: Path to tileset images.
    - `<tilesetpxw>`: Tileset width in pixels.
    - `<tilesetpxh>`: Tileset height in pixels.

    This generates `converted-map.js`, which you can use like `gentle.js`.
- Change the background music by modifying the prompt in `convex/music.ts`
- Change how often to generate new music in `convex/crons.ts` by modifying the `generate new background music` job
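As an illustration of the map-conversion step above, an invocation might look like the following. The file paths and tileset dimensions are hypothetical; substitute your own Tiled export and tileset image.

```bash
# Convert a Tiled JSON export into a map module the engine can load.
node data/convertMap.js data/my-town.json /assets/my-tileset.png 256 512
```

The resulting `converted-map.js` can then be used wherever `gentle.js` is used today.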
Configure `convex/util/llm.ts` or set these env variables:
```bash
# Local Convex
just convex env set LLM_API_HOST # url
just convex env set LLM_MODEL # model
# Cloud Convex
npx convex env set LLM_API_HOST # url
npx convex env set LLM_MODEL # model
```
The embeddings model config needs to be changed in code, since you need to specify the embeddings dimension.
- For Together.ai, visit https://api.together.xyz/settings/api-keys
- For OpenAI, visit https://platform.openai.com/account/api-keys
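As one hedged example, pointing a cloud deployment at an OpenAI-compatible provider could look like the commands below. The host URL and model name are illustrative; check how convex/util/llm.ts builds request URLs and where it expects the provider's API key before relying on these exact values.

```bash
# Illustrative values only -- verify the expected URL format against convex/util/llm.ts.
npx convex env set LLM_API_HOST https://api.together.xyz/v1
npx convex env set LLM_MODEL meta-llama/Llama-3-8b-chat-hf
```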
You can run your Convex backend in the cloud by running

```bash
npx convex dev --once --configure
```

and updating the `package.json` scripts to remove `just`: change `just convex ...` to `convex ...`.
You'll then need to set any environment variables you had locally in the cloud environment with `npx convex env set` or on the dashboard: https://dashboard.convex.dev/deployment/settings/environment-variables

To run commands, use `npx convex ...` where you used to run `just convex ...`.
Before you can run the app, you will need to make sure the Convex functions are deployed to its production environment.
- Run `npx convex deploy` to deploy the convex functions to production
- Run `npx convex run init --prod`

If you have existing data you want to clear, you can run `npx convex run testing:wipeAllTables --prod`.
You can add Clerk auth back in with `git revert b44a436`, or just look at that diff to see what was changed to remove it.
Make a Clerk account
- Go to https://dashboard.clerk.com/ and click on "Add Application"
- Name your application and select the sign-in providers you would like to offer users
- Create Application
- Add `VITE_CLERK_PUBLISHABLE_KEY` and `CLERK_SECRET_KEY` to `.env.local`:

```
VITE_CLERK_PUBLISHABLE_KEY=pk_***
CLERK_SECRET_KEY=sk_***
```
- Go to JWT Templates and create a new Convex Template.
- Copy the JWKS endpoint URL for use below.
```bash
npx convex env set CLERK_ISSUER_URL # e.g. https://your-issuer-url.clerk.accounts.dev/
```
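Convex reads the issuer from an auth config file. The reverted commit should restore something equivalent to the sketch below, which is the standard Convex + Clerk wiring rather than a verbatim copy of this repo's file:

```ts
// convex/auth.config.ts -- sketch of the standard Convex + Clerk auth wiring.
export default {
  providers: [
    {
      // Issuer URL from your Clerk JWT template (the CLERK_ISSUER_URL set above).
      domain: process.env.CLERK_ISSUER_URL,
      applicationID: 'convex',
    },
  ],
};
```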
- Register an account on Vercel and then install the Vercel CLI.
- If you are using GitHub Codespaces: you will need to install the Vercel CLI and authenticate from your Codespaces CLI by running `vercel login`.
- Deploy the app to Vercel with `vercel --prod`.
We support using Ollama for conversation generation. To make it accessible from the web, you can use Tunnelmole, Ngrok, or similar.
Using Tunnelmole
Tunnelmole is an open source tunneling tool.
You can install Tunnelmole using one of the following options:
- NPM: `npm install -g tunnelmole`
- Linux: `curl -s https://tunnelmole.com/sh/install-linux.sh | sudo bash`
- Mac: `curl -s https://tunnelmole.com/sh/install-mac.sh --output install-mac.sh && sudo bash install-mac.sh`
- Windows: Install with NPM, or if you don't have NodeJS installed, download the `exe` file for Windows here and put it somewhere in your PATH.
Once Tunnelmole is installed, run the following command:
```bash
tmole 11434
```

Tunnelmole should output a unique url once you run this command.
Using Ngrok
Ngrok is a popular closed source tunneling tool.
Once ngrok is installed and authenticated, run the following command:
```bash
ngrok http http://localhost:11434
```

Ngrok should output a unique url once you run this command.
Add Ollama endpoint to Convex
```bash
npx convex env set OLLAMA_HOST # your tunnelmole/ngrok unique url from the previous step
```
Update Ollama domains
Ollama has a list of accepted domains. Add the ngrok domain so it won't reject traffic. See ollama.ai for more details.
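As an example, you can allow the tunnel's origin when starting the server. `OLLAMA_ORIGINS` is Ollama's standard allow-list setting, but the hostname below is a placeholder for whatever URL your tunnel printed:

```bash
# Substitute the URL that tmole/ngrok printed for your tunnel.
OLLAMA_ORIGINS=https://your-tunnel-subdomain.ngrok-free.app ollama serve
```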
- All interactions, background music, and rendering of the game component in the project are powered by PixiJS.
- Tilesheet:
- https://opengameart.org/content/16x16-game-assets by George Bailey
- https://opengameart.org/content/16x16-rpg-tileset by hilau
- We used https://github.com/pierpo/phaser3-simple-rpg for the original POC of this project. We have since rewritten the whole app, but appreciated having an easy starting point
- Original assets by ansimuz
- The UI is based on original assets by Mounir Tohami
Convex is a hosted backend platform with a built-in database that lets you write your database schema and server functions in TypeScript. Server-side database queries automatically cache and subscribe to data, powering a realtime `useQuery` hook in our React client. There are also clients for Python, Rust, React Native, and Node, as well as a straightforward HTTP API.
The database supports NoSQL-style documents with opt-in schema validation, relationships and custom indexes (including on fields in nested objects).
The `query` and `mutation` server functions have transactional, low-latency access to the database and leverage our v8 runtime with determinism guardrails to provide the strongest ACID guarantees on the market: immediate consistency, serializable isolation, and automatic conflict resolution via optimistic multi-version concurrency control (OCC / MVCC).
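For illustration, a query and mutation pair looks roughly like the sketch below; the table and field names are invented for the example, not taken from this repo's schema:

```ts
// convex/messages.ts -- sketch; table and field names are hypothetical.
import { query, mutation } from './_generated/server';
import { v } from 'convex/values';

export const list = query({
  args: { conversationId: v.id('conversations') },
  handler: async (ctx, args) => {
    // Reads are automatically cached and subscribed to by the React client.
    return await ctx.db
      .query('messages')
      .filter((q) => q.eq(q.field('conversationId'), args.conversationId))
      .collect();
  },
});

export const send = mutation({
  args: { conversationId: v.id('conversations'), body: v.string() },
  handler: async (ctx, args) => {
    // Writes run transactionally with serializable isolation.
    await ctx.db.insert('messages', { ...args, sentAt: Date.now() });
  },
});
```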
The `action` server functions have access to external APIs and enable other side-effects and non-determinism in either our optimized v8 runtime or a more flexible Node runtime.
Functions can run in the background via scheduling and cron jobs.
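A cron definition is a few lines of TypeScript. The job name below matches the one mentioned earlier in this README, but the cadence and function reference are illustrative; see convex/crons.ts for the real jobs:

```ts
// convex/crons.ts -- illustrative sketch, not the repo's actual file.
import { cronJobs } from 'convex/server';
import { internal } from './_generated/api';

const crons = cronJobs();
crons.interval(
  'generate new background music',
  { hours: 24 }, // hypothetical cadence
  internal.music.generate, // hypothetical internal function reference
);
export default crons;
```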
Development is cloud-first, with hot reloads for server function editing via the CLI, preview deployments, and logging and exception reporting integrations. There is a dashboard UI to browse and edit data, edit environment variables, view logs, run server functions, and more.
There are built-in features for reactive pagination, file storage, reactive text search, vector search, https endpoints (for webhooks), snapshot import/export, streaming import/export, and runtime validation for function arguments and database data.
Everything scales automatically, and it's free to start.