
Note: we’ve temporarily stopped development on the open-source version of OpenPipe to integrate some proprietary third-party code. We hope to make the non-proprietary parts of the repository open again under an open core model once we have the bandwidth to do so!


OpenPipe

Open-source fine-tuning and model-hosting platform.

Apache-2.0 License - PRs Welcome - Y Combinator S23

Demo - Running Locally - Docs


Use powerful but expensive LLMs to fine-tune smaller and cheaper models suited to your exact needs. Query your past requests and evaluate models against one another. Switch between OpenAI and fine-tuned models with one line of code.

Features

  • Easy integration with OpenAI's SDK in both Python and TypeScript.
  • OpenAI-compatible chat completions endpoint.
  • Fine-tune GPT 3.5, Mistral, and Llama 2 models. Host on-platform or download the weights.
    • Model output is OpenAI-compatible.
    • Switching from GPT 4 to a fine-tuned Mistral model only requires changing the model name (see the example after this list).
  • Query logs using powerful built-in filters.
  • Import datasets in OpenAI-compatible JSONL files.
  • Prune large chunks of duplicate text like system prompts.
  • Compare output accuracy against base models like gpt-3.5-turbo.
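
For example, moving a call from GPT 4 to a fine-tuned model hosted on OpenPipe is only a change to the model string. Here is a minimal sketch using the openpipe Python wrapper shown under "Using Locally" below; the "openpipe:my-fine-tuned-model" ID is a hypothetical placeholder for your own model's name:

from openpipe import OpenAI

client = OpenAI(
    api_key="Your API Key",
    openpipe={"api_key": "Your OpenPipe API Key"},
)

# Before: an expensive base model
completion = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "Classify this support ticket."}],
)

# After: the same request served by a fine-tuned model hosted on OpenPipe.
# Only the model name changes; "openpipe:my-fine-tuned-model" is a hypothetical ID.
completion = client.chat.completions.create(
    model="openpipe:my-fine-tuned-model",
    messages=[{"role": "user", "content": "Classify this support ticket."}],
)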

Supported Base Models

OpenPipe supports fine-tuning models from the families listed under Features: GPT 3.5, Mistral, and Llama 2. Fine-tuned models can be hosted on-platform or downloaded as weights.

Documentation

Running Locally

  1. Install PostgreSQL.
  2. Install Node.js 20 (earlier versions will very likely work but aren't tested).
  3. Install pnpm: npm i -g pnpm
  4. Clone this repository: git clone https://github.com/openpipe/openpipe
  5. Install the dependencies: cd openpipe && pnpm install
  6. Create a .env file (cd app && cp .env.example .env) and enter your OPENAI_API_KEY.
  7. If you just installed Postgres and wish to use the default DATABASE_URL, run the following commands:
psql postgres
CREATE ROLE postgres WITH LOGIN PASSWORD 'postgres';
ALTER ROLE postgres SUPERUSER;
  8. Update DATABASE_URL if necessary to point to your Postgres instance and run pnpm prisma migrate dev in the app directory to create the database.
  9. Create a GitHub OAuth App and set the callback URL to <your local instance>/api/auth/callback/github, e.g. http://localhost:3000/api/auth/callback/github.
  10. Update the GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET values from the GitHub OAuth app (Note: a PR to make auth optional when running locally would be a great contribution!).
  11. To start the app, run pnpm dev in the app directory.
  12. Navigate to http://localhost:3000.

Using Locally
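
Install the openpipe Python client (pip install openpipe; the package name is assumed to match the import below) and use it as a drop-in replacement for the OpenAI SDK, pointing it at your local instance: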

from openpipe import OpenAI

# Drop-in replacement for the OpenAI client: requests go to OpenAI as usual
# and are also logged to your OpenPipe instance.
client = OpenAI(
    api_key="Your API Key",  # your OpenAI API key
    openpipe={
        "api_key": "Your OpenPipe API Key",
        "base_url": "http://localhost:3000/api/v1",  # local OpenPipe instance
    },
)

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 10"}],
    openpipe={
        "tags": {"prompt_id": "counting"},  # arbitrary tags for filtering logs later
        "log_request": True,
    },
)

print(completion.choices[0].message.content)
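
The completion object has the same shape as a normal OpenAI response, and the request itself is logged to your OpenPipe instance. The tags you attach (prompt_id here) are what the built-in log filters can match on later, so choose values you will want to query by.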

Testing Locally

  1. Copy your .env file to .env.test.
  2. Update the DATABASE_URL to have a different database name than your development one.
  3. Run DATABASE_URL=[your new database url] pnpm prisma migrate dev --skip-seed --skip-generate
  4. Run pnpm test
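
For example, if your development DATABASE_URL ends in /openpipe, the test copy might end in /openpipe-test (both database names here are illustrative). The point is that the test suite migrates and runs against a separate database, so it cannot touch your development data.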
