
Commit

ci: pre-commit autoupdate [pre-commit.ci] (#1100)
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
pre-commit-ci[bot] authored Oct 22, 2024
1 parent bc8c4bf commit 395cc73
Showing 2 changed files with 3 additions and 3 deletions.
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -8,7 +8,7 @@ default_language_version:
   python: python3.11 # NOTE: sync with .python-version-default
 repos:
   - repo: https://github.com/astral-sh/ruff-pre-commit
-    rev: "v0.6.9"
+    rev: "v0.7.0"
     hooks:
       - id: ruff
         alias: r
@@ -20,7 +20,7 @@ repos:
         verbose: true
         types_or: [python, pyi, jupyter]
   - repo: https://github.com/pre-commit/mirrors-mypy
-    rev: "v1.11.2"
+    rev: "v1.12.1"
     hooks:
       - id: mypy
         args: [--strict]
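This is the routine hook update that pre-commit.ci automates. A minimal way to reproduce the same rev bumps locally, assuming pre-commit is installed and the commands are run from the repository root, is:

```bash
# Bump every hook rev in .pre-commit-config.yaml to its latest tagged release
pre-commit autoupdate

# Re-run all configured hooks (ruff, mypy, ...) against the whole tree
pre-commit run --all-files
```

Re-running the hooks after the bump is what surfaces any behavior changes introduced by the newer ruff or mypy releases.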
2 changes: 1 addition & 1 deletion README.md
@@ -52,7 +52,7 @@ To start an LLM server locally, use the `openllm serve` command and specify the
 > OpenLLM does not store model weights. A Hugging Face token (HF_TOKEN) is required for gated models.
 > 1. Create your Hugging Face token [here](https://huggingface.co/settings/tokens).
 > 2. Request access to the gated model, such as [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B).
-> 3. Set your token as an environment variable by running:
+> 3. Set your token as an environment variable by running:
 > ```bash
 > export HF_TOKEN=<your token>
 > ```
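For context, the gated-model flow this README hunk describes ends with starting the server via `openllm serve`, as referenced in the hunk header above. A rough sketch of the combined steps, where `<model-name>` is a placeholder for whichever model identifier the installed OpenLLM version accepts:

```bash
# Authenticate against Hugging Face so gated weights can be downloaded
export HF_TOKEN=<your token>

# Start a local OpenLLM server for the chosen model (identifier is a placeholder)
openllm serve <model-name>
```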
