Adds model, temperature and maxToken as configurable parameters to LLM client #2370

Triggered via pull request on January 23, 2025 at 12:01
Status: Success
Total duration: 29s
Artifacts: none
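
The pull request under test, per its title, exposes model, temperature and maxToken as configurable parameters on the LLM client. Purely as an illustrative sketch (the diff itself is not shown on this run page), such parameters might be surfaced through a configuration block like the one below; the layout, key nesting and values are assumptions, not the PR's actual code:

```yaml
# Illustration only: key names mirror the PR title, everything else is assumed.
llm:
  model: "example-model-id"   # hypothetical model identifier
  temperature: 0.7            # sampling temperature
  maxToken: 2048              # maximum number of tokens to generate
```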

link-checker.yml

on: pull_request
test / htmlproofer (19s)
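
The run page does not include the workflow file itself. As a rough sketch, a link-checker workflow matching this summary might look like the following, assuming "test / htmlproofer" reflects a workflow named test with a job htmlproofer (it could equally be a reusable-workflow call). The checkout and Ruby setup steps, the html-proofer install, and the ./_site path are assumptions, not the repository's actual link-checker.yml:

```yaml
# Sketch only: the trigger and job naming follow the run summary above;
# every step below is assumed, not taken from the real link-checker.yml.
name: test

on: pull_request

jobs:
  htmlproofer:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Ruby toolchain for the html-proofer gem (version is an assumption).
      - uses: ruby/setup-ruby@v1
        with:
          ruby-version: "3.3"

      # Install html-proofer and scan the built site for broken links;
      # ./_site is a typical static-site output directory, not confirmed here.
      - run: gem install html-proofer
      - run: htmlproofer ./_site
```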

Annotations

1 warning
test / htmlproofer: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
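
The warning is informational: the ubuntu-latest label is being migrated to the ubuntu-24.04 image. If the image version matters for this job, the runner can be pinned explicitly instead of tracking ubuntu-latest; a minimal sketch, assuming the job id matches the one above:

```yaml
jobs:
  htmlproofer:
    # Pin the runner image explicitly rather than tracking ubuntu-latest;
    # use ubuntu-22.04 to stay on the previous image, or ubuntu-24.04 to
    # adopt the new one deliberately.
    runs-on: ubuntu-24.04
```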