
Adding a devcontainer with Redis and an Ollama server #217

Merged
merged 12 commits into main from feature/dev-container
Oct 7, 2024

Conversation

@ProKil (Member) commented Oct 6, 2024

This PR introduces an experimental dev container that lets developers easily set up a clean environment with a Redis server and an LLM server running locally, with a minimal memory footprint (<4GB) so it can run on laptops and Codespaces.

📑 Description

  1. .devcontainer/devcontainer.json: a specification of a container with Python, Redis, and Ollama support. It automatically installs poetry and sotopia, and serves Llama 3.2 1B on start.
  2. An example of using custom models, which can also serve as a sanity check for the devcontainer.
  3. Fix the agenerate API to use model_name for reformatting.
  4. Updated the contributor documentation.
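As a rough illustration, a devcontainer specification along these lines might look like the sketch below. This is a hypothetical example, not the PR's actual file: the base image, the start command, and the `.devcontainer/setup.sh` helper are all assumptions.

```jsonc
{
  // Hypothetical sketch of a devcontainer with Python, Redis, and Ollama.
  "name": "sotopia-dev",
  "image": "mcr.microsoft.com/devcontainers/python:3.11",
  // Assumed helper script: install Redis, Ollama, poetry, and sotopia.
  "postCreateCommand": "bash .devcontainer/setup.sh",
  // Serve the small Llama 3.2 1B model on start to stay under ~4GB of memory.
  "postStartCommand": "redis-server --daemonize yes && (ollama serve &) && ollama pull llama3.2:1b",
  // Default ports for Redis and Ollama.
  "forwardPorts": [6379, 11434]
}
```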

Out of scope:

  1. Using the devcontainer in CI. This would be more of a breaking change and could be a separate PR.
  2. Pre-loading sotopia data into Redis. The technical implementation needs more discussion.
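A custom-model sanity check along these lines could query the local Ollama server directly over its documented REST API. The helper names and prompt below are illustrative, not the PR's actual example script:

```python
# Sanity check for the devcontainer: call the local Ollama server's
# /api/generate endpoint for the preloaded llama3.2:1b model.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt: str, model: str = "llama3.2:1b") -> dict:
    """Build a non-streaming generate request in Ollama's JSON format."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# generate("Say hello in one word.")  # requires the devcontainer's Ollama server
```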

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed
  • Branch name follows type/descript (e.g. feature/add-llm-agents)
  • Ready for code review

ℹ Additional Information

codecov bot commented Oct 6, 2024

Codecov Report

Attention: Patch coverage is 75.00000% with 2 lines in your changes missing coverage. Please review.

Project coverage is 71.49%. Comparing base (7f5be99) to head (ec17cd2).
Report is 1 commit behind head on main.

Files with missing lines          | Patch % | Lines
sotopia/generation_utils/generate.py | 75.00% | 2 Missing ⚠️
@@            Coverage Diff             @@
##             main     #217      +/-   ##
==========================================
- Coverage   71.82%   71.49%   -0.33%     
==========================================
  Files          55       55              
  Lines        2875     3021     +146     
==========================================
+ Hits         2065     2160      +95     
- Misses        810      861      +51     
Files with missing lines          | Coverage Δ
sotopia/generation_utils/generate.py | 62.50% <75.00%> (+1.34%) ⬆️

@ProKil ProKil requested a review from XuhuiZhou October 7, 2024 00:07
Member:
why remove this?

examples/generation_api/custom_model.py (resolved)
@@ -777,7 +765,7 @@ def process_history(
async def agenerate_init_profile(
model_name: str,
basic_info: dict[str, str],
bad_output_process_model: str = DEFAULT_BAD_OUTPUT_PROCESS_MODEL,
Member:

why remove the default?

Member Author:

To use a custom model without OpenAI keys, the original approach doesn't work. Now the generating model itself is used as the default reformat model, but the user can still specify a different model for reformatting.

Member:

Would this break existing scripts that do not specify a reformatting model? Can we still have a default one?

Member Author:

Yeah, the default one is the model used to generate the output in the first place.

@ProKil ProKil merged commit 71d6199 into main Oct 7, 2024
10 checks passed
@ProKil ProKil deleted the feature/dev-container branch October 7, 2024 16:50
2 participants