Description: The local_model_name setting in the .json config gets overwritten, preventing a local model from being added in the llm2ssh setup. This appears to be caused by local_model_name not being initialized in the from_dict method of the Config class.
Steps to Reproduce:
Install from pip
Configure the local URI, set default_model to "local", and set a local_model_name.
Observe that the local_model_name configuration gets overwritten.
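For reference, a config reproducing this might look like the fragment below. The exact key names and URI are assumptions based on the fields described above, not taken from the llm2ssh docs:

```json
{
  "default_model": "local",
  "local_uri": "http://localhost:11434",
  "local_model_name": "llama3.1"
}
```

After a reload, local_model_name no longer holds the configured value.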
Actual Behavior: The local_model_name configuration is overwritten, causing the local model setup to fail.
Proposed Solution:
Modify the from_dict method of the Config class in config.py to include the local_model_name field. The updated method might look like this:
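Since the actual Config class isn't shown here, the following is a minimal sketch of the proposed fix. The surrounding fields (default_model, local_uri) and their defaults are assumptions for illustration; the point is reading local_model_name in from_dict so it survives a config reload:

```python
from dataclasses import dataclass


@dataclass
class Config:
    # Illustrative fields; the real llm2ssh Config may differ.
    default_model: str = "gpt-4"
    local_uri: str = ""
    local_model_name: str = ""

    @classmethod
    def from_dict(cls, data: dict) -> "Config":
        config = cls()
        config.default_model = data.get("default_model", config.default_model)
        config.local_uri = data.get("local_uri", config.local_uri)
        # Fix: also read local_model_name, so the value from the
        # .json config is no longer dropped and then overwritten
        # with the default on the next save.
        config.local_model_name = data.get(
            "local_model_name", config.local_model_name
        )
        return config
```

With this change, round-tripping a config dict that sets local_model_name preserves the value instead of resetting it.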
Environment:
OS: macOS
Python Version: 3.11.2
Package Version: llm2ssh 0.3.7
Local LLM: Ollama running Llama3.1