MemGPT sends all previous messages (chat history) to OpenAI every time #2322
Description
I'm invoking MemGPT agents with the following code:

```python
agent_state = client.load_agent(agent_config=get_agent_config(agent_name), project=project)
response = client.user_message(agent_id=agent_state.get("id"), message=message)
```
This `user_message` method (on `memgpt.client.client.LocalClient`) calls the `._step` method, which accumulates all previous messages and sends them to OpenAI on every call.

I want to send at most the previous 3-4 messages, because sending the full history is very costly: my input token count goes up to 13k-14k for gpt-4o. Is there a way to summarize messages early, e.g. whenever the token count exceeds 4k or 5k?
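For illustration, the trimming being asked for would look something like the sketch below. This is a hypothetical helper, not a MemGPT API (MemGPT manages its own message buffer internally); it only shows the "keep the system prompt plus the last N turns" shape of the request:

```python
from typing import Dict, List


def trim_history(messages: List[Dict[str, str]], keep_last: int = 4) -> List[Dict[str, str]]:
    """Keep any system messages plus only the last `keep_last` turns.

    Hypothetical client-side helper for illustration only; MemGPT does not
    expose this, which is what this issue is asking about.
    """
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-keep_last:]


history = [{"role": "system", "content": "persona"}] + [
    {"role": "user", "content": f"msg {i}"} for i in range(10)
]
trimmed = trim_history(history, keep_last=3)
print(len(trimmed))  # 1 system message + last 3 turns = 4
```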
I'm using pymemgpt==0.3.17.
One thing I noticed: in the `step()` method, summarization is triggered by the following code block:
```python
if current_total_tokens > MESSAGE_SUMMARY_WARNING_FRAC * int(self.agent_state.llm_config.context_window):
    printd(
        f"{CLI_WARNING_PREFIX}last response total_tokens ({current_total_tokens}) > {MESSAGE_SUMMARY_WARNING_FRAC * int(self.agent_state.llm_config.context_window)}"
    )
    # Only deliver the alert if we haven't already (this period)
    if not self.agent_alerted_about_memory_pressure:
        active_memory_warning = True
        self.agent_alerted_about_memory_pressure = True  # it's up to the outer loop to handle this
```
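This check explains why a 13k-14k prompt never triggers summarization. Assuming `MESSAGE_SUMMARY_WARNING_FRAC` defaults to 0.75 (worth verifying in the `memgpt.constants` of your installed version) and gpt-4o's 128,000-token context window:

```python
# Hedged arithmetic sketch: 0.75 and 128_000 are assumptions, not values
# read from this installation of MemGPT.
MESSAGE_SUMMARY_WARNING_FRAC = 0.75
context_window = 128_000  # gpt-4o

threshold = MESSAGE_SUMMARY_WARNING_FRAC * int(context_window)
print(threshold)  # 96000.0 -- far above a 13k-14k prompt
```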
Can I also change the `MESSAGE_SUMMARY_WARNING_FRAC` variable from my own code somehow (given that I'm using MemGPT as a pip package, I can't edit its source files directly)?
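One common workaround for constants in a pip-installed package is monkey-patching, but it only takes effect if you patch the attribute in the module that actually reads the constant: patching `memgpt.constants` alone does nothing if the agent code did `from memgpt.constants import MESSAGE_SUMMARY_WARNING_FRAC`, because that statement copies the value into the importing module's namespace. The module names below are inline stand-ins built purely to demonstrate the pitfall; in real code the targets would be `memgpt.constants` and whichever MemGPT module uses the constant:

```python
import sys
import types

# Stand-in modules (hypothetical names) so the pitfall can be shown
# without installing MemGPT.
constants = types.ModuleType("fake_constants")
constants.MESSAGE_SUMMARY_WARNING_FRAC = 0.75
sys.modules["fake_constants"] = constants

agent = types.ModuleType("fake_agent")
# Simulates `from fake_constants import MESSAGE_SUMMARY_WARNING_FRAC`:
# the agent module now holds its own copy of the value.
agent.MESSAGE_SUMMARY_WARNING_FRAC = constants.MESSAGE_SUMMARY_WARNING_FRAC
sys.modules["fake_agent"] = agent

# Patching only the constants module does NOT change the agent's copy:
constants.MESSAGE_SUMMARY_WARNING_FRAC = 0.3
print(agent.MESSAGE_SUMMARY_WARNING_FRAC)  # still 0.75

# Patch the name in the module that actually reads it:
agent.MESSAGE_SUMMARY_WARNING_FRAC = 0.3
print(agent.MESSAGE_SUMMARY_WARNING_FRAC)  # now 0.3
```

Whether this works in practice depends on how MemGPT imports the constant internally, so it is a sketch of the pattern rather than a confirmed fix.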