```python
import asyncio

from langchain_ollama import OllamaLLM
from langchain_prefect.plugins import RecordLLMCalls

llm = OllamaLLM(model="qwen2.5:latest")


async def record_call_using_LLM_agenerate():
    """Call the local LLM asynchronously so RecordLLMCalls can record it."""
    await llm.agenerate(
        [
            "What would be a good name for a company that makes colorful socks?",
            "What would be a good name for a company that sells carbonated water?",
        ]
    )


with RecordLLMCalls():
    asyncio.run(record_call_using_LLM_agenerate())
```
I tried to use a local LLM with the snippet above. The code fails when `RecordLLMCalls` tries to patch the `generate` method, because of assumptions about the inheritance hierarchy.
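To illustrate the failure mode, here is a minimal sketch of the difference between patching a method on a fixed, assumed base class and patching whichever class in the instance's MRO actually defines it. The class names (`BaseLLM`, `LocalLLM`) and the `patch_generate` helper are hypothetical stand-ins, not the actual `langchain_prefect` implementation:

```python
import inspect


# Hypothetical stand-ins for the real class hierarchy; the actual
# langchain classes differ.
class BaseLLM:
    def generate(self, prompts):
        return [f"response to {p}" for p in prompts]


class LocalLLM(BaseLLM):
    pass


def patch_generate(llm_instance, wrapper):
    """Patch `generate` on whichever class in the MRO defines it,
    instead of assuming a specific base class."""
    for cls in inspect.getmro(type(llm_instance)):
        if "generate" in cls.__dict__:
            original = cls.__dict__["generate"]
            setattr(cls, "generate", wrapper(original))
            return cls
    raise AttributeError("no class in the MRO defines `generate`")


# A recording wrapper, analogous in spirit to what RecordLLMCalls does.
calls = []


def record(fn):
    def inner(self, prompts):
        calls.append(prompts)  # record the prompts before delegating
        return fn(self, prompts)

    return inner


llm = LocalLLM()
patched_cls = patch_generate(llm, record)  # finds BaseLLM via the MRO
llm.generate(["hello"])
```

Walking the MRO this way avoids hard-coding which ancestor defines `generate`, which is the kind of assumption the traceback suggests is breaking here.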