Bug

I have a use case where I need to query a collection that returns about ~1M objects. While my script is running I see memory usage upward of 1 GB. I thought this shouldn't happen, since I am iterating over the cursor and not reading all objects into memory at once.

Current Behavior

```python
async for data in ENGINE.find(Data):
    # do something with data
```

This loads all objects into memory because of the `__aiter__` method in `AIOCursor`.
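Roughly, the pattern I'm describing looks like this (a minimal illustrative sketch, not ODMantic's actual source; the `motor_cursor` and `parse` names are assumptions):

```python
from typing import AsyncIterator, Callable, Generic, List, TypeVar

T = TypeVar("T")


class CachingCursor(Generic[T]):
    """Sketch of a cursor whose __aiter__ keeps every parsed instance alive."""

    def __init__(self, motor_cursor: AsyncIterator[dict], parse: Callable[[dict], T]) -> None:
        self._motor_cursor = motor_cursor  # underlying async driver cursor
        self._parse = parse                # raw document -> model instance
        self._results: List[T] = []        # private cache of every parsed instance

    async def __aiter__(self) -> AsyncIterator[T]:
        # Every parsed instance is appended to the private cache before being
        # yielded, so with ~1M documents the whole result set stays referenced
        # in memory even though the caller only touches one object at a time.
        async for raw_doc in self._motor_cursor:
            instance = self._parse(raw_doc)
            self._results.append(instance)
            yield instance
```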
Expected behavior
Shouldn't `__aiter__` yield each instance without caching it in memory, since I'm iterating over the cursor and not reading all objects into memory at once? For example:
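(Only a sketch of the behavior I'd expect; the `motor_cursor` and `parse` names are the same hypothetical ones as above.)

```python
from typing import AsyncIterator, Callable, Generic, TypeVar

T = TypeVar("T")


class StreamingCursor(Generic[T]):
    """Sketch of a cursor whose __aiter__ streams instances without caching."""

    def __init__(self, motor_cursor: AsyncIterator[dict], parse: Callable[[dict], T]) -> None:
        self._motor_cursor = motor_cursor
        self._parse = parse

    async def __aiter__(self) -> AsyncIterator[T]:
        # Parse and yield one document at a time; nothing is retained on the
        # cursor, so peak memory is bounded by a single document/batch rather
        # than the full result set.
        async for raw_doc in self._motor_cursor:
            yield self._parse(raw_doc)
```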
Environment
python -c "import pydantic.utils; print(pydantic.utils.version_info())"

Additional context
Curious why it needs to cache each instance in a private variable when I'm iterating over the cursor.