
Race condition in case of multiple event loops #611

Open
@bcmills

Description

It appears that the alru_cache decorator is using an unsynchronized OrderedDict instance for its cache:

self.__cache: OrderedDict[Hashable, _CacheItem[_R]] = OrderedDict()

If I am not mistaken, that will lead to a data race if the decorated function is called simultaneously from different asyncio event loops running on different threads, which seems to be a realistic (if rare) situation (compare python/cpython#93462 (comment)).
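
A minimal sketch of that scenario, assuming the usual "from async_lru import alru_cache" import (the coroutine and helper names below are purely illustrative): two threads each run their own event loop via asyncio.run() and call the same decorated coroutine, so both loops mutate the one shared OrderedDict with no synchronization. Being a data race, it will not necessarily fail deterministically on any given run.

    import asyncio
    import threading

    from async_lru import alru_cache


    @alru_cache(maxsize=128)
    async def fetch(key: int) -> int:
        # Stand-in for real async work; the cache bookkeeping around this
        # call is what the two loops race on.
        await asyncio.sleep(0)
        return key * 2


    async def hammer() -> None:
        await asyncio.gather(*(fetch(i) for i in range(1000)))


    def run_in_fresh_loop() -> None:
        # asyncio.run() creates a new event loop for this thread, so each
        # thread's loop touches the single shared cache concurrently.
        asyncio.run(hammer())


    threads = [threading.Thread(target=run_in_fresh_loop) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()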

If I understand correctly, other asyncio wrappers generally bind the event loop during initialization and fail explicitly if called from a different event loop. Unfortunately, since the alru_cache decorator generally runs at module init time, there is no running loop on the thread to bind to at that point.
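
For reference, a rough sketch of that bind-and-check pattern (this is not async-lru's code; the class and attribute names are made up for illustration). Binding lazily on first use, once a running loop exists, is one way such a wrapper can sidestep the module-init problem:

    import asyncio
    from typing import Optional


    class _LoopBoundCache:
        """Illustrative only: bind to the first loop that uses the cache,
        then refuse calls coming from any other loop."""

        def __init__(self) -> None:
            self._loop: Optional[asyncio.AbstractEventLoop] = None

        def check_loop(self) -> None:
            loop = asyncio.get_running_loop()
            if self._loop is None:
                # Lazy binding: at decoration (module init) time there is
                # no running loop, so capture it on first use instead.
                self._loop = loop
            elif self._loop is not loop:
                raise RuntimeError("cache is bound to a different event loop")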

I don't have a clear idea of how this problem could be fixed; I just wanted to point it out as a (significant and subtle) risk for users of this module.
