Fix limitedParallelism doing dispatches when it has no tasks #3672

Merged · 3 commits into develop from fix-limitedDispatcher-extra-dispatches · Mar 16, 2023

Conversation

dkhalanskyjb (Collaborator) commented:

To support fairness w.r.t. the tasks that want to run on the wrapped dispatcher outside of its view, `LimitedDispatcher` redispatches the work loop periodically. This can happen even when there are no more tasks in `LimitedDispatcher`'s queue. Typically, this behavior is completely benign: after being redispatched, the new runner just checks the queue, sees that there is nothing there, and leaves. However, with closeable dispatchers, this affects correctness, as this redispatch may happen even after all the dispatched tasks have run and the dispatcher has been closed.

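To make the failure mode concrete, here is a rough reproduction sketch, assuming a closeable dispatcher obtained from an executor. The task count is arbitrary (a multiple of the fairness threshold of 16) and the extra dispatch is racy, so this illustrates the shape of the problem rather than reproducing it deterministically:

```kotlin
import kotlinx.coroutines.*
import java.util.concurrent.Executors

// Illustrative only: the stray redispatch is timing-dependent and will not
// fail reliably on any given run.
fun main() = runBlocking {
    val underlying = Executors.newSingleThreadExecutor().asCoroutineDispatcher()
    val view = underlying.limitedParallelism(1)
    // 32 tasks, a multiple of the 16-task fairness threshold, so the fairness
    // counter can hit the redispatch threshold right as the queue drains.
    List(32) { launch(view) { /* cheap work */ } }.joinAll()
    // All dispatched tasks have run; before this fix, a pending fairness
    // redispatch could still target the now-closed underlying dispatcher.
    underlying.close()
}
```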
dkhalanskyjb force-pushed the fix-limitedDispatcher-extra-dispatches branch from d28ccef to 2ac1644 on March 10, 2023, 14:49
dkhalanskyjb (Collaborator, Author) commented:

Testing this is difficult. The problematic behavior only happens when there are no more tasks to run at the moment the fairness dispatch wants to proceed, even though, until then, the worker had a steady supply of tasks and was never deallocated. With the hardcoded 16 this could be arranged, but I don't think it's worth the effort. As for stress tests: for me, it takes several minutes to reproduce the problem.
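One hypothetical shape for such a stress test (iteration count arbitrary; not part of this PR, just a sketch of what the author describes):

```kotlin
import kotlinx.coroutines.*
import java.util.concurrent.Executors

// Hypothetical stress-test shape; before the fix, a failure would surface
// as a dispatch onto an already-closed underlying dispatcher.
fun stressRedispatchAfterClose() {
    repeat(100_000) {
        val underlying = Executors.newSingleThreadExecutor().asCoroutineDispatcher()
        val view = underlying.limitedParallelism(1)
        runBlocking {
            // 16 tasks: a multiple of the fairness threshold, so the counter
            // can overflow exactly as the queue drains.
            List(16) { launch(view) { } }.joinAll()
        }
        underlying.close()
    }
}
```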

dkhalanskyjb requested a review from qwwdfsad on March 10, 2023, 15:15
A review thread was opened on these lines of the diff:

```kotlin
    handleCoroutineException(EmptyCoroutineContext, e)
}
// 16 is our out-of-thin-air constant to emulate fairness. Used in JS dispatchers as well
if (++fairnessCounter >= 16 && dispatcher.isDispatchNeeded(this)) {
```
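For context, a paraphrased sketch of the worker loop this snippet sits in after the fix; the surrounding structure and helper names (`Worker`, `obtainTaskOrDeallocateWorker`) are approximations rather than the verbatim source:

```kotlin
// Paraphrased sketch, not the verbatim kotlinx.coroutines source.
private inner class Worker(private var currentTask: Runnable) : Runnable {
    override fun run() {
        var fairnessCounter = 0
        while (true) {
            try {
                currentTask.run()
            } catch (e: Throwable) {
                handleCoroutineException(EmptyCoroutineContext, e)
            }
            // Essence of the fix: claim the next task *before* considering a
            // fairness redispatch, so a redispatch never happens with an empty
            // queue, and hence never after the last task ran and the
            // dispatcher was closed.
            currentTask = obtainTaskOrDeallocateWorker() ?: return
            // 16 is our out-of-thin-air constant to emulate fairness. Used in JS dispatchers as well
            if (++fairnessCounter >= 16 && dispatcher.isDispatchNeeded(this@LimitedDispatcher)) {
                // Yield to tasks outside the view, keeping the claimed task
                // for when this worker is redispatched.
                dispatcher.dispatch(this@LimitedDispatcher, this)
                return
            }
        }
    }
}
```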
qwwdfsad (Contributor) asked:
Is it correct that this PR addresses the following issue in this snippet:

  • run executes a task (:54)
  • the underlying dispatcher is closed
  • run finishes, is unlucky enough to overflow fairnessCounter, goes to redispatch, and crashes?

dkhalanskyjb (Collaborator, Author) replied:

That's one aspect of it, yes. Consider also the case where an external dispatch or dispatchYield happened (for example, from some callback) and scheduled a task, then waited for a millennium, during which this task and every other one were executed and the dispatcher was closed, and only then proceeded to allocate a worker and run it.
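A hypothetical interleaving for that second scenario, sketched as a timeline in comment form (the call shapes are illustrative, not exact API):

```kotlin
// Thread A: view.dispatch(ctx, task)   // enqueues task, decides that a
//                                      // worker must be started, pauses...
// Thread B: an existing worker drains the queue, running `task` and the rest
// Thread B: underlying.close()         // the closeable dispatcher shuts down
// Thread A: ...resumes, allocates the worker, and dispatches it onto the
//           now-closed underlying dispatcher -> crash
```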

Co-authored-by: Vsevolod Tolstopyatov <qwwdfsad@gmail.com>
qwwdfsad (Contributor) left a comment:
Great job with figuring that one out!

dkhalanskyjb requested a review from qwwdfsad on March 15, 2023, 14:41
dkhalanskyjb merged commit 33b2a9a into develop on March 16, 2023, and deleted the fix-limitedDispatcher-extra-dispatches branch on March 16, 2023, 10:57.