fail fast with large graphs #11547

Open
@dcherian

Description

upstream issue: pydata/xarray#9802

The more data I process, the bigger the graph becomes, to the point where the graph is so huge (31 GB at my maximum) that .compute() fails with an "error to serialize" error in msgpack.

Could dask fail faster and tell the user that something is clearly wrong when they've constructed a graph like this? Some way to find out what these large objects are would also be useful (layer names?).
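Not an existing dask diagnostic, but as a rough sketch of what "finding the large objects" could look like: walk the collection's high-level graph and report the pickled size of each named layer. `__dask_graph__()` and `HighLevelGraph.layers` are real dask APIs; `report_large_layers` and the size threshold are hypothetical names chosen for illustration.

```python
import pickle

import dask.array as da


def report_large_layers(collection, threshold_bytes=1_000_000):
    """Print any graph layers whose pickled size exceeds a threshold.

    Hypothetical helper: pickled size is only a rough proxy for what
    msgpack would have to serialize, but it maps big payloads back to
    layer names.
    """
    graph = collection.__dask_graph__()  # HighLevelGraph of named layers
    for name, layer in graph.layers.items():
        try:
            size = len(pickle.dumps(layer))
        except Exception:
            continue  # some layer types may not pickle directly
        if size > threshold_bytes:
            print(f"layer {name!r}: ~{size / 1e6:.1f} MB pickled")


# Example usage: a small array, so nothing is flagged here; real
# offenders would be layers embedding large NumPy arrays or closures.
x = da.ones((2_000, 2_000), chunks=(100, 100)).sum()
report_large_layers(x, threshold_bytes=100_000)
```

A per-layer report like this, run before graph serialization, is one plausible shape for the requested fail-fast check: dask could error out (or warn) with the offending layer names instead of dying deep inside msgpack.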
