Via #1130.
When the crawl of a spec fails, the crawler records the error in an `error` property in `ed/index.json` and reuses the previous extracts. In some cases, the failure is transient, e.g., due to a network hiccup. In other cases, the error is more permanent, e.g., because the extraction logic bumps into unexpected markup.
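For reference, a minimal sketch of how the failing specs could be collected from the crawl index. The index shape assumed here (a top-level `results` array of entries with `url`, `title` and `error` properties) is an assumption for illustration, not a description of the actual format:

```typescript
// Sketch only: collect specs whose crawl failed by scanning ed/index.json.
import { readFile } from "node:fs/promises";

interface SpecEntry {
  url: string;
  title?: string;
  error?: string;   // set by the crawler when the crawl of that spec failed
}

async function findFailedSpecs(indexPath = "ed/index.json"): Promise<SpecEntry[]> {
  const index = JSON.parse(await readFile(indexPath, "utf8"));
  const specs: SpecEntry[] = index.results ?? [];
  return specs.filter(spec => !!spec.error);
}
```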
The more permanent errors may go unnoticed for some time, because nothing notifies us about the problem. Code should report these errors in an issue (and ideally close the issue if the problem disappears).
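A rough sketch of what that reporting step could look like, consuming the list produced above and using Octokit's issues API. The "crawl error" label, the issue title format, and the `w3c/webref` target repository are placeholders, not existing conventions of this project:

```typescript
// Sketch: open one issue per failing spec, close issues for specs that recovered.
import { Octokit } from "@octokit/rest";

type FailedSpec = { url: string; title?: string; error?: string };

const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });
const owner = "w3c";     // placeholder
const repo = "webref";   // placeholder

async function reportCrawlErrors(failed: FailedSpec[]): Promise<void> {
  // Issues previously opened by this job, identified by a dedicated label
  const { data: open } = await octokit.rest.issues.listForRepo({
    owner, repo, state: "open", labels: "crawl error", per_page: 100
  });

  // Open an issue for each failing spec that does not have one yet
  for (const spec of failed) {
    const title = `Crawl error for ${spec.title ?? spec.url}`;
    if (!open.some(issue => issue.title === title)) {
      await octokit.rest.issues.create({
        owner, repo, title,
        body: `The crawler could not update extracts for ${spec.url}:\n\n${spec.error}`,
        labels: ["crawl error"]
      });
    }
  }

  // Close issues whose spec no longer reports an error
  const stillFailing = new Set(failed.map(spec => `Crawl error for ${spec.title ?? spec.url}`));
  for (const issue of open) {
    if (!stillFailing.has(issue.title)) {
      await octokit.rest.issues.update({ owner, repo, issue_number: issue.number, state: "closed" });
    }
  }
}
```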
Side note: when the crawler crashes completely, the job fails; no need to handle that case, as GitHub already sends email notifications.