[ENH] Notebook and Template For Global Forecasting API (#6699)
Closes #6575 and #6684. The notebook originally comes from #6551; some of the discussion is copied below.

---

fkiraly:

Great! FYI @benHeid, have you seen and reviewed this?

I mainly have comments about the notebook:

* could you separate this into another PR? The notebook is great, but some iteration is needed on its location and integration with the other notebooks. I think we can merge the forecasters already and don't need to delay.

Some minor comments on the notebook content:

* there are a lot of printouts that confuse the reader. These should be silenced so the reader can focus on the didactic content.
* the markdown text is nice, I would just format it so the lines are not too long, and I would also use a shorter telegram style, like on ppt slides.

Regarding location, I would actually add the content to notebook 01c, which has some content already. There is some minor confusion about terminology: the 01c notebook also uses the term "global", but in a different way.

* in the M5 paper, what the 01c notebook does is called "cross-learning"
* the "global" in the current 01d is more of a pre-training on other instances

In any case, we need to disambiguate the terminology and perhaps adopt clearer distinctions here. @benHeid, what do you suggest on how we handle the terminology clash between 01c and 01d? And should this go in the same notebook, so the "multiple instances" cases can be explained easily?

_Originally posted by @fkiraly in #6551 (review)_

---

shlok191:

@fkiraly, @Xinyu-Wu-0000, yes, I have some minor input! Both the `01d_forecasting_global_forecast.ipynb` and `01c_forecasting_hierarchical_global.ipynb` notebooks use the term "global learning". I agree that instead of "global learning" we can use "pre-training" and "cross-learning" as replacements. Here is how I'd differentiate them:

- To the best of my knowledge, pre-trained models do not require the time series in the training dataset to be correlated with each other.
- Referencing this [paper on the M5 competition](https://www.sciencedirect.com/science/article/pii/S0169207021001874), I believe "cross-learning" is the term used for training models on time series that have a strong correlation.

I think M5 was referenced in 01c, so it might be good to use "cross-learning" there instead of "global learning"!

_Originally posted by @shlok191 in #6551 (comment)_
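For orientation, below is a minimal sketch of the "pre-training on other instances" workflow discussed above: fit a forecaster on a panel of several series, then forecast a new, unseen instance. The forecaster name (`PytorchForecastingNBeats`), its import path, and passing the new series via `y` in `predict` are assumptions based on the global forecasting API from #6551; see the merged notebook for the exact, supported calls.

```python
# Sketch only: names and call pattern are assumptions, not verbatim from the PR.
import numpy as np
import pandas as pd

# assumed import path for one of the pytorch-forecasting adapters from #6551
from sktime.forecasting.pytorchforecasting import PytorchForecastingNBeats


def make_panel(instances, length):
    """Build a toy panel (MultiIndex: instance, time) in sktime's panel format."""
    idx = pd.MultiIndex.from_product(
        [instances, pd.period_range("2020-01", periods=length, freq="M")],
        names=["instance", "time"],
    )
    return pd.DataFrame({"y": np.random.rand(len(idx))}, index=idx)


y_train = make_panel(["series_a", "series_b", "series_c"], 48)  # pre-training data
y_new = make_panel(["series_new"], 48)  # unseen instance, not used in fit

forecaster = PytorchForecastingNBeats()  # any global-capable forecaster
forecaster.fit(y_train, fh=[1, 2, 3])  # pre-train across multiple instances

# global forecasting: predict the next steps of the new instance
# (assumed signature: predict accepts the new series via the y argument)
y_pred = forecaster.predict(fh=[1, 2, 3], y=y_new)
```

The point of the sketch is the distinction raised in the discussion: 01c-style "cross-learning" fits and predicts on the same set of correlated series, while the 01d-style "pre-training" workflow above applies a fitted forecaster to instances it never saw during `fit`.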