update bart examples to reflect changes in last version (#759)
aloctavodia authored Dec 26, 2024
1 parent 0067c6e commit b239132
Showing 8 changed files with 455 additions and 522 deletions.
199 changes: 113 additions & 86 deletions examples/bart/bart_categorical_hawks.ipynb

Large diffs are not rendered by default.

18 changes: 11 additions & 7 deletions examples/bart/bart_categorical_hawks.myst.md
@@ -14,7 +14,7 @@ myst:
pip_dependencies: pymc-bart
---

+++ {"editable": true, "slideshow": {"slide_type": ""}}
+++ {"slideshow": {"slide_type": ""}}

(bart_categorical)=
# Categorical regression
@@ -136,11 +136,11 @@ It may be that some of the input variables are not informative for classifying b

```{code-cell} ipython3
---
editable: true
slideshow:
slide_type: ''
---
pmb.plot_variable_importance(idata, μ, x_0, method="VI", random_seed=RANDOM_SEED);
vi_results = pmb.compute_variable_importance(idata, μ, x_0, method="VI", random_seed=RANDOM_SEED)
pmb.plot_variable_importance(vi_results);
```

It can be observed that with the covariables `Hallux`, `Culmen`, and `Wing` we achieve the same R$^2$ value that we obtained with all the covariables; that is, the last two covariables contribute less than the other three to the classification. One thing to take into account is that the HDI is quite wide, which gives us less precision in the results; later we will see a way to reduce this.
@@ -152,7 +152,7 @@ It can be observed that with the covariables `Hallux`, `Culmen`, and `Wing` we a
Let's check the behavior of each covariable for each species with `pmb.plot_pdp()`, which shows the marginal effect a covariate has on the predicted variable, while we average over all the other covariates.

```{code-cell} ipython3
pmb.plot_pdp(μ, X=x_0, Y=y_0, grid=(5, 3), figsize=(6, 9));
pmb.plot_pdp(μ, X=x_0, Y=y_0, grid=(5, 3), figsize=(12, 7));
```
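The marginal effect that `plot_pdp` displays can be illustrated with a generic sketch of the partial-dependence idea (this is not pymc-bart's internal implementation, and `predict` here is a hypothetical toy model): fix one covariate at each value of a grid for every observation, and average the predictions over all the other covariates.

```python
import numpy as np


def partial_dependence(predict, X, col, grid):
    """Average prediction when column `col` is fixed at each grid value."""
    curve = []
    for value in grid:
        X_mod = X.copy()
        X_mod[:, col] = value  # fix the covariate of interest for all rows
        curve.append(predict(X_mod).mean())  # average over the other covariates
    return np.array(curve)


# Toy prediction function: responds to column 0 only, ignores column 1
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
predict = lambda X: 2 * X[:, 0] + 0.0 * X[:, 1]

grid = np.linspace(-1, 1, 5)
pd_curve = partial_dependence(predict, X, col=0, grid=grid)
# For this toy model the curve is exactly 2 * grid; for the ignored
# covariate (col=1) the curve would be flat, like `Tail` in the text.
```

A flat partial-dependence curve, as for col 1 here, is the signature of a covariate with little effect on the predicted variable.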

The pdp plot, together with the Variable Importance plot, confirms that `Tail` is the covariable with the smallest effect on the predicted variable. In the Variable Importance plot, `Tail` is the last covariable to be added and does not improve the result; in the pdp plot, `Tail` has the flattest response. For the rest of the covariables in this plot, it is hard to see which of them has a larger effect on the predicted variable, because they have great variability, shown by the wide HDIs; as before, later we will see a way to reduce this variability. Finally, some of the variability depends on the amount of data for each species, which we can see in the `counts` for one of the covariables using Pandas `.describe()` and grouping the data by "Species" with `.groupby("Species")`.
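That groupby-and-describe check can be sketched as follows; a toy DataFrame stands in for the Hawks data here, with the column names and species codes assumed for illustration:

```python
import pandas as pd

# Toy stand-in for the Hawks dataset (columns and species codes assumed)
df = pd.DataFrame(
    {
        "Species": ["CH", "CH", "RT", "RT", "RT", "SS"],
        "Wing": [240.0, 250.0, 380.0, 390.0, 400.0, 190.0],
    }
)

# The `count` column of describe() shows how much data each species has,
# which helps explain why some species show more variability than others
summary = df.groupby("Species")["Wing"].describe()
print(summary["count"])
```

Species with a small `count` will generally show wider uncertainty bands in the plots above.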
@@ -215,11 +215,14 @@ with pm.Model(coords=coords) as model_t:
Now we are going to reproduce the same analyses as before.

```{code-cell} ipython3
pmb.plot_variable_importance(idata_t, μ_t, x_0, method="VI", random_seed=RANDOM_SEED);
vi_results = pmb.compute_variable_importance(
idata_t, μ_t, x_0, method="VI", random_seed=RANDOM_SEED
)
pmb.plot_variable_importance(vi_results);
```

```{code-cell} ipython3
pmb.plot_pdp(μ_t, X=x_0, Y=y_0, grid=(5, 3), figsize=(6, 9));
pmb.plot_pdp(μ_t, X=x_0, Y=y_0, grid=(5, 3), figsize=(12, 7));
```

Comparing these two plots with the previous ones shows a marked reduction in the variance of each one. In the case of `pmb.plot_variable_importance()` there are smaller error bands, with an R$^{2}$ value closer to 1. And for `pmb.plot_pdp()` we can see thinner bands and a reduction in the limits on the y-axis; this reflects the reduction in uncertainty due to fitting the trees separately. Another benefit is that the behavior of each covariable for each of the species is more visible.
@@ -254,7 +257,8 @@ all
```

## Authors
- Authored by [Pablo Garay](https://github.com/PabloGGaray) and [Osvaldo Martin](https://aloctavodia.github.io/) in May, 2024
- Updated by Osvaldo Martin in Dec, 2024

+++

117 changes: 52 additions & 65 deletions examples/bart/bart_heteroscedasticity.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions examples/bart/bart_heteroscedasticity.myst.md
@@ -147,6 +147,7 @@ The fit looks good! In fact, we see that the mean and variance increase as a fun
- Authored by [Juan Orduz](https://juanitorduz.github.io/) in Feb, 2023
- Rerun by Osvaldo Martin in Mar, 2023
- Rerun by Osvaldo Martin in Nov, 2023
- Rerun by Osvaldo Martin in Dec, 2024

+++

450 changes: 198 additions & 252 deletions examples/bart/bart_introduction.ipynb

Large diffs are not rendered by default.

6 changes: 4 additions & 2 deletions examples/bart/bart_introduction.myst.md
@@ -199,14 +199,15 @@ Finally, like with other regression methods, we should be careful that the effec

### Variable importance

As we saw in the previous section, a partial dependence plot can visualize and give us an idea of how much each covariable contributes to the predicted outcome. Moreover, PyMC-BART provides a novel method to assess the importance of each variable in the model. You can see an example in the following figure.

On the x-axis we have the number of covariables and on the y-axis R² (the square of the Pearson correlation coefficient) between the predictions made by the full model (all variables included) and those made by the restricted models, which include only a subset of the variables.

In this example, the most important variable is `hour`, then `temperature`, `humidity`, and finally `workingday`. Notice that the first value of R² is the value of a model that only includes the variable `hour`, the second R² is for a model with two variables, `hour` and `temperature`, and so on. Besides this ranking, we can see that even a model with the single component `hour` is very close to the full model. Moreover, the model with the two components `hour` and `temperature` is on average indistinguishable from the full model. The error bars represent the 94% HDI from the posterior predictive distribution. This means that we should expect a model with only `hour` and `temperature` to have predictive performance similar to that of a model with all four variables, `hour`, `temperature`, `humidity`, and `workingday`.
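The R² on the y-axis can be sketched directly from its definition, the square of the Pearson correlation between full-model and restricted-model predictions; the arrays here are synthetic stand-ins for illustration, not actual BART output:

```python
import numpy as np

rng = np.random.default_rng(42)
pred_full = rng.normal(size=500)  # stand-in for full-model predictions
# A restricted model that closely tracks the full model (small added noise)
pred_restricted = pred_full + rng.normal(scale=0.1, size=500)

# R² as used on the y-axis: square of the Pearson correlation coefficient
r = np.corrcoef(pred_full, pred_restricted)[0, 1]
r_squared = r**2
# A restricted model whose predictions track the full model yields R² near 1
```

This is why a subset of variables whose restricted model reproduces the full model's predictions, like `hour` plus `temperature` above, plots close to R² = 1.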

```{code-cell} ipython3
pmb.plot_variable_importance(idata_bikes, μ, X);
vi_results = pmb.compute_variable_importance(idata_bikes, μ, X)
pmb.plot_variable_importance(vi_results);
```

`plot_variable_importance` is fast because it makes two assumptions:
@@ -405,6 +406,7 @@ This plot helps us understand the reason behind the bad performance on the test
* Juan Orduz added out-of-sample section in Jan, 2023
* Updated by Osvaldo Martin in Mar, 2023
* Updated by Osvaldo Martin in Nov, 2023
* Updated by Osvaldo Martin in Dec, 2024

+++

185 changes: 75 additions & 110 deletions examples/bart/bart_quantile_regression.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions examples/bart/bart_quantile_regression.myst.md
@@ -149,6 +149,7 @@ We can see that when we use a Normal likelihood, and from that fit we compute th
* Authored by Osvaldo Martin in Jan, 2023
* Rerun by Osvaldo Martin in Mar, 2023
* Rerun by Osvaldo Martin in Nov, 2023
* Rerun by Osvaldo Martin in Dec, 2024

+++

