diff --git a/paper/paper.bib b/paper/paper.bib
index 0f9337a..fb039b6 100644
--- a/paper/paper.bib
+++ b/paper/paper.bib
@@ -22,14 +22,13 @@ @article{dirmeier2023simulation
   year={2023}
 }
 
-@article{dirmeier2023ood,
+@article{dirmeier2023uncertainty,
   title={Uncertainty quantification and out-of-distribution detection using surjective normalizing flows},
   author={Dirmeier, Simon and Hong, Ye and Xin, Yanan and Perez-Cruz, Fernando},
-  journal={In preparation},
+  journal={arXiv preprint arXiv:2311.00377},
   year={2023}
 }
-
 @article{hoffman2019neutra,
   title={Neutra-lizing bad geometry in hamiltonian monte carlo using neural transport},
   author={Hoffman, Matthew and Sountsov, Pavel and Dillon, Joshua V and Langmore, Ian and Tran, Dustin and Vasudevan, Srinivas},
@@ -141,3 +140,4 @@ @inproceedings{oliva18transform
   booktitle = {Proceedings of the 35th International Conference on Machine Learning},
   year = {2018}
 }
+
diff --git a/paper/paper.jats b/paper/paper.jats
deleted file mode 100644
index d580077..0000000
--- a/paper/paper.jats
+++ /dev/null
@@ -1,381 +0,0 @@
[381 deleted lines of auto-generated JATS XML omitted: the rendered article front matter (journal metadata, title, author, affiliations, keywords), the Summary, Statement of Need, and Adoption sections, and the rendered reference list, all duplicating the content of paper.md and paper.bib]
diff --git a/paper/paper.md b/paper/paper.md
index 2912690..176dbf5 100644
--- a/paper/paper.md
+++ b/paper/paper.md
@@ -1,5 +1,5 @@
 ---
-title: 'Surjectors: surjective normalizing flows for density estimation'
+title: 'Surjectors: surjection layers for density estimation with normalizing flows'
 tags:
   - Python
   - JAX
@@ -8,7 +8,7 @@ tags:
   - Machine learning
   - Statistics
 authors:
-  - name: Simon Dirmeier^[corresponding author]
+  - name: Simon Dirmeier
     affiliation: "1, 2"
 affiliations:
   - name: Swiss Data Science Center, Zurich, Switzerland
@@ -22,25 +22,25 @@ bibliography: paper.bib
 # Summary
 
 Normalizing flows [NFs, @papamakarios2021normalizing] are tractable neural density estimators which have recently been applied successfully to, e.g.,
-generative modelling [@kingma2018glow,@ping20wave], Bayesian inference [@rezende15flow,@hoffman2019neutra] or simulation-based inference [@papamakarios2019sequential,@dirmeier2023simulation]. `Surjectors` is a Python library in particular
-for *surjective*, i.e., dimensionality-reducing normalizing flows (SNFs, @klein2021funnels). `Surjectors` is based on the libraries JAX, Haiku and Distrax [@jax2018github, @deepmind2020jax] and is fully compatible with them.
-By virtue of being entirely written in JAX [@jax2018github], `Surjectors` naturally supports usage on either CPU, GPU and TPU.
+generative modelling (@kingma2018glow, @ping20wave), Bayesian inference (@rezende15flow, @hoffman2019neutra) or simulation-based inference (@papamakarios2019sequential, @dirmeier2023simulation). `Surjectors` is a Python library specifically
+for *surjective*, i.e., dimensionality-reducing normalizing flows (SNFs, @klein2021funnels). `Surjectors` is based on the libraries JAX, Haiku and Distrax (@jax2018github, @deepmind2020jax) and is fully compatible with them.
+By virtue of being entirely written in JAX [@jax2018github], `Surjectors` naturally supports usage on CPU, GPU or TPU.
 
 # Statement of Need
 
-Real-world data are often lying in a high-dimensional ambient space embedded in a lower-dimensional manifold [@fefferman2016testing] which can complicate estimation of probability densities [@dai2020sliced,@klein2021funnels,@nalisnick2018deep].
+Real-world data often lie on a lower-dimensional manifold embedded in a high-dimensional ambient space [@fefferman2016testing], which can complicate the estimation of probability densities (@dai2020sliced, @klein2021funnels, @nalisnick2018deep).
 As a remedy, neural density estimators using surjective normalizing flows (SNFs) have recently been proposed which reduce the dimensionality of the data while still allowing for exact computation of data likelihoods [@klein2021funnels].
 While several computational libraries exist that implement *bijective* normalizing flows, i.e., flows that are dimensionality-preserving, currently none exist that efficiently implement dimensionality-reducing flows.
 
 `Surjectors` is a normalizing flow library that implements both bijective and surjective normalizing flows. `Surjectors` is lightweight, conceptually simple to understand for users familiar with the JAX ecosystem, and computationally
 efficient due to leveraging the XLA compilation and vectorization from JAX. We additionally make use of several well-established packages within the JAX ecosystem [@jax2018github] and probabilistic deep learning community.
-For composing the conditioning networks that NFs facilitate, `Surjectors` uses the deep learning library Haiku [@haiku2020github]. For training and optimisation, we utilize the gradient transformation library
+For composing the conditioning networks that NFs rely on, `Surjectors` uses the deep learning library Haiku [@haiku2020github]. For training and optimization, we utilize the gradient transformation library
 Optax [@deepmind2020jax]. `Surjectors` leverages Distrax [@deepmind2020jax] and TensorFlow Probability [@dillon2017tensorflow] for probability distributions and several base bijector implementations.
 
 # Adoption
 
 @dirmeier2023simulation have proposed a novel method for simulation-based inference where they make use of autoregressive inference surjections for density estimation and where they
-are using `Surjectors` for their implementations.
+use `Surjectors` for their implementations. @dirmeier2023uncertainty used `Surjectors` for uncertainty quantification and out-of-distribution detection in deep neural network models.
 
 # References
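To make the stack named in the revised Statement of Need concrete — Haiku for the conditioning networks, Distrax for distributions and base bijectors — here is a minimal sketch of a *bijective* masked-coupling flow built from those primitives alone. It deliberately does not use `Surjectors`' own classes; every name, layer size, and hyperparameter below is illustrative and chosen only to show the composition pattern the paper describes.

```python
# Sketch of a bijective coupling flow from Distrax + Haiku primitives.
# NOT Surjectors' API; it only illustrates the stack the paper builds on.
import distrax
import haiku as hk
import jax
import jax.numpy as jnp

DIM = 4  # illustrative event dimensionality


def bijector_fn(params):
    # The conditioner output parameterizes an elementwise affine transform.
    shift, log_scale = jnp.split(params, 2, axis=-1)
    return distrax.ScalarAffine(shift=shift, log_scale=log_scale)


def flow_log_prob(x):
    layers = []
    for i in range(4):
        # Dimensions where the mask is True stay unchanged and feed the
        # conditioner; the remaining ones are transformed. Alternate halves.
        mask = (jnp.arange(DIM) % 2) == (i % 2)
        conditioner = hk.Sequential(
            [hk.nets.MLP([32, 32], activate_final=True), hk.Linear(2 * DIM)]
        )
        layers.append(
            distrax.MaskedCoupling(
                mask=mask, conditioner=conditioner, bijector=bijector_fn
            )
        )
    base = distrax.MultivariateNormalDiag(jnp.zeros(DIM), jnp.ones(DIM))
    flow = distrax.Transformed(base, distrax.Chain(layers))
    return flow.log_prob(x)


log_prob = hk.without_apply_rng(hk.transform(flow_log_prob))
x = jax.random.normal(jax.random.PRNGKey(0), (16, DIM))
params = log_prob.init(jax.random.PRNGKey(1), x)
print(log_prob.apply(params, x).shape)  # (16,)
```

`Surjectors` composes its bijective and surjective layers on top of exactly these Distrax and Haiku abstractions, which is why the paper stresses full compatibility with them.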
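The dimensionality-reducing idea itself is similarly compact. For a slice surjection that keeps d of D dimensions, the data likelihood stays exact and decomposes as log p(x) = log p(z) + log q(x_drop | z), where z collects the kept coordinates and q is a learned conditional density over the dropped ones — the funnel construction of @klein2021funnels. Below is a hand-rolled sketch of that decomposition, trained by maximum likelihood with Optax. Again, this is not `Surjectors`' API (the library ships ready-made surjection layers); all names are illustrative, and a plain Gaussian stands in for the bijective flow one would normally place on the reduced space.

```python
# Hand-rolled slice surjection with an exact likelihood, trained with Optax.
# Illustrative only; Surjectors provides ready-made layers for this.
import distrax
import haiku as hk
import jax
import jax.numpy as jnp
import optax

DIM, LATENT_DIM = 8, 4  # ambient and reduced dimensionality


def log_prob_fn(x):
    # Slice surjection: keep the first LATENT_DIM coordinates, drop the rest.
    z, x_drop = x[..., :LATENT_DIM], x[..., LATENT_DIM:]
    # A learned conditional Gaussian over the dropped coordinates keeps the
    # likelihood exact: log p(x) = log p(z) + log q(x_drop | z).
    out = hk.nets.MLP([32, 2 * (DIM - LATENT_DIM)])(z)
    mean, log_scale = jnp.split(out, 2, axis=-1)
    lp_drop = distrax.MultivariateNormalDiag(
        mean, jnp.exp(log_scale)
    ).log_prob(x_drop)
    # Stand-in density on the reduced space; a bijective flow would go here.
    lp_z = distrax.MultivariateNormalDiag(
        jnp.zeros(LATENT_DIM), jnp.ones(LATENT_DIM)
    ).log_prob(z)
    return lp_z + lp_drop


log_prob = hk.without_apply_rng(hk.transform(log_prob_fn))


def loss_fn(params, batch):
    # Negative log-likelihood, i.e., maximum-likelihood training.
    return -jnp.mean(log_prob.apply(params, batch))


x = jax.random.normal(jax.random.PRNGKey(0), (512, DIM))
params = log_prob.init(jax.random.PRNGKey(1), x)
optimizer = optax.adam(1e-3)
opt_state = optimizer.init(params)


@jax.jit
def step(params, opt_state, batch):
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    updates, opt_state = optimizer.update(grads, opt_state)
    return optax.apply_updates(params, updates), opt_state, loss


for _ in range(200):
    params, opt_state, loss = step(params, opt_state, x)
```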