Implementation of Information Bottleneck with Mutual Information Neural Estimation (MINE)
PyTorch implementation of the estimator proposed in the paper "Estimating Differential Entropy under Gaussian Convolutions"
Statistical-learning-based estimation of mutual information; an R package on CRAN.
"Simulations for the paper 'Deep Learning for Channel Coding via Neural Mutual Information Estimation' by Rick Fritschek, Rafael F. Schaefer, and Gerhard Wunder"
An Independence Test based on Data-Driven Tree-Structured Representations.
Implementation and Analysis of Information Bottleneck Theory of Deep Learning
Final project for the Deep Learning course @ UniVR
Python code for the paper "Telescoping Density-Ratio Estimation", NeurIPS 2020
Entropy and Mutual Information Estimation
Implementation of the Mutual Information Neural Estimator (MINE) in TensorFlow 2
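Several of the repositories above implement MINE, which trains a small "statistics network" to maximize the Donsker-Varadhan lower bound on mutual information. As an illustration of the idea (not code from any listed repository; the names `StatisticsNetwork`, `mine_lower_bound`, and `train_mine` are my own), a minimal PyTorch sketch might look like this:

```python
import math
import torch
import torch.nn as nn

class StatisticsNetwork(nn.Module):
    """Small MLP critic T(x, y) for scalar x and y."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, y], dim=1))

def mine_lower_bound(net: StatisticsNetwork, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Donsker-Varadhan bound: E_p(x,y)[T] - log E_p(x)p(y)[exp(T)]."""
    joint = net(x, y)
    # Shuffling y breaks the pairing, giving samples from the product of marginals.
    marginal = net(x, y[torch.randperm(y.size(0))])
    n = marginal.size(0)
    # log-mean-exp computed stably as logsumexp - log(n).
    return joint.mean() - (torch.logsumexp(marginal, dim=0) - math.log(n)).squeeze()

def train_mine(x: torch.Tensor, y: torch.Tensor, steps: int = 500, lr: float = 1e-3) -> float:
    net = StatisticsNetwork()
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        loss = -mine_lower_bound(net, x, y)  # gradient ascent on the bound
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        return mine_lower_bound(net, x, y).item()

# Toy check on a correlated Gaussian pair, where I(X; Y) = -0.5 * log(1 - rho^2) nats.
torch.manual_seed(0)
rho = 0.9
x = torch.randn(2000, 1)
y = rho * x + math.sqrt(1 - rho**2) * torch.randn(2000, 1)
mi_estimate = train_mine(x, y)  # true value is about 0.83 nats
```

This plain DV objective has a biased gradient for the log-partition term; the MINE paper and several of the repositories above correct it with an exponential moving average of the denominator, which this sketch omits for brevity.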