This first PyDP example features animals who use the library to aggregate their data (how many carrots each has eaten) before reporting it to their owner. More about it here.
A restaurant owner who shares business statistics with visitors and potential clients uses the PyDP library to track how many visitors enter the restaurant, and how much time and money they spend there, while preserving each visitor's privacy. More about it here.
This example uses the famous Titanic dataset to find out what sorts of people were more likely to survive, computing the demographics of the passengers with differentially private statistical methods. More about it here.
In this example, two copies of a database are created that differ by exactly one record. This demonstrates the general principle all differentially private algorithms rely on to protect users from membership inference attacks (MIA). More about it here.
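The idea of neighboring databases can be sketched in plain Python. This is a minimal illustration, not the PyDP example itself: the toy data and function names are made up, and Laplace noise is sampled by hand via inverse-transform sampling rather than through PyDP's API.

```python
import math
import random

# Hypothetical toy data (illustrative only): two databases that differ
# by exactly one record -- the "neighboring databases" of the DP definition.
db = [34, 45, 29, 51, 60]
neighbor = db + [42]

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of Laplace(0, scale).
    u = rng.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_count(data, epsilon, rng):
    # A counting query has sensitivity 1, so the Laplace scale is 1/epsilon.
    return len(data) + laplace_noise(1 / epsilon, rng)

rng = random.Random(0)
# Exact counts always differ by exactly 1, exposing the extra record.
print(len(db), len(neighbor))
# Noisy counts overlap between the two databases, hiding membership.
print(private_count(db, 1.0, rng), private_count(neighbor, 1.0, rng))
```

An attacker who sees only the noisy count cannot reliably tell which of the two databases produced it, which is exactly the guarantee that defeats a membership inference attack.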
This demo shows why adding noise is required to make data private. The Laplace distribution makes it easy to satisfy ε-differential privacy: for a query with sensitivity 1, set the scale parameter b to 1/ε. Hence Laplace noise is used to make the data differentially private. More about it here.
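The b = 1/ε relationship above can be checked empirically. The sketch below is an assumption-laden illustration (not the demo's code): it samples Laplace(0, b) by inverse-transform sampling and verifies that the sample mean is near 0 and the sample variance is near 2b², the known variance of the Laplace distribution.

```python
import math
import random

def laplace_sample(b, rng):
    """Draw one sample from Laplace(0, b) via inverse-transform sampling."""
    u = rng.random() - 0.5
    return -b * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

epsilon = 0.5
b = 1 / epsilon  # scale for a sensitivity-1 query at privacy level epsilon

rng = random.Random(42)
samples = [laplace_sample(b, rng) for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# mean should be close to 0, var close to 2 * b**2 (= 8 for epsilon = 0.5)
print(round(mean, 2), round(var, 2))
```

Smaller ε means a larger scale b, hence more noise and stronger privacy; that trade-off is the core of the ε-differential-privacy guarantee.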
This demo compares the results of scikit-learn's and PyDP's Naive Bayes implementations on various datasets. More about it here and here.