The Climate Mind application makes conversations about climate change easier by letting users explore climate issues that speak to their personal values. We aim to inspire users to take action by presenting a range of attractive solutions, consistent with those values, that they can get excited about.
The application currently presents solutions based on the user's personal values (as determined by a questionnaire) and their location (zip code). In the future, we plan to add the user's occupation as an option to personalize the results.
In order to serve users with relevant climate information, our data team has organized climate data into an ontology. Don't let the fancy term overwhelm you; at the end of the day, it is a data structure containing information about the relationships between climate issues, solutions, myths, and other data.
However, this data structure is not easy to work with in its native form. Another repo, climatemind-ontology-processing, does the hard work of converting this data into a graph structure (a NetworkX graph) that is much easier to work with. This graph is packaged into the .gpickle file found in the /output directory and read by the application.
Detailed instructions for processing the ontology can be found below or in the climatemind-ontology-processing repo.
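To give a feel for what the application consumes, here is a minimal sketch of writing and reading a .gpickle with NetworkX and the standard pickle module. The node names and the file name are invented for this example; the real graph is produced by the climatemind-ontology-processing repo and is far larger.

```python
import pickle

import networkx as nx

# Build a tiny stand-in graph (node names are made up for the example).
graph = nx.DiGraph()
graph.add_edge("climate change", "sea level rise", type="causes")

# Serialize the graph to a .gpickle file, then read it back the same
# way an application would.
with open("example.gpickle", "wb") as f:
    pickle.dump(graph, f)

with open("example.gpickle", "rb") as f:
    loaded = pickle.load(f)

print(loaded.number_of_nodes())  # 2
```

Using pickle directly keeps the round trip working across NetworkX versions, since the graph object itself is plain-pickleable.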
In order to use this application you need to:
- Install the project
- Install Docker
- Install the Ontology Processing repo through Pip
- Download the Ontology file and process it to create the .gpickle
- Build the application with Docker
- Launch the application with Docker
More details about each of these steps follow.
To install the code on your local machine, navigate to the desired parent folder via the command line and clone the repo:
git clone https://github.com/ClimateMind/climatemind-backend.git
You will now have access to our backend code.
Install Docker through their website.
Be sure git is installed on your computer before installing the repo
Open up your command prompt/terminal (it doesn't matter which directory you're in) and install the package as follows:
Be sure you have installed all requirements first by running:
python3 -m pip install -r https://raw.githubusercontent.com/ClimateMind/climatemind-ontology-processing/main/requirements.txt
Then, install the package via pip install:
python3 -m pip install git+https://github.com/ClimateMind/climatemind-ontology-processing.git
- Download a fresh copy of the ontology from WebProtégé. Make sure it's in RDF/XML format (check that the downloaded file ends in .owl!).
- Put the .owl file into the PUT_NEW_OWL_FILE_IN_HERE folder
- Using the terminal/command-line, navigate to the climatemind-backend/process_ontology folder.
- Make sure you are using the latest copy of the Ontology-Processing-Repo by running:
python3 -m pip install git+https://github.com/ClimateMind/climatemind-ontology-processing.git --upgrade
- Run the process.py script by executing the following:
python3 process.py
Ensure that you are in the process_ontology folder when you run this or the command will not find the file.
- Check the climatemind-backend/Output folder. If you did this correctly, there should be a .gpickle file.
- You can now run the app and it will automatically use this gpickle file.
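As a quick sanity check that processing succeeded, you can confirm the generated file unpickles cleanly. This is a sketch: the folder name is taken from the step above, and the helper function name is made up.

```python
import pickle
from pathlib import Path


def load_first_gpickle(directory):
    """Unpickle and return the first .gpickle file found in `directory`."""
    for path in sorted(Path(directory).glob("*.gpickle")):
        with open(path, "rb") as f:
            return pickle.load(f)
    raise FileNotFoundError(f"no .gpickle file found in {directory}")


if __name__ == "__main__":
    # "Output" is the folder produced by process.py, per the steps above.
    graph = load_first_gpickle("Output")
    print(type(graph))
```

If this raises FileNotFoundError, re-run process.py from the process_ontology folder and check the Output folder again.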
Before doing what's below, be sure the Docker application is running and the command line working directory is changed to the climatemind-backend path.
Windows users - you may experience problems with building or starting the containers if your system automatically converts Unix line endings ('\n') to DOS line endings ('\r\n'). You can prevent this from happening by running the command below.
git config --global core.autocrlf false
Build the image container to download and install code dependencies needed for running the app:
docker-compose build
SPECIAL NOTE: Whenever the backend repo adds new dependencies to the requirements.txt file, the Docker image will need to be rebuilt by re-running the build command.
Start in the foreground (good for debugging Flask and seeing the logs). You can stop it with [CTRL + C] on OSX, Windows, and Linux.
docker-compose up
Start in the background (best when attaching the Docker instance to the front-end application):
docker-compose up -d
The application should now be running on localhost. You can access it at http://127.0.0.1:5000
SPECIAL NOTE: Sometimes the terminal says 'Running on http://0.0.0.0:5000' and that url does not work in the browser. If this happens, try going to "http://127.0.0.1:5000" instead.
When you're done working, stop the container. Stopping containers this way also removes the containers, networks, volumes, and images created by docker-compose up.
docker-compose down
The Docker lifecycle is to build the image and run it only once. After that, you can stop or start the container. If you are a new developer on the team, you do not need to do this.
Building Docker image
docker build -t "climatemind-backend:0.1" .
Checking the built image
docker images climatemind-backend
Running Docker
docker run -d --name climatemind-backend --publish 5000:5000 climatemind-backend:0.1
Stop the container
docker stop climatemind-backend
Start the container
docker start climatemind-backend
We use pdb to debug and troubleshoot issues. You can run pdb a few ways. First you need to set a breakpoint() in the code where you want to stop and examine the state of variables.
Then run one of the following for scripts:
# For scripts (e.g. process_new_ontology_file.py):
# Note that this is only for scripts that have to be run manually
docker exec -it $CLIMATEMIND_ID python process_new_ontology_file.py
Or the following for the general docker instance:
# Run the docker instance in the background and attach to the docker image
docker-compose up -d
docker attach climatemindproduct_web_1
# Navigate to the frontend directory in a separate terminal
cd climatemind-frontend
# If desired, you can link the local frontend app and test (you can also
# just use Postman and breakpoints will still happen)
# First time only - Install npm
npm install
# Start the local frontend server with npm
npm start
AutoDoc: Our API is currently documented using AutoDoc. This will soon be deprecated and replaced with Swagger.
Swagger: Documentation will be available soon, detailing the API endpoints and how they should be used. While in development, it can be found at http://localhost:5000/swagger.
Now you will have breakpoints in the Docker container that you can interact with. Learn more about what you can do in the pdb documentation.
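For example, a breakpoint() call dropped into the code pauses execution there once you are attached to the container, letting you inspect variables with pdb commands like `p`, `n`, and `c`. The function below is purely hypothetical, to show the pattern:

```python
def rank_solutions(solutions):
    """Hypothetical helper: rank solution names from shortest to longest."""
    ranked = sorted(solutions, key=len)
    # Execution pauses here under pdb; try `p ranked` to inspect the
    # result, `n` to step, and `c` to continue.
    breakpoint()
    return ranked
```

Remember to remove breakpoint() calls before committing your changes.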
Flask-Migrate is used to handle database structure migrations. Whenever the data model changes, you need to manually run flask db migrate -m "Name of migration". This will generate a file under /migrations/versions which should then be checked into Git. Whenever the API starts up, it calls flask db upgrade, which automatically applies any new migrations to the database. No manual scripts or post-release commands are required!
- After running step 3 OPTION B above, find the URL that appears in the terminal and go to it in your internet browser. For example, if "Dash is running on http://127.0.0.1:8050/" appears in the terminal, go to http://127.0.0.1:8050/ in your browser.
- Use the visualization dashboard in your internet browser.
- When done using the dashboard, close the browser and stop the script by pressing [CTRL + C] in the terminal.
The Python code style is formatted using Black. Black is a file formatter that converts your code to follow PEP 8 standards. PEP 8 provides guidelines and best practices for writing Python, and is the standard style used for modern Python libraries.
Install Black by running pip install black. Note: Python 3.6.0+ is required.
cd into the climatemind-backend directory on your computer to run Black commands.
Run Black locally to see which files need formatting using python3 -m black --check ./
Use Black to automatically format your files using python3 -m black ./
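To give a sense of what Black does, compare the commented-out original with the formatted version below (the function itself is made up). Black normalizes quotes, spacing, and line breaks without changing behavior:

```python
# Before Black (one cramped line):
# def add_pair(a,b):return {'total':a+b,'inputs':(a,b)}

# After Black: consistent spacing, double quotes, one statement per line.
def add_pair(a, b):
    return {"total": a + b, "inputs": (a, b)}


print(add_pair(2, 3))  # {'total': 5, 'inputs': (2, 3)}
```

Because Black only changes formatting, running it on your branch never alters what the code does.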