Proximal policy optimization (PPO)

This is an implementation of the Proximal Policy Optimization (PPO) algorithm.
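The exact loss used here lives in the repository's code, but the heart of PPO is the clipped surrogate objective. Below is a minimal NumPy sketch of that objective; the function name and sample values are illustrative, not taken from this repository.

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """Clipped surrogate objective at the core of PPO.

    ratio:     pi_new(a|s) / pi_old(a|s) for the sampled actions
    advantage: advantage estimates for those actions
    eps:       clipping range
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # Elementwise minimum makes the update pessimistic: moving the
    # policy ratio outside [1 - eps, 1 + eps] earns no extra reward.
    return np.minimum(unclipped, clipped).mean()

ratio = np.array([0.8, 1.0, 1.5])
advantage = np.array([1.0, -1.0, 2.0])
print(ppo_clip_objective(ratio, advantage))
```

In practice this objective is maximized (or its negative minimized) with a gradient-based optimizer over minibatches of collected rollouts.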

Usage

Run the following command to start parallelized training:

python experiment.py

You can modify experiment.py to quickly set up different configurations.

Results

MLP Policy