Reinforcement-Learning

This repo contains Jupyter notebook files for the Gymnasium continuous Mountain Car environment and the discrete Taxi environment. Methods used include DQN and tabular Q-learning (Q-table).
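
For reference, a minimal sketch of the tabular Q-learning approach on the discrete Taxi environment might look like the following. The environment id, hyperparameters, and episode count here are illustrative assumptions, not values taken from the notebooks; the DQN approach for Mountain Car replaces the table with a neural network approximator.

```python
# Minimal tabular Q-learning sketch, assuming Gymnasium's standard "Taxi-v3" id.
import numpy as np
import gymnasium as gym

env = gym.make("Taxi-v3")
n_states = env.observation_space.n    # 500 discrete states
n_actions = env.action_space.n        # 6 discrete actions
q_table = np.zeros((n_states, n_actions))

alpha, gamma = 0.1, 0.99              # learning rate, discount factor (illustrative)
epsilon, eps_min, eps_decay = 1.0, 0.05, 0.999

for episode in range(5000):
    state, _ = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action selection
        if np.random.rand() < epsilon:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(q_table[state]))

        next_state, reward, terminated, truncated, _ = env.step(action)
        done = terminated or truncated

        # Q-learning (off-policy TD) update; no bootstrap on terminal states
        td_target = reward + gamma * np.max(q_table[next_state]) * (not terminated)
        q_table[state, action] += alpha * (td_target - q_table[state, action])
        state = next_state

    epsilon = max(eps_min, epsilon * eps_decay)

env.close()
```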
