SARSA, Q-Learning, Expected SARSA, SARSA(λ) and Double Q-learning Implementation and Analysis
Updated Aug 19, 2019 - Python
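The headline repository implements Double Q-learning among other tabular methods. As a minimal sketch of the tabular Double Q-learning update (van Hasselt, 2010) — not code from any repository listed here; all names (`Q1`, `Q2`, `double_q_update`, etc.) are illustrative:

```python
import numpy as np

def double_q_update(Q1, Q2, s, a, r, s_next, alpha=0.1, gamma=0.99, rng=None):
    """One Double Q-learning step on tabular Q-tables Q1 and Q2.

    A coin flip picks which table to update; the greedy action is
    selected with the updated table but evaluated with the other one.
    This decoupling of action selection and evaluation is what reduces
    the maximization bias of ordinary Q-learning.
    """
    rng = rng or np.random.default_rng()
    if rng.random() < 0.5:
        a_star = int(np.argmax(Q1[s_next]))       # select with Q1
        target = r + gamma * Q2[s_next, a_star]   # evaluate with Q2
        Q1[s, a] += alpha * (target - Q1[s, a])
    else:
        a_star = int(np.argmax(Q2[s_next]))       # select with Q2
        target = r + gamma * Q1[s_next, a_star]   # evaluate with Q1
        Q2[s, a] += alpha * (target - Q2[s, a])
```

In a full agent, actions would typically be chosen epsilon-greedily with respect to `Q1 + Q2`, and this update called once per environment transition.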
Code for our AAMAS 2020 paper: "A Story of Two Streams: Reinforcement Learning Models from Human Behavior and Neuropsychiatry".
A reinforcement learning agent that performs overtaking maneuvers using a Double DQN with CNNs, taking images as input; built with TensorFlow.
Implementations of various RL and Deep RL algorithms in TensorFlow, PyTorch and Keras.
Path Planning with Reinforcement Learning algorithms in an unknown environment
Reversi game with multiple reinforcement learning algorithms.
Environment-related performance differences between Deep Q-Learning and Deep Double Q-Learning.
Reinforcement Learning experiments, comparing performance of Q-learning and Double Q-learning algorithms.
Solving CartPole using Distributional RL
Deliverables relating to the Reinforcement Learning University Unit
Implement several deep reinforcement learning algorithms on one of the Atari 2600 games, Space Invaders.
A reinforcement learning framework for the game of Nim.
Reinforcement Learning: modification of Q-learning through the use of Dyna-Q learning and Double Q-learning.
This repository contains all of the Reinforcement Learning-related projects I've worked on. The projects are part of the graduate course at the University of Tehran.
Python script to balance a pendulum from OpenAI Gym using Q-Learning and Double Q-Learning.
With underflow, create traffic-light clusters that interact to regulate traffic flow.
Understanding several problems in RL and how to solve them.
Reinforcement learning algorithms
PyTorch implementation of Randomized Ensembled Double Q-learning (REDQ).
Slide presentation reviewing advances in reinforcement learning