Implementation of SARSA Semi-Gradient Method on the Mountain Car Open AI Environment.
Updated Dec 8, 2022 · Python
My own research in reinforcement learning using openai-gym.
Deep reinforcement learning applied to the OpenAI MountainCar environment
An implementation of the paper "Reinforcement learning with a bilinear Q function" on the Mountain Car problem.
Double Deep Q-Network (DDQN) implementation in OpenAI Gym's "Mountain Car" environment
This repository contains code for deep deducing, solving the classic control problems.
Q-learning, SARSA, and Expected SARSA to solve OpenAI's gym.mountain_car environment
Reinforcement learning algorithm implementation for "Artificial Intelligence" course project, La Sapienza, Rome, Italy, 2018
My programs during CS747 (Foundations of Intelligent and Learning Agents) Autumn 2021-22
Comparing VPG, TRPO, and PPO from the policy gradient family
This repo is for playing with reinforcement learning algorithms. I am either using openai gym or ViZDoom as an environment.
Implementing reinforcement learning algorithms using TensorFlow and Keras in OpenAI Gym
Deep RL toy example based on gym package with several methods
A car is on a one-dimensional track, positioned between two "mountains". The goal is to drive up the mountain on the right; however, the car's engine is not strong enough to scale the mountain in a single pass. Therefore, the only way to succeed is to drive back and forth to build up momentum.
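The back-and-forth strategy above can be seen in a minimal sketch using the classic Mountain Car dynamics (as in Sutton & Barto and Gym's MountainCar-v0), with a simple energy-pumping heuristic that always pushes in the direction of motion; the starting position of -0.5 and the 300-step budget are illustrative assumptions:

```python
import math

def step(position, velocity, action):
    # Classic Mountain Car dynamics (Sutton & Barto; same as Gym's MountainCar-v0).
    # action: 0 = push left, 1 = no push, 2 = push right.
    velocity += 0.001 * (action - 1) - 0.0025 * math.cos(3 * position)
    velocity = min(max(velocity, -0.07), 0.07)
    position += velocity
    position = min(max(position, -1.2), 0.6)
    if position == -1.2 and velocity < 0:
        velocity = 0.0  # inelastic collision with the left wall
    return position, velocity

# Energy-pumping heuristic: accelerate in the direction of the current velocity.
# Direct pushing (force 0.001) cannot beat gravity (amplitude 0.0025), but
# pushing along the motion adds energy every step, so the swings grow until
# the car crests the right hill (goal at position >= 0.5).
position, velocity = -0.5, 0.0
for t in range(300):
    action = 2 if velocity >= 0 else 0
    position, velocity = step(position, velocity, action)
    if position >= 0.5:  # goal reached
        break
print(t + 1, round(position, 3))
```

This heuristic reaches the goal well inside Gym's 200-step episode limit, which is why it is a common sanity-check baseline for learned policies on this task.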
Inverse Reinforcement Learning Algorithm implementation with Python
This repo implements a Deep Q-Network (DQN) to solve the MountainCar-v0 environment (discrete version) of the Gymnasium library, using Python 3.8 and PyTorch 2.0.1 with a custom reward function for faster convergence.
Reinforcement learning algorithms to solve OpenAI gym environments