

ppo-algorithm

Here are 23 public repositories matching this topic...

Comparative research platform for Deep Reinforcement Learning and heuristic controllers in autonomous racing. Benchmarks DRL (PPO) agents against deterministic baselines in Unity MiniKart, with full reproducibility, human-like evaluation, and performance logs.

  • Updated Sep 4, 2025
  • C#
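
A minimal sketch of the benchmark idea behind this project: evaluating a policy (learned or heuristic) over repeated episodes and logging mean return. Gymnasium's CartPole stands in for the Unity MiniKart track, and the heuristic controller here is hypothetical, not the repository's baseline.

```python
# Evaluate a deterministic heuristic baseline the same way a trained PPO
# agent would be evaluated, assuming a Gymnasium-style environment.
import gymnasium as gym
import numpy as np

def heuristic_policy(obs):
    # Hypothetical deterministic baseline: push in the direction the pole leans.
    return int(obs[2] > 0.0)

def evaluate(policy, env, episodes=20):
    returns = []
    for _ in range(episodes):
        obs, _ = env.reset()
        done, total = False, 0.0
        while not done:
            action = policy(obs)
            obs, reward, terminated, truncated, _ = env.step(action)
            total += reward
            done = terminated or truncated
        returns.append(total)
    return np.mean(returns), np.std(returns)

env = gym.make("CartPole-v1")
mean_ret, std_ret = evaluate(heuristic_policy, env)
print(f"heuristic baseline: {mean_ret:.1f} +/- {std_ret:.1f}")
```

The same `evaluate` loop can then be run with the PPO agent's action function, so learned and deterministic controllers are compared under identical conditions.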

This repository implements a Proximal Policy Optimization (PPO) agent that learns to play Super Mario Bros using TensorFlow/Keras and OpenAI Gym. Features CNNs for vision, Actor-Critic architecture, and parallel environments. Train your own Mario master or run a pre-trained one!

  • Updated Jun 1, 2025
  • PureBasic
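
A minimal sketch of the kind of CNN actor-critic described above, in TensorFlow/Keras. The 84x84x4 stacked-frame input, layer sizes, and action count are illustrative assumptions, not values taken from the repository.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_actor_critic(num_actions, input_shape=(84, 84, 4)):
    # Convolutional trunk processes stacked grayscale game frames.
    inputs = layers.Input(shape=input_shape)
    x = layers.Conv2D(32, 8, strides=4, activation="relu")(inputs)
    x = layers.Conv2D(64, 4, strides=2, activation="relu")(x)
    x = layers.Conv2D(64, 3, strides=1, activation="relu")(x)
    x = layers.Flatten()(x)
    x = layers.Dense(512, activation="relu")(x)
    # Two heads share the trunk: a policy (actor) over discrete actions
    # and a scalar state-value estimate (critic), as is standard for PPO.
    policy_logits = layers.Dense(num_actions, name="policy")(x)
    value = layers.Dense(1, name="value")(x)
    return tf.keras.Model(inputs=inputs, outputs=[policy_logits, value])

model = build_actor_critic(num_actions=7)  # 7 matches SIMPLE_MOVEMENT in gym-super-mario-bros
model.summary()
```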

Reinforcement learning–based controller for balancing an inverted pendulum using Proximal Policy Optimization (PPO). Supports configurable mass, length, and gravity settings (Earth, lunar, microgravity) with automated training logs, reward visualization, and performance analysis.

  • Updated Aug 13, 2025
  • Python
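
A minimal sketch of the configurable-gravity idea, using Gymnasium's `Pendulum-v1` (which exposes a `g` keyword argument) and Stable-Baselines3 PPO as stand-ins for the repository's own pendulum model and training code.

```python
import gymnasium as gym
from stable_baselines3 import PPO

# Illustrative gravity settings for Earth, the Moon, and microgravity.
GRAVITY = {"earth": 9.81, "moon": 1.62, "micro": 1e-3}

for name, g in GRAVITY.items():
    env = gym.make("Pendulum-v1", g=g)
    model = PPO("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=10_000)  # short run for illustration
    model.save(f"ppo_pendulum_{name}")
```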

Developed an AWS DeepRacer model using Python and the PPO algorithm, leveraging TensorFlow to train and fine-tune a deep reinforcement learning model. Designed a custom reward function and optimized hyperparameters to improve policy learning and navigation performance. Utilized AWS infrastructure for scalable training and deployment.

  • Updated Apr 26, 2025
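
A minimal sketch of an AWS DeepRacer custom reward function of the kind described above, rewarding the car for staying near the center line. The `params` keys used here come from the DeepRacer documentation; the threshold markers and weights are illustrative, not the repository's tuned values.

```python
def reward_function(params):
    track_width = params["track_width"]
    distance_from_center = params["distance_from_center"]
    all_wheels_on_track = params["all_wheels_on_track"]

    if not all_wheels_on_track:
        return 1e-3  # near-zero reward for leaving the track

    # Reward shrinks in steps as the car drifts from the center line.
    markers = [0.1, 0.25, 0.5]
    if distance_from_center <= markers[0] * track_width:
        return 1.0
    if distance_from_center <= markers[1] * track_width:
        return 0.5
    if distance_from_center <= markers[2] * track_width:
        return 0.1
    return 1e-3
```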

This repository explores Reinforcement Learning (RL) through hands-on implementations of key algorithms and environments. It demonstrates how agents learn by interacting with environments, optimizing rewards, and adapting to tasks ranging from Atari games to autonomous driving and custom simulations.

  • Updated Aug 28, 2025
  • Jupyter Notebook
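
Common to all the repositories above is PPO's clipped surrogate objective. A minimal sketch of that loss in plain PyTorch, with illustrative tensor shapes and clip range:

```python
import torch

def ppo_clip_loss(new_log_probs, old_log_probs, advantages, clip_eps=0.2):
    # Probability ratio r_t = pi_new(a|s) / pi_old(a|s), computed in log space.
    ratio = torch.exp(new_log_probs - old_log_probs)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    # PPO maximizes the minimum of the two terms; return the negation so a
    # standard optimizer can minimize it.
    return -torch.min(unclipped, clipped).mean()

# Usage with dummy data:
lp_new = torch.randn(64)
lp_old = lp_new.detach() + 0.05 * torch.randn(64)
adv = torch.randn(64)
print(ppo_clip_loss(lp_new, lp_old, adv))
```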
