
Pytorch GAIL VAIL AIRL VAIRL EAIRL SQIL Implementation

Python 67 12 Updated May 25, 2021

Implementation of GateLoop Transformer in Pytorch and Jax

Python 1 Updated Mar 11, 2025

A concise but complete full-attention transformer with a set of promising experimental features from various papers

Python 1 Updated Mar 11, 2025

Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT

Python 1 Updated Mar 11, 2025

Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of RQ Transformer, proposed in the paper "Autoregressive Image Generation using Residual Quantization"

Python 1 Updated Mar 11, 2025

Implementation of Memorizing Transformers (ICLR 2022), attention net augmented with indexing and retrieval of memories using approximate nearest neighbors, in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of Mega, the Single-head Attention with Multi-headed EMA architecture that currently holds SOTA on Long Range Arena

Python 1 Updated Mar 11, 2025

Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways

Python 1 Updated Mar 11, 2025

Implementation of Multistream Transformers in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of Feedback Transformer in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of Linformer for Pytorch

Python 1 Updated Mar 11, 2025

A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering

Python 1 Updated Mar 11, 2025

Implementation of Fast Transformer in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of gMLP, an all-MLP replacement for Transformers, in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of Hourglass Transformer, in Pytorch, from Google and OpenAI

Python 1 Updated Mar 11, 2025

Implementation of Metaformer, but in an autoregressive manner

Python 1 Updated Mar 11, 2025

Implementation of Nyström Self-attention, from the paper Nyströmformer

Python 1 Updated Mar 11, 2025

Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch

Python 1 Updated Mar 11, 2025

Implementation of a Transformer using ReLA (Rectified Linear Attention) from https://arxiv.org/abs/2104.07012

Python 1 Updated Mar 11, 2025

Bible_2_U_ViY_Family

Python 3 1 Updated Jun 3, 2024