BackPACK - a backpropagation package built on top of PyTorch that efficiently computes quantities other than the gradient.
A Flow-based Generative Network for Speech Synthesis
LaNMT: Latent-variable Non-autoregressive Neural Machine Translation with Deterministic Inference
higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual training steps.
PyTorch implementation of Block Neural Autoregressive Flow
Code for the paper "VAE with a VampPrior", J.M. Tomczak & M. Welling
Generative-Flow-based Sequence-to-Sequence Toolkit written in Python.
XLNet: Generalized Autoregressive Pretraining for Language Understanding
Supplementary materials and code for Multi-Turn Beam Search
A PyTorch-based Neural Machine Translation Framework for Research
Implementation of "Von Mises-Fisher Loss for Training Sequence to Sequence Models with Continuous Outputs"
Reference BLEU implementation that auto-downloads test sets and reports a version string to facilitate cross-lab comparisons
Experiments for the Neural Autoregressive Flows paper
PyTorch implementation of Trust Region Policy Optimization
PyTorch implementation of Advantage Actor Critic (A2C), Proximal Policy Optimization (PPO), and the scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation (ACKTR)
A PyTorch implementation of Learning to learn by gradient descent by gradient descent