SLM

Small Language Models

What is this?

This repo contains a from-scratch implementation of a transformer-based language model. It supports training language models on both next-token and previous-token prediction. More features and upgrades are planned.

Current features

  • Training and inference scripts for forward (next token prediction) and reverse (previous token prediction) GPT2-like language models.
  • Dataloaders for the TinyShakespeare and TinyStories datasets.
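The forward/reverse distinction above amounts to how training targets are built. A minimal sketch of the idea, assuming PyTorch tensors of token ids (the helper name `make_batch` is hypothetical, not part of this repo):

```python
import torch

def make_batch(tokens: torch.Tensor, block_size: int, direction: str = "forward"):
    # Hypothetical helper illustrating forward vs. reverse targets.
    # "forward": predict the next token; "reverse": predict the previous
    # token, which is equivalent to next-token prediction on the flipped
    # sequence.
    if direction == "reverse":
        tokens = tokens.flip(0)
    x = tokens[:block_size]           # model input
    y = tokens[1:block_size + 1]      # targets, shifted by one position
    return x, y

seq = torch.arange(10)                      # toy "token" sequence 0..9
x_f, y_f = make_batch(seq, 4, "forward")    # x=[0,1,2,3], y=[1,2,3,4]
x_r, y_r = make_batch(seq, 4, "reverse")    # x=[9,8,7,6], y=[8,7,6,5]
```

Under this framing, a reverse (previous-token) model is trained with the same cross-entropy objective as a forward GPT2-like model, just on reversed sequences.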

Sources

Inspired by and adapted from Andrej Karpathy's video series.
