MK2112/GPTemplate

An entire GPT (Generative Pre-trained Transformer) language model in a Jupyter Notebook, written in PyTorch. Intended as a small starter kit for your next NLP project. Just provide a text file: the model will be trained on this text, line by line, and will aim to imitate it.
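The notebook contains the full model; below is a minimal, self-contained sketch of what such a character-level PyTorch setup typically looks like. The file name `input.txt`, all hyperparameters, and the use of `nn.TransformerEncoder` with a causal mask are illustrative assumptions, not the notebook's actual code, and as a simplification it samples fixed-size character windows rather than batching strictly line by line.

```python
# Minimal sketch (assumptions: character-level tokens, file name "input.txt",
# hyperparameters chosen for illustration -- not the notebook's actual values).
import torch
import torch.nn as nn
import torch.nn.functional as F

# --- Data: read the text file and map each character to an integer id ---
text = open("input.txt", encoding="utf-8").read()
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
data = torch.tensor([stoi[ch] for ch in text], dtype=torch.long)

block_size, batch_size, d_model = 128, 32, 192

def get_batch():
    # Sample random windows; the target is the input shifted by one token.
    ix = torch.randint(len(data) - block_size - 1, (batch_size,))
    x = torch.stack([data[i:i + block_size] for i in ix])
    y = torch.stack([data[i + 1:i + 1 + block_size] for i in ix])
    return x, y

# --- Model: token + position embeddings, causal Transformer blocks, LM head ---
class TinyGPT(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(block_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True, norm_first=True)
        self.blocks = nn.TransformerEncoder(layer, num_layers=4)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, idx):
        T = idx.size(1)
        pos = torch.arange(T, device=idx.device)
        h = self.tok_emb(idx) + self.pos_emb(pos)
        # Upper-triangular -inf mask so each position only attends to the past.
        mask = torch.triu(torch.full((T, T), float("-inf"), device=idx.device), diagonal=1)
        h = self.blocks(h, mask=mask)
        return self.head(h)  # logits over the character vocabulary

model = TinyGPT(len(chars))
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)

for step in range(1000):  # short demo run
    x, y = get_batch()
    logits = model(x)
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 100 == 0:
        print(step, loss.item())
```

After training, text can be generated by repeatedly feeding the model its own last `block_size` predictions and sampling from the output logits; the notebook is the authoritative reference for the actual architecture and training loop.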