Kelu01/smiles-gen

Autoregressive Transformer Model

This is a simple, easy-to-implement autoregressive Transformer model for sequence-generation tasks such as SMILES string generation. It is implemented entirely in PyTorch with minimal dependencies.

Features

  • Clean and lightweight implementation
  • Supports autoregressive (causal) sequence modeling
  • Includes loss computation and sampling functions
  • Automatically selects device (MPS, CUDA, or CPU)
  • Easy to integrate with any custom vocabulary
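The automatic device selection mentioned above can be sketched as follows (the function name is illustrative, not taken from the repository):

```python
import torch

def select_device() -> torch.device:
    # Prefer Apple's Metal backend, then CUDA, then fall back to CPU.
    if torch.backends.mps.is_available():
        return torch.device("mps")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = select_device()
```

Checking MPS before CUDA matters only on Apple hardware; on other machines the MPS check simply returns False.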

Model Overview

The model uses:

  • Token and positional embeddings
  • A Transformer encoder with causal masking
  • A linear output layer projecting to vocabulary logits
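A minimal sketch of this architecture, assuming illustrative names and default hyperparameters (the actual class in the repository may differ), could look like:

```python
import torch
import torch.nn as nn

class TinyAutoregressiveTransformer(nn.Module):
    """Token + positional embeddings, causal Transformer encoder, vocab projection."""

    def __init__(self, vocab_size: int, d_model: int = 64,
                 nhead: int = 4, num_layers: int = 2, max_len: int = 128):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len) of token ids
        seq_len = x.size(1)
        pos = torch.arange(seq_len, device=x.device)
        h = self.tok_emb(x) + self.pos_emb(pos)
        # The causal mask blocks attention to future positions,
        # which is what makes the encoder autoregressive.
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(x.device)
        h = self.encoder(h, mask=mask)
        return self.out(h)  # (batch, seq_len, vocab_size)
```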

The TransformerModel class wraps this architecture and provides:

  • compute_loss() for training
  • sample() for autoregressive generation
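The exact signatures of compute_loss() and sample() are not shown here, but a common implementation of both, sketched with assumed names and argument orders, is next-token cross-entropy for the loss and token-by-token multinomial sampling for generation:

```python
import torch
import torch.nn.functional as F

def compute_loss(model, tokens):
    # Next-token objective: predict tokens[:, 1:] from tokens[:, :-1].
    inputs, targets = tokens[:, :-1], tokens[:, 1:]
    logits = model(inputs)  # (batch, seq_len - 1, vocab_size)
    return F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))

@torch.no_grad()
def sample(model, start_token, max_len):
    # Grow the sequence one token at a time from the model's
    # next-token distribution.
    seq = torch.tensor([[start_token]])
    for _ in range(max_len - 1):
        logits = model(seq)                    # (1, t, vocab_size)
        probs = F.softmax(logits[:, -1], dim=-1)
        next_tok = torch.multinomial(probs, num_samples=1)
        seq = torch.cat([seq, next_tok], dim=1)
    return seq[0].tolist()
```

Both functions work with any model that maps a (batch, seq_len) tensor of token ids to (batch, seq_len, vocab_size) logits.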

Requirements

  • Python 3.8+
  • PyTorch
  • RDKit (optional, for SMILES visualization)
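Since the model integrates with any custom vocabulary, a minimal character-level SMILES vocabulary (all names here are illustrative) might be built like this:

```python
# Build a character-level vocabulary from a few SMILES strings,
# with special tokens for padding and sequence boundaries.
smiles = ["CCO", "c1ccccc1", "CC(=O)O"]

specials = ["<pad>", "<bos>", "<eos>"]
chars = sorted({ch for s in smiles for ch in s})
stoi = {tok: i for i, tok in enumerate(specials + chars)}
itos = {i: tok for tok, i in stoi.items()}

def encode(s):
    # Wrap the character ids in begin/end-of-sequence markers.
    return [stoi["<bos>"]] + [stoi[ch] for ch in s] + [stoi["<eos>"]]

def decode(ids):
    # Drop special tokens and rejoin the characters.
    return "".join(itos[i] for i in ids if itos[i] not in specials)
```

Note that a character-level split is the simplest option; real SMILES tokenizers often treat multi-character atoms such as Cl and Br as single tokens.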
