Optimizers


Copyright (c) Meta Platforms, Inc. and affiliates. All rights reserved.

Description

Optimizers is a GitHub repository of PyTorch optimization algorithms. It is designed for external collaboration and development.

It currently includes the following optimizers:

  • Distributed Shampoo

See the CONTRIBUTING file for how to help out.

License

Optimizers is released under the BSD license.

Installation and Dependencies

Install distributed_shampoo with all dependencies:

git clone git@github.com:facebookresearch/optimizers.git
cd optimizers
pip install .

If you also want to try the examples, replace the last line with pip install ".[examples]".
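
A quick way to sanity-check the installation (assuming the commands above completed without errors) is to import the package directly:

python -c "import distributed_shampoo"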

Usage

After installation, basic usage looks like:

import torch
from distributed_shampoo import AdamPreconditionerConfig, DistributedShampoo

model = ...  # Instantiate model

optim = DistributedShampoo(
    model.parameters(),
    lr=1e-3,
    betas=(0.9, 0.999),
    epsilon=1e-8,
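    # Grafting takes the per-parameter update magnitude from Adam while
    # Shampoo provides the preconditioned search direction.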
    grafting_config=AdamPreconditionerConfig(
        beta2=0.999,
        epsilon=1e-8,
    ),
)
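
As with any PyTorch optimizer, the optimizer is then driven by a standard training loop. A minimal sketch follows; the loss function and dataloader below are placeholder assumptions, not part of this repository:

loss_fn = torch.nn.CrossEntropyLoss()

for inputs, targets in dataloader:  # any iterable of (input, target) batches
    optim.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optim.step()  # DistributedShampoo follows the standard torch.optim.Optimizer step API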

For more details, please see the additional documentation, especially the How to Use section.
