
#adam

  1. adam

    A command-line interface for compiling Gms2 projects

    v0.11.1 2.3K #compiler #gms2 #game-maker #project #vm #command-line-interface #command-line-tool #macos #generated-artifact
  2. trustformers-optim

    Optimizers for TrustformeRS

    v0.1.0-alpha.1 #learning-rate #gradient-descent #decay #memory-optimization #momentum #transformer-models #adam #hyper-parameters #memory-efficient #sgd
  3. ghostflow-optim

    Optimizers for GhostFlow ML framework

    v1.0.0 #optimization #machine-learning #adam #sgd
  4. tensorrs

A lightweight machine learning library in Rust

    v0.3.2 #machine-learning #matrix #tensor #neural-network #linalg #adam #optim
  5. dendritic

    Iterative Optimization Library

    v2.2.0 #automatic-differentiation #machine-learning #model #sgd #pre-processor #adam #multidimensional-array #multi-dimensional-array #logistic
  6. ruvector-attention-wasm

    WASM bindings for ruvector-attention

    v0.1.0 #multi-head-attention #wasm-bindings #adam #hyperbolic #training #learning-rate #decay #dot-product #moe #warmup
  7. torsh-optim

    Optimization algorithms for ToRSh with PyTorch-compatible API

    v0.1.0-alpha.2 #gradient-descent #deep-learning #torsh #adam #decay #momentum #sgd #optimization-algorithm #machine-learning #pytorch
  8. optirs-core

    OptiRS core optimization algorithms and utilities

    v0.1.0 #optimization #adam #sgd #adamw #rmsprop
  9. stochastic_optimizers

Generic implementations of gradient-based stochastic optimization algorithms

    v0.3.0 #optimization #adam #machine-learning #gradient
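Most of the crates above (trustformers-optim, ghostflow-optim, torsh-optim, optirs-core, stochastic_optimizers) implement the Adam optimizer in some form. As a reference point for comparing them, here is a minimal standalone sketch of the classic Adam update rule (first/second moment estimates with bias correction); the struct and field names are illustrative, not taken from any of the listed crates, and the hyperparameter defaults follow the common convention.

```rust
/// Minimal Adam optimizer sketch: not any crate's actual API.
struct Adam {
    lr: f64,
    beta1: f64,  // decay rate for the first-moment (mean) estimate
    beta2: f64,  // decay rate for the second-moment (variance) estimate
    eps: f64,    // small constant for numerical stability
    t: u32,      // step counter, used for bias correction
    m: Vec<f64>, // first-moment estimate per parameter
    v: Vec<f64>, // second-moment estimate per parameter
}

impl Adam {
    fn new(n_params: usize, lr: f64) -> Self {
        Adam {
            lr,
            beta1: 0.9,
            beta2: 0.999,
            eps: 1e-8,
            t: 0,
            m: vec![0.0; n_params],
            v: vec![0.0; n_params],
        }
    }

    /// One optimization step: update `params` in place given `grads`.
    fn step(&mut self, params: &mut [f64], grads: &[f64]) {
        self.t += 1;
        let bc1 = 1.0 - self.beta1.powi(self.t as i32);
        let bc2 = 1.0 - self.beta2.powi(self.t as i32);
        for i in 0..params.len() {
            let g = grads[i];
            self.m[i] = self.beta1 * self.m[i] + (1.0 - self.beta1) * g;
            self.v[i] = self.beta2 * self.v[i] + (1.0 - self.beta2) * g * g;
            let m_hat = self.m[i] / bc1; // bias-corrected moments
            let v_hat = self.v[i] / bc2;
            params[i] -= self.lr * m_hat / (v_hat.sqrt() + self.eps);
        }
    }
}

fn main() {
    // Toy usage: minimize f(x) = x^2, whose gradient is 2x.
    let mut x = vec![5.0];
    let mut opt = Adam::new(1, 0.1);
    for _ in 0..500 {
        let g = vec![2.0 * x[0]];
        opt.step(&mut x, &g);
    }
    println!("{:.4}", x[0]);
}
```

The crates differ mainly in what wraps this core loop: PyTorch-style parameter groups (torsh-optim), weight-decay variants such as AdamW (optirs-core), or generic trait-based interfaces (stochastic_optimizers).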