Splitting Steepest Descent for Growing Neural Architectures

This is the official implementation of the following paper, accepted at NeurIPS 2019.

[Paper]

Overview

This code implements our algorithm on the CIFAR-10 and CIFAR-100 datasets, using MobileNet V1 as the backbone. To use the code, simply run

```bash
bash mbv1/train.sh
```
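For readers unfamiliar with the splitting idea, below is a minimal, illustrative sketch of what splitting one neuron of a fully connected layer could look like in PyTorch. It is not the code in this repository: the function name `split_neuron`, the two-`nn.Linear` setup, and the `direction` argument (assumed to be the minimum eigenvector of that neuron's splitting matrix, computed elsewhere) are all assumptions made for illustration.

```python
# Illustrative sketch only -- not the actual implementation in this repo.
# Core idea: the selected neuron is replaced by two off-spring copies whose
# incoming weights are perturbed in opposite directions along a splitting
# direction, while each off-spring inherits half of the outgoing weights, so
# the network output is (approximately) unchanged at the moment of the split.

import torch
import torch.nn as nn


def split_neuron(layer_in: nn.Linear, layer_out: nn.Linear, idx: int,
                 direction: torch.Tensor, step: float = 1e-2):
    """Split neuron `idx` of `layer_in` into two off-spring neurons.

    `direction` is assumed to be a unit-norm splitting direction for that
    neuron (e.g. the minimum eigenvector of its splitting matrix); here it is
    simply passed in as an argument.
    """
    with torch.no_grad():
        w = layer_in.weight.data            # [out_features, in_features]
        b = layer_in.bias.data              # [out_features]
        v = layer_out.weight.data           # [next_out, out_features]

        # Two copies of the selected neuron, perturbed in opposite directions.
        w_plus = w[idx] + step * direction
        w_minus = w[idx] - step * direction

        # Grow layer_in by one neuron: row idx becomes w_plus, w_minus is appended.
        new_w = torch.cat([w, w_minus.unsqueeze(0)], dim=0)
        new_w[idx] = w_plus
        new_b = torch.cat([b, b[idx:idx + 1]], dim=0)

        # Each off-spring takes half of the original outgoing weights.
        new_v = torch.cat([v, 0.5 * v[:, idx:idx + 1]], dim=1)
        new_v[:, idx] = 0.5 * v[:, idx]

    # Build the grown layers and copy the new parameters in.
    grown_in = nn.Linear(layer_in.in_features, layer_in.out_features + 1)
    grown_out = nn.Linear(layer_out.in_features + 1, layer_out.out_features)
    grown_in.weight.data, grown_in.bias.data = new_w, new_b
    grown_out.weight.data = new_v
    grown_out.bias.data = layer_out.bias.data.clone()
    return grown_in, grown_out
```

In the actual training script the splitting step is applied to MobileNet V1 convolutional channels rather than to toy linear layers, and the splitting directions are computed from the splitting matrices as described in the paper.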

Citation

Please cite this paper if you use it in your work:

```bibtex
@article{liu2019splitting,
  title={Splitting Steepest Descent for Growing Neural Architectures},
  author={Liu, Qiang and Wu, Lemeng and Wang, Dilin},
  journal={arXiv preprint arXiv:1910.02366},
  year={2019}
}
```

Firefly Splitting version

Here is our related work, Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks, accepted at NeurIPS 2020. It allows more splitting schemes at much faster speed by approximating the splitting metrics with first-order information. [Link]

Energy-Aware Fast Splitting version

Here is an energy-aware fast splitting version with more benchmarks implemented. [Link]

License

MIT License
