Code examples for some popular machine learning algorithms, using the TensorFlow library. This tutorial is designed to make it easy to dive into TensorFlow through examples. Each example includes both a notebook and plain code, with explanations.
TFLearn is a library that provides a simplified interface for TensorFlow, designed to speed up experimentation. It is worth a look: it offers many more examples and pre-built operations.
- Nearest Neighbor (notebook) (code)
- Linear Regression (notebook) (code)
- Logistic Regression (notebook) (code)
- Multilayer Perceptron (notebook) (code)
- Convolutional Neural Network (notebook) (code)
- AlexNet (notebook) (code)
- Recurrent Neural Network (LSTM) (notebook) (code)
- Bidirectional Recurrent Neural Network (LSTM) (notebook) (code)
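To give a taste of the first example above, here is a minimal nearest-neighbor classifier sketched in plain NumPy (the actual example expresses the same L1-distance computation as a TensorFlow graph over MNIST; the toy 2-D data below is made up purely for illustration):

```python
import numpy as np

# Toy training set: 2-D points with integer class labels (illustrative only).
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [5.0, 6.0]])
y_train = np.array([0, 0, 1, 1])

def nearest_neighbor(x, X_train, y_train):
    """Return the label of the training point closest to x (L1 distance)."""
    distances = np.sum(np.abs(X_train - x), axis=1)
    return y_train[np.argmin(distances)]

print(nearest_neighbor(np.array([0.2, 0.3]), X_train, y_train))  # -> 0
print(nearest_neighbor(np.array([4.8, 5.5]), X_train, y_train))  # -> 1
```

The TensorFlow version replaces the NumPy reduction with `tf.reduce_sum`/`tf.argmin` ops evaluated in a session, but the algorithm is the same.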
These examples come from the TFLearn examples. They require tflearn to be installed in order to work. TFLearn is a simplified interface for TensorFlow that introduces pre-built layers, ops, training functions, and more.
- Linear Regression. Implement a linear regression using TFLearn.
- Logical Operators. Implement logical operators with TFLearn (also includes a usage of 'merge').
- Weights Persistence. Save and Restore a model.
- Fine-Tuning. Fine-Tune a pre-trained model on a new task.
- Using HDF5. Use HDF5 to handle large datasets.
- Using DASK. Use DASK to handle large datasets.
- Multi-Layer Perceptron. A multi-layer perceptron implementation for the MNIST classification task.
- Convolutional Network (MNIST). A convolutional neural network implementation for classifying the MNIST dataset.
- Convolutional Network (CIFAR-10). A convolutional neural network implementation for classifying the CIFAR-10 dataset.
- Network in Network. A 'Network in Network' implementation for classifying the CIFAR-10 dataset.
- AlexNet. Apply AlexNet to the Oxford Flowers 17 classification task.
- VGGNet. Apply the VGG network to the Oxford Flowers 17 classification task.
- RNN Pixels. Use an RNN (over a sequence of pixels) to classify images.
- Residual Network (MNIST). A residual network with shallow bottlenecks applied to the MNIST classification task.
- Residual Network (CIFAR-10). A residual network with deep bottlenecks applied to the CIFAR-10 classification task.
- Auto Encoder. An auto-encoder applied to MNIST handwritten digits.
- Recurrent Network (LSTM). Apply an LSTM to the IMDB sentiment dataset classification task.
- Bi-Directional LSTM. Apply a bi-directional LSTM to the IMDB sentiment dataset classification task.
- City Name Generation. Generate new US city names using an LSTM network.
- Shakespeare Scripts Generation. Generate new Shakespeare-style scripts using an LSTM network.
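For a sense of what the Linear Regression example above computes, here is a sketch in plain NumPy of fitting `y = W*x + b` by gradient descent on the mean squared error (TFLearn wraps the same idea in its layers and `regression` op; the data and hyperparameters here are illustrative, not taken from the example):

```python
import numpy as np

# Illustrative data generated from y = 2x + 1 (noise-free, for clarity).
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = 2.0 * X + 1.0

W, b = 0.0, 0.0   # parameters to learn
lr = 0.05         # learning rate (illustrative)

for _ in range(2000):
    err = (W * X + b) - Y
    # Gradients of mean squared error with respect to W and b.
    W -= lr * 2.0 * np.mean(err * X)
    b -= lr * 2.0 * np.mean(err)

print(round(W, 2), round(b, 2))  # -> 2.0 1.0
```

In TFLearn the loop, gradients, and updates are generated for you: you declare the model and call `fit`, which is the "speed up experimentation" point made above.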
- tensorflow
- numpy
- matplotlib
- cuda (to run examples on GPU)
- tflearn (if using tflearn examples)
For more details about TensorFlow installation, you can check Setup_TensorFlow.md
Some examples require the MNIST dataset for training and testing. Don't worry: this dataset is downloaded automatically when running the examples (via input_data.py). MNIST is a database of handwritten digits, with 60,000 training examples and 10,000 test examples. (Website: http://yann.lecun.com/exdb/mnist/)
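The MNIST loaders used by these examples typically return labels one-hot encoded (a length-10 vector with a 1 at the digit's index). As an illustration, that encoding can be sketched in NumPy as follows (the function name is my own, not part of input_data.py):

```python
import numpy as np

def one_hot(labels, num_classes=10):
    """Convert integer digit labels to one-hot rows (MNIST has 10 classes)."""
    out = np.zeros((len(labels), num_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

print(one_hot(np.array([3, 0])))
# Row 0 has a 1 in column 3; row 1 has a 1 in column 0.
```

One-hot targets are what the softmax/cross-entropy losses in the classification examples expect.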