- A nonlinear function approximation task solved by a linear model (polynomial regression of increasing degree) and by a simple neural net with 1/2 densely connected hidden layers, illustrating the difference between the two and the capacity of deep neural nets to take advantage of larger datasets (Here is the Notebook).
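A minimal sketch of the comparison, assuming a toy sinusoidal target with noise (the notebook's actual target function and network sizes may differ): the linear model fits polynomial features by least squares, while the neural net learns the same curve with two small dense hidden layers.

```python
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(200)  # nonlinear target + noise

# Linear model: ordinary least squares on polynomial features (degree 5)
coeffs = np.polyfit(x, y, deg=5)
poly_pred = np.polyval(coeffs, x)

# Small densely connected net with two hidden layers
net = keras.Sequential([
    keras.Input(shape=(1,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),
])
net.compile(optimizer="adam", loss="mse")
net.fit(x[:, None], y, epochs=100, verbose=0)
net_pred = net.predict(x[:, None], verbose=0).ravel()
```

With more data and noisier targets, the neural net's advantage grows while the fixed-degree polynomial saturates.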
- Fashion MNIST image classification using a densely connected network and CNNs with 1/2/3 convolutional layers (Here is the Notebook).
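The CNN variant of the notebook can be sketched as below. Random arrays shaped like Fashion-MNIST stand in for the real data here (the notebook itself loads it with `keras.datasets.fashion_mnist.load_data()`), and layer sizes are illustrative.

```python
import numpy as np
from tensorflow import keras

# Stand-in data shaped like Fashion-MNIST (28x28 grayscale, 10 classes)
x_train = np.random.rand(64, 28, 28, 1).astype("float32")
y_train = np.random.randint(0, 10, size=64)

# A 2-convolutional-layer CNN; the notebook compares 1/2/3-layer variants
model = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x_train, y_train, epochs=1, verbose=0)
```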
- Horse-or-human image classification using the Keras `ImageDataGenerator` and the Google Colaboratory platform (Here is the Notebook).
- Classification on the flowers dataset and the famous Caltech-101 dataset using the `fit_generator` and `flow_from_directory()` methods of the `ImageDataGenerator`, illustrating how to streamline CNN model building from a single store of image data with these utility methods (Here is the Notebook).
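The generator-based streaming idea in the two notebooks above can be sketched as follows. To stay self-contained, this example streams augmented batches from an in-memory array with `flow()`; the on-disk equivalent, `flow_from_directory()`, is shown in the comments, and the image sizes and augmentation settings here are illustrative.

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# In the notebooks, images live on disk in one folder per class, e.g.
#   flowers/daisy/*.jpg, flowers/tulip/*.jpg, ...
# and are streamed with:
#   datagen.flow_from_directory("flowers", target_size=(150, 150), batch_size=8)
# Here we stream from an in-memory array using the same generator API.
datagen = ImageDataGenerator(rescale=1.0 / 255,
                             rotation_range=20,
                             horizontal_flip=True)

images = np.random.randint(0, 256, size=(32, 64, 64, 3)).astype("float32")
labels = np.random.randint(0, 5, size=32)
batches = datagen.flow(images, labels, batch_size=8)

# Each draw yields a freshly augmented, rescaled batch
xb, yb = next(batches)
```

The generator can then be passed straight to `model.fit` (in current Keras, `fit` accepts generators directly and `fit_generator` is deprecated).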
- Simple illustration of transfer learning using the CIFAR-10 dataset (Here is the Notebook).
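A hedged sketch of the transfer-learning pattern: freeze a pretrained convolutional base and train only a new classification head. To avoid a network download, `weights=None` is used here; in practice (and presumably in the notebook) you would pass `weights="imagenet"`. The choice of MobileNetV2 and the input size are assumptions of this sketch.

```python
import numpy as np
from tensorflow import keras

# Convolutional base; weights=None avoids downloading pretrained weights,
# whereas real transfer learning would use weights="imagenet"
base = keras.applications.MobileNetV2(input_shape=(96, 96, 3),
                                      include_top=False, weights=None)
base.trainable = False  # freeze the base: only the new head is trained

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10, activation="softmax"),  # CIFAR-10 has 10 classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# CIFAR-10 images are 32x32; they are typically upsampled to the base's
# expected input size (random stand-in data here)
x = np.random.rand(8, 96, 96, 3).astype("float32")
y = np.random.randint(0, 10, size=8)
model.fit(x, y, epochs=1, verbose=0)
```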
- Adding simple object-oriented programming (OOP) principles to your deep learning workflow (Here is the Notebook).
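The OOP idea is to encapsulate model construction, training, and prediction behind one class so experiments become reusable objects. A minimal sketch (the class name, hyperparameters, and toy regression task are all assumptions, not the notebook's exact design):

```python
import numpy as np
from tensorflow import keras

class Regressor:
    """Wrapper class: hides model construction, training, and prediction
    behind a small, reusable interface."""

    def __init__(self, hidden_units=32, learning_rate=1e-3):
        self.model = keras.Sequential([
            keras.Input(shape=(1,)),
            keras.layers.Dense(hidden_units, activation="relu"),
            keras.layers.Dense(1),
        ])
        self.model.compile(optimizer=keras.optimizers.Adam(learning_rate),
                           loss="mse")

    def fit(self, x, y, epochs=10):
        return self.model.fit(x, y, epochs=epochs, verbose=0)

    def predict(self, x):
        return self.model.predict(x, verbose=0)

# Usage: each experiment is just an object with chosen hyperparameters
x = np.linspace(-1, 1, 100)[:, None].astype("float32")
y = x ** 2
reg = Regressor(hidden_units=16)
reg.fit(x, y, epochs=5)
pred = reg.predict(x)
```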
- ResNet on the CIFAR-10 dataset, showing how to use Keras callback classes such as `ModelCheckpoint`, `LearningRateScheduler`, and `ReduceLROnPlateau`. You can also change a single parameter to generate ResNets of various depths. (Here is the Notebook).
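The three callbacks can be wired up as below. A tiny model on random data stands in for the ResNet so the sketch stays self-contained; the schedule and monitored metric are illustrative.

```python
import os
import tempfile
import numpy as np
from tensorflow import keras

x = np.random.rand(64, 8).astype("float32")
y = np.random.randint(0, 2, size=64)

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

ckpt_path = os.path.join(tempfile.mkdtemp(), "best.keras")
callbacks = [
    # Save the model whenever the monitored loss improves
    keras.callbacks.ModelCheckpoint(ckpt_path, monitor="loss",
                                    save_best_only=True),
    # Explicit schedule: halve the learning rate every 2 epochs
    keras.callbacks.LearningRateScheduler(
        lambda epoch, lr: lr * 0.5 if epoch and epoch % 2 == 0 else lr),
    # Back off automatically when the loss plateaus
    keras.callbacks.ReduceLROnPlateau(monitor="loss", factor=0.5, patience=1),
]
history = model.fit(x, y, epochs=4, verbose=0, callbacks=callbacks)
```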
- Automatic text generation (based on simple character vectors) using an LSTM network. Play with the character sequence length, LSTM architecture, and hyperparameters to generate synthetic text in a particular author's style! (Here is the Notebook).
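A compact sketch of character-level generation: slice a corpus into (sequence, next-character) pairs, train an LSTM to predict the next character, then sample. The tiny corpus, sequence length, and use of an `Embedding` layer (rather than explicit one-hot character vectors) are simplifications of this sketch.

```python
import numpy as np
from tensorflow import keras

text = "the quick brown fox jumps over the lazy dog. " * 20  # toy corpus
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}

seq_len = 10
# Slice the corpus into (sequence, next-character) training pairs
encoded = np.array([char_to_idx[c] for c in text])
X = np.stack([encoded[i:i + seq_len] for i in range(len(encoded) - seq_len)])
y = encoded[seq_len:]

model = keras.Sequential([
    keras.Input(shape=(seq_len,)),
    keras.layers.Embedding(len(chars), 16),
    keras.layers.LSTM(32),
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=1, verbose=0)

# Generate: feed a seed sequence, take the most probable next character
# (sampling from the distribution gives more varied text)
seed = X[:1]
probs = model.predict(seed, verbose=0)[0]
next_char = chars[int(np.argmax(probs))]
```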
- Bi-directional LSTM with embedding applied to the IMDB sentiment classification task (Here is the Notebook).
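The model for this task is essentially an `Embedding` layer feeding a `Bidirectional`-wrapped LSTM. Random integer sequences stand in for the real reviews here (the notebook loads them with `keras.datasets.imdb.load_data()`), and the vocabulary and layer sizes are illustrative.

```python
import numpy as np
from tensorflow import keras

vocab_size, maxlen = 1000, 50
# Stand-in for the IMDB reviews: padded sequences of word indices
x_train = np.random.randint(1, vocab_size, size=(64, maxlen))
y_train = np.random.randint(0, 2, size=64)  # positive / negative sentiment

model = keras.Sequential([
    keras.Input(shape=(maxlen,)),
    keras.layers.Embedding(vocab_size, 32),
    keras.layers.Bidirectional(keras.layers.LSTM(32)),  # reads both directions
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, verbose=0)
```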
- Simple demo of building a GAN model from scratch using a one-dimensional algebraic function (Here is the Notebook).
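The core GAN loop on a 1-D function can be sketched as below, using points on the parabola y = x² as the "real" distribution (an assumption of this sketch; the notebook's function, network sizes, and training setup may differ). An explicit `GradientTape` loop is used here for clarity.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

latent_dim = 5

def real_batch(n):
    """Real samples: points (x, x^2) on a 1-D parabola."""
    x = np.random.uniform(-1, 1, size=(n, 1)).astype("float32")
    return np.hstack([x, x ** 2]).astype("float32")

generator = keras.Sequential([
    keras.Input(shape=(latent_dim,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(2),  # outputs a fake (x, y) point
])
discriminator = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1),  # logit for "sample is real"
])

g_opt = keras.optimizers.Adam(1e-3)
d_opt = keras.optimizers.Adam(1e-3)
bce = keras.losses.BinaryCrossentropy(from_logits=True)

for step in range(30):
    noise = tf.random.normal((16, latent_dim))
    real = real_batch(16)

    # Discriminator step: real points labeled 1, generated points labeled 0
    with tf.GradientTape() as tape:
        fake = generator(noise)
        d_real = discriminator(real)
        d_fake = discriminator(fake)
        d_loss = (bce(tf.ones_like(d_real), d_real) +
                  bce(tf.zeros_like(d_fake), d_fake))
    grads = tape.gradient(d_loss, discriminator.trainable_variables)
    d_opt.apply_gradients(zip(grads, discriminator.trainable_variables))

    # Generator step: try to make the discriminator output "real"
    with tf.GradientTape() as tape:
        fake = generator(noise)
        g_loss = bce(tf.ones_like(d_fake), discriminator(fake))
    grads = tape.gradient(g_loss, generator.trainable_variables)
    g_opt.apply_gradients(zip(grads, generator.trainable_variables))

samples = generator(tf.random.normal((8, latent_dim))).numpy()
```

After enough steps, `samples` should cluster near the parabola.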
- Keras scikit-learn wrapper example with 10-fold cross-validation and exhaustive grid search (Here is the Notebook).
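The wrapper class has moved between packages over the years (from `tensorflow.keras.wrappers.scikit_learn.KerasClassifier` to the separate `scikeras` package), so the sketch below reproduces the idea manually with scikit-learn's `KFold` and `ParameterGrid`. The model-building function, the grid, and the use of 3 folds instead of 10 (to keep the sketch fast) are assumptions of this example.

```python
import numpy as np
from sklearn.model_selection import KFold, ParameterGrid
from tensorflow import keras

def build_model(hidden_units):
    """Model factory, as a Keras wrapper / scikeras would expect."""
    model = keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(hidden_units, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

X = np.random.rand(100, 4).astype("float32")
y = np.random.randint(0, 2, size=100)

# Exhaustive grid search: cross-validate each parameter combination
grid = ParameterGrid({"hidden_units": [4, 8]})
results = {}
for params in grid:
    scores = []
    # 3 folds to keep the sketch fast; the notebook uses 10
    kf = KFold(n_splits=3, shuffle=True, random_state=0)
    for train_idx, val_idx in kf.split(X):
        model = build_model(**params)
        model.fit(X[train_idx], y[train_idx], epochs=2, verbose=0)
        _, acc = model.evaluate(X[val_idx], y[val_idx], verbose=0)
        scores.append(acc)
    results[params["hidden_units"]] = float(np.mean(scores))

best = max(results, key=results.get)
```

With the wrapper, the same loop collapses to `GridSearchCV(KerasClassifier(build_model), param_grid, cv=10).fit(X, y)`.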