| addition_rnn | Implementation of sequence-to-sequence learning for performing addition of two numbers (as strings). | 
| babi_memnn | Trains a memory network on the bAbI dataset for reading comprehension. | 
| babi_rnn | Trains a two-branch recurrent network on the bAbI dataset for reading comprehension. | 
| cifar10_cnn | Trains a simple deep CNN on the CIFAR10 small images dataset. | 
| cifar10_densenet | Trains a DenseNet-40-12 on the CIFAR10 small images dataset. | 
| conv_lstm | Demonstrates the use of a convolutional LSTM network. | 
| deep_dream | Deep Dreams in Keras. | 
| imdb_bidirectional_lstm | Trains a Bidirectional LSTM on the IMDB sentiment classification task. | 
| imdb_cnn | Demonstrates the use of Convolution1D for text classification. | 
| imdb_cnn_lstm | Trains a convolutional stack followed by a recurrent stack network on the IMDB sentiment classification task. | 
| imdb_fasttext | Trains a FastText model on the IMDB sentiment classification task. | 
| imdb_lstm | Trains an LSTM on the IMDB sentiment classification task. | 
| lstm_text_generation | Generates text from Nietzsche’s writings. | 
| mnist_acgan | Implementation of AC-GAN (Auxiliary Classifier GAN) on the MNIST dataset. | 
| mnist_antirectifier | Demonstrates how to write custom layers for Keras. | 
| mnist_cnn | Trains a simple convnet on the MNIST dataset. | 
| mnist_irnn | Reproduction of the IRNN experiment with pixel-by-pixel sequential MNIST in “A Simple Way to Initialize Recurrent Networks of Rectified Linear Units” by Le et al. | 
| mnist_mlp | Trains a simple deep multi-layer perceptron on the MNIST dataset. | 
| mnist_hierarchical_rnn | Trains a Hierarchical RNN (HRNN) to classify MNIST digits. | 
| mnist_tfrecord | MNIST dataset with TFRecords, the standard TensorFlow data format. | 
| mnist_transfer_cnn | Transfer learning toy example. | 
| neural_style_transfer | Neural style transfer (generating an image with the same “content” as a base image, but with the “style” of a different picture). | 
| reuters_mlp | Trains and evaluates a simple MLP on the Reuters newswire topic classification task. | 
| stateful_lstm | Demonstrates how to use stateful RNNs to model long sequences efficiently. | 
| variational_autoencoder | Demonstrates how to build a variational autoencoder. | 
| variational_autoencoder_deconv | Demonstrates how to build a variational autoencoder with Keras using deconvolution layers. |
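
Most of the scripts above follow the same basic pattern: load or generate data, define a model, compile it, and call `fit`. A minimal sketch of that pattern (not one of the listed scripts; it uses synthetic stand-in data and the `tensorflow.keras` namespace, which may differ from the Keras version these examples target):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic stand-in data; the real examples load datasets such as MNIST or IMDB.
x_train = np.random.random((128, 784)).astype("float32")
y_train = keras.utils.to_categorical(np.random.randint(10, size=(128,)), 10)

# A small MLP in the style of mnist_mlp: stacked Dense layers with dropout.
model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(784,)),
    layers.Dropout(0.2),
    layers.Dense(10, activation="softmax"),
])

model.compile(loss="categorical_crossentropy",
              optimizer="adam",
              metrics=["accuracy"])

# One short epoch on the synthetic data, just to exercise the training loop.
model.fit(x_train, y_train, batch_size=32, epochs=1, verbose=0)
```

Each listed example varies this skeleton: the recurrent scripts swap in `LSTM` layers, the CNN scripts use convolutional stacks, and the generative scripts (GANs, VAEs) replace the single `fit` call with custom training logic.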