Adversarial Attacks - Breaking and defending neural networks

Despite the impressive accomplishments of deep neural networks in recent years, adversarial examples are a stark reminder of their brittleness and vulnerability.

Deep Clustering

Can deep neural networks learn to do clustering? An introduction, survey, and discussion of recent work on deep clustering algorithms.

Introduction to TensorFlow and the Computation Graph

TensorFlow is a very popular and powerful machine learning library from Google. It was developed by the Google Brain team for in-house research and later open sourced in November 2015. It has been widely adopted in research and production and has become one of the most popular libraries for deep learning.
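
The heart of the library is the computation graph: operations are declared as graph nodes first and only executed when the graph is run. A minimal sketch, assuming the TensorFlow 1.x graph-and-session API that was current at the time:

```python
import tensorflow as tf  # assumes TensorFlow 1.x

# Building the graph: nothing is computed yet, we only declare nodes.
a = tf.constant(3.0, name="a")
b = tf.constant(4.0, name="b")
c = tf.add(a, b, name="c")  # c is a symbolic node, not the value 7.0

# Running the graph: a session executes the nodes we ask for.
with tf.Session() as sess:
    print(sess.run(c))  # 7.0
```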

Putting it all together and Classifying the MNIST dataset

Time for a showdown! Let's assemble the layers, bring forward our model solvers, and try to train the CNN we implemented from scratch on the oh-so-popular MNIST dataset to see how well we can do.
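
The overall shape of such a training loop is sketched below: a list of layers, a forward pass through the stack, and a backward pass in reverse order. This is a self-contained toy with a single dense layer and synthetic data standing in for the actual CNN layers and solvers built in this series:

```python
import numpy as np

rng = np.random.default_rng(0)

class Dense:
    # Minimal fully connected layer with a built-in SGD update.
    def __init__(self, n_in, n_out, lr=0.5):
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        self.x = x
        return x @ self.W + self.b

    def backward(self, grad):
        grad_in = grad @ self.W.T           # gradient for the layer below
        self.W -= self.lr * self.x.T @ grad
        self.b -= self.lr * grad.sum(axis=0)
        return grad_in

def softmax_xent(logits, y):
    # Softmax + cross-entropy; returns (mean loss, gradient w.r.t. logits).
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    n = len(y)
    loss = -np.log(p[np.arange(n), y]).mean()
    p[np.arange(n), y] -= 1.0
    return loss, p / n

X = rng.normal(size=(256, 20))      # synthetic stand-in for MNIST images
y = (X[:, 0] > 0).astype(int)       # synthetic labels

layers = [Dense(20, 2)]             # the "network": a single layer here
for epoch in range(100):
    out = X
    for layer in layers:            # forward pass through the stack
        out = layer.forward(out)
    loss, grad = softmax_xent(out, y)
    for layer in reversed(layers):  # backward pass in reverse order
        grad = layer.backward(grad)
print("final loss:", round(loss, 3))
```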

Solving the model - SGD, Momentum and Adaptive Learning Rate

Thanks to active research, we are much better equipped with optimization algorithms than just vanilla Gradient Descent. Let's discuss two further approaches to Gradient Descent - Momentum and Adaptive Learning Rates.
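
To make the differences concrete, here is a minimal NumPy sketch of the three update rules on a single parameter array, with AdaGrad standing in as one common adaptive-learning-rate scheme (the hyperparameter values are illustrative):

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Vanilla gradient descent: step against the gradient.
    return w - lr * grad

def momentum_step(w, grad, v, lr=0.01, mu=0.9):
    # Momentum: accumulate a velocity that smooths successive updates.
    v = mu * v - lr * grad
    return w + v, v

def adagrad_step(w, grad, cache, lr=0.01, eps=1e-8):
    # Adaptive learning rate (AdaGrad): the per-parameter step size
    # shrinks as squared gradients accumulate in the cache.
    cache = cache + grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache
```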

Classification and Loss Evaluation - Softmax and Cross Entropy Loss

Let's dig a little deeper into how we convert the output of our CNN into probabilities - Softmax - and the loss measure that guides our optimization - Cross Entropy.
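
As a preview, a minimal NumPy sketch of both pieces, including the standard max-subtraction trick that keeps the exponentials numerically stable:

```python
import numpy as np

def softmax(logits):
    # Convert raw scores to probabilities; subtracting the row max
    # avoids overflow without changing the result.
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # Mean negative log-likelihood of the true classes.
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), labels]).mean()

logits = np.array([[2.0, 1.0, 0.1]])
probs = softmax(logits)                     # approx. [[0.66, 0.24, 0.10]]
print(cross_entropy(probs, np.array([0])))  # approx. 0.42
```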

Dropout Layer - The unconventional regularization technique

Overfitting has always been the enemy of generalization. Dropout is a very simple yet very effective way to regularize networks by reducing co-adaptation between neurons. Discussion and implementation follow.
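
A minimal sketch of a dropout layer, using the common "inverted dropout" formulation so that activations are rescaled at training time and no change is needed at test time:

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, train=True):
    # Zero each unit with probability p_drop and rescale the survivors,
    # so the expected activation is unchanged.
    if not train:
        return x, None            # at test time dropout is a no-op
    mask = (np.random.rand(*x.shape) >= p_drop) / (1.0 - p_drop)
    return x * mask, mask         # keep the mask for the backward pass

def dropout_backward(dout, mask):
    # Gradients flow only through the units that were kept.
    return dout * mask
```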

BatchNorm Layer - Understanding and eliminating Internal Covariate Shift

Batch Normalization is a recent technique that relaxes the demands on network initialization, allows higher learning rates, and lets us train very deep networks. Very promising! Let's derive the math for the forward and backward passes step by step by hand and implement the BatchNorm layer!
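
As a preview of where the derivation lands, a minimal sketch of the training-time forward pass (the running statistics used at inference time are omitted here):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift
    # with the learnable parameters gamma and beta.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance
    out = gamma * x_hat + beta
    cache = (x_hat, gamma, var, eps)       # saved for the backward pass
    return out, cache
```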

Maxpool Layer - Summarizing the output of Convolution Layer

Pooling layers are an important building block of CNNs. They summarize the activation maps and keep the number of network parameters low. Time to implement Maxpool!
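
A minimal sketch of the forward pass for non-overlapping 2x2 max pooling over a single activation map:

```python
import numpy as np

def maxpool_forward(x, size=2):
    # Downsample by taking the max of each non-overlapping
    # size-by-size window.
    h, w = x.shape
    out = np.empty((h // size, w // size))
    for i in range(0, h - h % size, size):
        for j in range(0, w - w % size, size):
            out[i // size, j // size] = x[i:i + size, j:j + size].max()
    return out

x = np.arange(16.0).reshape(4, 4)
print(maxpool_forward(x))  # [[ 5.  7.]
                           #  [13. 15.]]
```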

Convolution Layer - The core idea behind CNNs

What makes CNNs special is, of course, the convolution layers. Inspired by how the visual cortex in animals works, these layers extract features independently of where they occur in the images. Let's derive the math and implement our own Conv Layer!
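
A minimal sketch of a naive "valid" 2D convolution (technically cross-correlation, as is conventional in CNNs) for a single channel and a single filter:

```python
import numpy as np

def conv2d_forward(x, w, b=0.0):
    # Slide the filter w over the input x and take a dot product at
    # every position; the same weights are reused everywhere, which
    # lets the layer detect a feature regardless of where it occurs.
    H, W = x.shape
    kh, kw = w.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w) + b
    return out

x = np.arange(16.0).reshape(4, 4)
w = np.array([[1.0, 0.0],
              [0.0, -1.0]])     # simple diagonal-difference filter
print(conv2d_forward(x, w))     # every entry is -5.0 for this input
```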