Residual Networks

Welcome to the second assignment of this week! You will learn how to build very deep convolutional networks using Residual Networks (ResNets). In theory, very deep networks can represent very complex functions, but in practice they are hard to train. Residual Networks, introduced by He et al., allow you to train much deeper networks than were previously practically feasible.

In this assignment, you will:

- Implement the basic building blocks of ResNets.
- Put together these building blocks to implement and train a state-of-the-art neural network for image classification.

This assignment will be done in Keras.

Before jumping into the problem, let's run the cell below to load the required packages.
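The exact package list lives in the notebook cell itself; as a rough illustration only, a setup cell for a Keras-based assignment like this typically looks something like the sketch below (the imports and any helper modules in the real cell may differ):

```python
# Illustrative setup cell: typical imports for a Keras ResNet exercise.
# The actual notebook cell may load additional helpers (e.g., dataset utilities).
import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model
from tensorflow.keras.layers import (
    Input, Conv2D, BatchNormalization, Activation, Add
)

np.random.seed(1)  # fix the random seed so results are reproducible
```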

1 - The problem of very deep neural networks

Last week, you built your first convolutional neural network. In recent years, neural networks have become deeper, with state-of-the-art networks going from just a few layers (e.g., AlexNet) to over a hundred layers. The main benefit of a very deep network is that it can represent very complex functions, but a major barrier to training one is the problem of vanishing gradients: as the gradient is backpropagated through many layers, repeated multiplication can shrink it toward zero, so the shallower (earlier) layers learn very slowly.

**Figure 1** : **Vanishing gradient** : The speed of learning decreases very rapidly for the shallower layers as the network trains.

You are now going to solve this problem by building a Residual Network!
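As a preview of the core idea (an illustrative sketch under assumed layer sizes, not the assignment's graded implementation), the Keras functional API makes it easy to add a "shortcut" that carries a block's input past a couple of convolutional layers and adds it back in before the final activation:

```python
# Minimal sketch of a residual (skip-connection) block in Keras.
# All shapes, filter counts, and names here are made up for the example.
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, Activation, Add

def simple_residual_block(x, filters):
    """Two conv layers whose output is added back to the block's input
    (the shortcut path) before the final ReLU."""
    shortcut = x                                    # save the input for the skip connection
    x = Conv2D(filters, (3, 3), padding='same')(x)  # first conv, spatial size preserved
    x = BatchNormalization()(x)
    x = Activation('relu')(x)
    x = Conv2D(filters, (3, 3), padding='same')(x)  # second conv, activation deferred
    x = BatchNormalization()(x)
    x = Add()([x, shortcut])                        # skip connection: add input to output
    return Activation('relu')(x)

# Wire a tiny model around the block just to show that it builds.
inputs = Input(shape=(32, 32, 16))
outputs = simple_residual_block(inputs, filters=16)
model = Model(inputs, outputs)
model.summary()
```

Because the shortcut lets gradients flow directly from a block's output to its input, the shallower layers keep receiving a useful learning signal even in very deep stacks, which is exactly the problem Figure 1 illustrates.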