Convolutional Neural Networks: Application

Welcome to Course 4's second assignment! In this notebook, you will:

  • Implement helper functions that you will use when implementing a TensorFlow model
  • Implement a fully functioning ConvNet using TensorFlow

After this assignment you will be able to:

  • Build and train a ConvNet in TensorFlow for a classification problem

We assume here that you are already familiar with TensorFlow. If you are not, please refer to the TensorFlow Tutorial of the third week of Course 2 ("Improving Deep Neural Networks").

Updates to Assignment

If you were working on a previous version

  • The current notebook filename is version "1a".
  • You can find your work in the file directory as version "1".
  • To view the file directory, go to the menu "File->Open", which will open a new tab showing the file directory.

List of Updates

  • initialize_parameters: added details about tf.get_variable and eval(); clarified the test case (see the first sketch after this list).
  • Added explanations for the kernel (filter) stride values, max pooling, and the flatten function (illustrated in the forward_propagation sketch after this list).
  • Added details about softmax cross entropy with logits.
  • Added instructions for creating the Adam Optimizer.
  • Added an explanation of how to evaluate tensors (the optimizer and cost); all three items are illustrated in the cost/optimizer sketch after this list.
  • forward_propagation: clarified instructions; use "F" to store the output of the "flatten" layer (see the forward_propagation sketch after this list).
  • Updated print statements and 'expected output' for easier visual comparisons.
  • Many thanks to Kevin P. Brown (mentor for the deep learning specialization) for his suggestions on the assignments in this course!
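To make the items above concrete, here is a minimal sketch of the initialize_parameters idea, assuming TensorFlow 1.x with tf.contrib available. The filter shapes, seeds, and Xavier initializer are illustrative placeholders, not the graded solution; the sketch also shows how eval() reads a variable's value inside an active session.

```python
import tensorflow as tf  # assumes TensorFlow 1.x (tf.get_variable, tf.contrib)

def initialize_parameters():
    """Create convolution filters with tf.get_variable.
    Shapes are [f, f, n_C_prev, n_C]; the values below are illustrative."""
    tf.set_random_seed(1)
    W1 = tf.get_variable("W1", [4, 4, 3, 8],
                         initializer=tf.contrib.layers.xavier_initializer(seed=0))
    W2 = tf.get_variable("W2", [2, 2, 8, 16],
                         initializer=tf.contrib.layers.xavier_initializer(seed=0))
    return {"W1": W1, "W2": W2}

# A variable only has a value after its initializer has run; eval() then
# returns that value as a NumPy array inside the active session.
with tf.Session() as sess:
    parameters = initialize_parameters()
    sess.run(tf.global_variables_initializer())
    print(parameters["W1"].eval()[1, 1, 1])
```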
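The forward_propagation sketch below, under the same TensorFlow 1.x assumption, shows the [batch, height, width, channels] layout of the conv/pool stride and window arguments, the flatten output stored in F, and a final fully connected layer. The window sizes and the 6 output units are illustrative.

```python
import tensorflow as tf  # assumes TensorFlow 1.x (tf.contrib.layers)

def forward_propagation(X, parameters):
    """CONV2D -> RELU -> MAXPOOL -> CONV2D -> RELU -> MAXPOOL -> FLATTEN -> FC."""
    W1, W2 = parameters["W1"], parameters["W2"]

    # strides and ksize are given per dimension: [batch, height, width, channels].
    Z1 = tf.nn.conv2d(X, W1, strides=[1, 1, 1, 1], padding="SAME")
    A1 = tf.nn.relu(Z1)
    P1 = tf.nn.max_pool(A1, ksize=[1, 8, 8, 1], strides=[1, 8, 8, 1], padding="SAME")

    Z2 = tf.nn.conv2d(P1, W2, strides=[1, 1, 1, 1], padding="SAME")
    A2 = tf.nn.relu(Z2)
    P2 = tf.nn.max_pool(A2, ksize=[1, 4, 4, 1], strides=[1, 4, 4, 1], padding="SAME")

    # F holds the flattened activations: one 1-D vector per example.
    F = tf.contrib.layers.flatten(P2)

    # No activation here; softmax is folded into the cost function later.
    Z3 = tf.contrib.layers.fully_connected(F, 6, activation_fn=None)
    return Z3
```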
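Finally, the cost/optimizer sketch below covers softmax cross entropy with logits, the Adam Optimizer, and evaluating both the optimizer and the cost in a single sess.run() call. The toy linear layer, learning rate, and random batch are assumptions included only so the example runs on its own; they are not part of the assignment's model.

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x

def compute_cost(Z3, Y):
    """Mean softmax cross-entropy; Z3 are unscaled logits, Y are one-hot labels."""
    return tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(logits=Z3, labels=Y))

# Toy graph standing in for the ConvNet so this sketch is self-contained.
X = tf.placeholder(tf.float32, [None, 6], name="features")
Y = tf.placeholder(tf.float32, [None, 6], name="labels")
W = tf.get_variable("W_toy", [6, 6], initializer=tf.zeros_initializer())
Z3 = tf.matmul(X, W)

cost = compute_cost(Z3, Y)
optimizer = tf.train.AdamOptimizer(learning_rate=0.009).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # sess.run evaluates every fetch in the list in one pass over the graph;
    # the optimizer op returns None, conventionally discarded as "_".
    _, minibatch_cost = sess.run(
        [optimizer, cost],
        feed_dict={X: np.random.randn(4, 6), Y: np.eye(6)[[0, 1, 2, 3]]})
    print(minibatch_cost)
```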