Training a Convolutional Neural Network

Convolutional Neural Network Theoretical Course

The training process of a Convolutional Neural Network is similar to that of a Dense Neural Network. Training runs for a finite number of iterations: in each iteration, data is fed forward through the network, and the weights are then adjusted using back-propagation (gradient descent). This repeats until the loss of the network falls below a certain threshold, at which point training is stopped.
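The loop described above (feed-forward, compute loss, back-propagate, stop at a threshold) can be sketched on a toy one-parameter model. This is a minimal illustration, assuming a model y = w * x trained with plain gradient descent on mean squared error; the learning rate, threshold, and data are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x  # target: the true weight is 3.0

w = 0.0           # initial weight (the learnable parameter)
lr = 0.1          # learning rate
threshold = 1e-6  # stop once the loss drops below this value

for step in range(1000):
    y_pred = w * x                        # feed-forward pass
    loss = np.mean((y_pred - y) ** 2)     # mean squared error
    if loss < threshold:                  # stopping criterion
        break
    grad = np.mean(2 * (y_pred - y) * x)  # dLoss/dw via back-propagation
    w -= lr * grad                        # gradient-descent weight update

print(round(w, 2))  # converges close to the true weight, 3.0
```

A real CNN repeats the same cycle, just with millions of parameters updated at once instead of a single weight.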

However, the parameters that a CNN learns during training differ slightly from those of an ordinary Deep Neural Network. In a CNN, the training objective is to optimize the values of the kernels of the convolutional layers along with the weights of the fully connected layers. In other words, the weights of the convolutional filters/kernels are also learnable parameters in a CNN.

The pooling layer has no weights assigned to it, since it simply takes the maximum or average value of its input from the convolutional layer.
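This contrast between layers can be verified directly by counting parameters. The sketch below assumes PyTorch; the layer sizes (4 output channels, a 3x3 kernel, 2x2 pooling) are arbitrary choices for illustration.

```python
import torch.nn as nn

# A convolutional layer: its kernel weights and biases are learnable.
conv = nn.Conv2d(in_channels=1, out_channels=4, kernel_size=3)
# A max-pooling layer: it only selects maxima, so it learns nothing.
pool = nn.MaxPool2d(kernel_size=2)

conv_params = sum(p.numel() for p in conv.parameters())
pool_params = sum(p.numel() for p in pool.parameters())

# 4 kernels of shape 1x3x3 (36 weights) plus 4 biases = 40 parameters
print(conv_params)  # 40
print(pool_params)  # 0
```

Only the convolutional and fully connected layers contribute to the parameter count that gradient descent updates; pooling layers are fixed operations.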
