Topic DL02: Understanding Backpropagation

  • We first initialize the weight ‘W’ to some random value and propagate forward.
  • We then observe some error at the output. To reduce it, we propagate backwards and increase the value of ‘W’.
  • After that, we notice the error has increased, which tells us we cannot reduce the error by increasing ‘W’.
  • So we propagate backwards again, this time decreasing the value of ‘W’.
  • Now we observe that the error has decreased.
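The trial-and-error search above can be sketched in a few lines. This is a minimal illustration with a one-weight "network" y = w * x; the input, target, and step size are assumed values for demonstration, not taken from the post.

```python
def error(w, x=1.5, target=0.5):
    """Squared error of a one-weight 'network' y = w * x (illustrative)."""
    return (w * x - target) ** 2

w = 2.0            # random initial value for 'W'
step = 0.5
e = error(w)       # error after the first forward pass

# Try increasing 'W' first, as described above.
if error(w + step) < e:
    w += step      # error went down: increasing 'W' was the right move
else:
    w -= step      # error went up: decrease 'W' instead

assert error(w) < e  # the adjusted 'W' gives a smaller error
```

Real backpropagation replaces this blind trial with the gradient, which tells us the direction (and size) of the adjustment directly.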

How Does Backpropagation Work for a Neural Network?

  • Two inputs
  • Two hidden neurons
  • Two output neurons
  • Two biases
  • Step — 1: Forward Propagation
  • Step — 2: Backward Propagation
  • Step — 3: Putting all the values together and calculating the updated weight value

Step — 1: Forward Propagation
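The forward pass for the 2-2-2 network described above can be sketched as follows. The input, weight, bias, and target values here are assumed for illustration (the post's own numbers appeared in its figures); the activation is taken to be the sigmoid.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative values, not from the original post's figures.
i1, i2 = 0.05, 0.10                        # two inputs
w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # input -> hidden weights
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # hidden -> output weights
b1, b2 = 0.35, 0.60                        # two biases

# Hidden layer: weighted sum of inputs plus bias, squashed by sigmoid.
h1 = sigmoid(w1 * i1 + w2 * i2 + b1)
h2 = sigmoid(w3 * i1 + w4 * i2 + b1)

# Output layer: weighted sum of hidden activations plus bias.
o1 = sigmoid(w5 * h1 + w6 * h2 + b2)
o2 = sigmoid(w7 * h1 + w8 * h2 + b2)

# Total squared error against illustrative targets.
t1, t2 = 0.01, 0.99
E_total = 0.5 * (t1 - o1) ** 2 + 0.5 * (t2 - o2) ** 2
```

`E_total` is the quantity the backward pass will try to reduce.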

Step — 2: Backward Propagation
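The backward pass applies the chain rule to find how much each weight contributed to the error. A sketch for one hidden-to-output weight (called `w5` here), using illustrative forward-pass values consistent with the sketch above:

```python
# Illustrative forward-pass values (assumed, not from the post's figures).
h1 = 0.5933          # hidden activation that feeds w5
o1 = 0.7514          # output activation
t1 = 0.01            # target for o1

# Chain rule: dE/dw5 = (dE/do1) * (do1/dnet_o1) * (dnet_o1/dw5)
dE_do1 = o1 - t1                 # derivative of 0.5 * (t1 - o1)^2 w.r.t. o1
do1_dnet = o1 * (1 - o1)         # derivative of the sigmoid at o1
dnet_dw5 = h1                    # since net_o1 = w5*h1 + w6*h2 + b2

dE_dw5 = dE_do1 * do1_dnet * dnet_dw5
```

The gradients for hidden-layer weights work the same way, with extra chain-rule factors propagating the error back through the output layer.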

Step — 3: Putting all the values together and calculating the updated weight value
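Putting it together, the weight is nudged against its gradient. The learning rate and gradient value below are assumed for illustration:

```python
# Gradient-descent update for one weight.
learning_rate = 0.5       # illustrative choice, not from the post
w5 = 0.40                 # current weight value
dE_dw5 = 0.0822           # gradient computed in the backward pass

# New weight = old weight minus learning rate times gradient.
w5_new = w5 - learning_rate * dE_dw5
```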

  • Similarly, we can calculate the updated values of the other weights.
  • After that, we propagate forward again, calculate the output, and compute the error once more.
  • If the error is at a minimum we stop there; otherwise we propagate backwards again and update the weight values.
  • This process keeps repeating until the error reaches a minimum.
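The repeat-until-the-error-is-minimal loop can be sketched end to end. This is a deliberately tiny one-weight example with assumed input, target, learning rate, and stopping tolerance, just to show the structure of the training loop:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative data: one input, one target, one weight.
x, target = 1.5, 0.5
random.seed(0)                 # fixed seed so the run is reproducible
w = random.uniform(-1, 1)      # random initial weight
lr = 0.5                       # assumed learning rate

for epoch in range(1000):
    y = sigmoid(w * x)                       # forward propagation
    error = 0.5 * (target - y) ** 2          # compute the error
    if error < 1e-6:                         # error is small enough: stop
        break
    grad = (y - target) * y * (1 - y) * x    # backward propagation (chain rule)
    w -= lr * grad                           # update the weight
```

Each iteration is exactly Steps 1 to 3 above: forward pass, backward pass, weight update, repeated until the error stops being worth reducing.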


abhigoku10
