What is the Vanishing Gradient Problem?



The Vanishing Gradient Problem appears when you train a Neural Network with Gradient Descent: the gradients tend to get smaller and smaller as you move backward through the network. In practice this means that, once forward propagation ends and backpropagation begins, the gradient that reaches the first layers no longer carries meaningful information, so those early layers learn very slowly or not at all…
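To see this concretely, here is a minimal NumPy sketch (not from the post itself; the layer count, width, and weight initialization are illustrative assumptions). It pushes an input through a deep stack of sigmoid layers, then propagates a gradient backward and prints its norm at each layer, which shrinks by several orders of magnitude by the time it reaches the first layer.

```python
import numpy as np

rng = np.random.default_rng(0)
n_layers, width = 20, 50

# Illustrative setup: random weights (roughly Xavier-scaled) and a random input
weights = [rng.normal(0, width ** -0.5, (width, width)) for _ in range(n_layers)]
x = rng.normal(0, 1.0, width)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass with sigmoid activations, caching activations for backprop
activations = [x]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: start from a gradient of ones at the output and push it
# back through each layer, printing its norm as it travels toward layer 0
grad = np.ones(width)
for i in reversed(range(n_layers)):
    a = activations[i + 1]                       # output of layer i
    grad = weights[i].T @ (grad * a * (1 - a))   # sigmoid'(z) = a * (1 - a)
    print(f"layer {i:2d}: |grad| = {np.linalg.norm(grad):.3e}")
```

The driving factor in this sketch is the sigmoid derivative, which is at most 0.25, so each layer multiplies the back-propagated signal by a factor well below 1; over many layers the product collapses toward zero.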