Ms Aerin
1 min read · Sep 29, 2017


Hi Sam, glad you liked it!

“How should this cost gradient ‘connect’ to the softmax backward gradient you posted here?” → Do you mean the chain rule?

Maybe this post will help? https://medium.com/@aerinykim/derive-the-gradients-w-r-t-the-inputs-to-an-one-hidden-layer-neural-network-fb24ed1ed05f

It’s not a complete explanation of backprop, but it shows how to connect the cost gradient to the softmax backward gradient using the chain rule.
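In case it helps, here is a minimal NumPy sketch of that connection, assuming a cross-entropy cost on top of a softmax (the variable names and toy numbers are mine, not from either post):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())            # shift for numerical stability
    return e / e.sum()

z = np.array([1.0, 2.0, 0.5])          # toy logits (made up for illustration)
y = np.array([0.0, 1.0, 0.0])          # one-hot target
p = softmax(z)

# Cost gradient: dL/dp for cross-entropy L = -sum(y * log(p))
dL_dp = -y / p

# Softmax backward: Jacobian dp/dz = diag(p) - p p^T
dp_dz = np.diag(p) - np.outer(p, p)

# Chain rule connects the two: dL/dz = (dp/dz)^T @ dL/dp
dL_dz_chain = dp_dz.T @ dL_dp

# ...which collapses to the familiar shortcut p - y
dL_dz_direct = p - y

print(np.allclose(dL_dz_chain, dL_dz_direct))   # True
```

The point is that the cost gradient and the softmax Jacobian are separate pieces, and the chain rule is what multiplies them together; for this particular pairing the product happens to simplify to p − y.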

If you want a step-by-step explanation of backprop, this article might help: https://mattmazur.com/2015/03/17/a-step-by-step-backpropagation-example/

