Introduction To Maths Behind Neural Networks

by Dasaradh S K · 4 min read · December 23rd, 2019

Too Long; Didn't Read

The math behind neural networks and deep learning is still a mystery to some of us, and having some knowledge of it can help us understand what's happening inside a neural network. I decided to start from scratch and derive the methodology and the math behind it. Backpropagation, short for backward propagation of errors, is the algorithm for computing the gradient of the loss function with respect to the weights. To find the best weights and bias for our perceptron, we need to know how the cost function changes as they change; this is done with the help of gradients, the rate of change of one quantity with respect to another.
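As a rough illustration of that idea, here is a minimal NumPy sketch of a single perceptron trained by gradient descent, where backpropagation (the chain rule) yields the gradient of the cost with respect to the weights and bias. It is not taken from the article; the sigmoid activation, mean-squared-error cost, toy data, and learning rate are all assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Toy data (hypothetical, for illustration): 4 samples, 2 features each,
# with OR-like target outputs.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 0.5                 # learning rate

for epoch in range(1000):
    # Forward pass: weighted sum, then sigmoid activation.
    z = X @ w + b
    a = sigmoid(z)

    # Cost: mean squared error between prediction and target.
    cost = np.mean((a - y) ** 2)

    # Backward pass: chain rule gives dC/dw and dC/db.
    # dC/da = 2(a - y)/n,  da/dz = sigmoid'(z),  dz/dw = x,  dz/db = 1
    delta = 2.0 * (a - y) / len(y) * sigmoid_prime(z)
    grad_w = X.T @ delta          # gradient w.r.t. weights
    grad_b = np.sum(delta)        # gradient w.r.t. bias

    # Gradient descent step: move parameters against the gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print("final cost:", cost, "weights:", w, "bias:", b)
```

Running the sketch drives the cost toward zero, which is exactly the behaviour the summary describes: the gradients tell us in which direction to nudge the weights and bias so the cost decreases.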

About Author

Dasaradh S K (@dasaradhsk)
ML Enthusiast | Mechatronics Engineering Student
