
Binary Classification: Understanding Activation and Loss Functions with a PyTorch Example

by Dmitrii Matveichev · August 15th, 2023

Too Long; Didn't Read

To build a binary classification neural network, use the sigmoid activation function on its final layer together with binary cross-entropy loss. The final layer should have a size of 1. Such a network outputs the probability p that the input belongs to class 1; the probability that it belongs to class 0 is then 1 - p.
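
As a quick illustration of that setup, here is a minimal PyTorch sketch (not the article's full example; the layer sizes and the toy batch below are assumed purely for illustration): a network whose final layer has a single output followed by a sigmoid, trained with binary cross-entropy loss.

```python
import torch
import torch.nn as nn

# Minimal binary classifier: the final layer has size 1, and a sigmoid
# squashes its output into a probability p in (0, 1).
model = nn.Sequential(
    nn.Linear(10, 32),  # 10 input features -- assumed for illustration
    nn.ReLU(),
    nn.Linear(32, 1),   # final layer size is 1
    nn.Sigmoid(),
)

loss_fn = nn.BCELoss()  # binary cross-entropy; expects probabilities in (0, 1)

x = torch.randn(4, 10)                          # a toy batch of 4 inputs
y = torch.tensor([[0.0], [1.0], [1.0], [0.0]])  # float labels: class 0 or 1

p = model(x)          # p = predicted probability of class 1; 1 - p is class 0
loss = loss_fn(p, y)  # binary cross-entropy between p and the true labels
loss.backward()       # gradients are now ready for an optimizer step
```

In practice it is often preferable to drop the final nn.Sigmoid and apply nn.BCEWithLogitsLoss to the raw output instead, which is numerically more stable; the version above simply mirrors the sigmoid plus binary cross-entropy pairing described in the summary.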

Code License

The code in this story is for educational purposes. The readers are solely responsible for whatever they build with it.
