
Nudity Detection and Abusive Content Classifiers — Research and Use Cases

by Shashank Gupta · 5 min read · February 2nd, 2018

Too Long; Didn't Read

The Web 2.0 revolution has led to an explosion of content generated on the internet every day. Social sharing platforms such as Facebook, Twitter, and Instagram have seen astonishing growth in daily active users, but they have struggled to monitor the content those users generate. Users upload inappropriate content such as nudity, or use abusive language when commenting on posts. Such behavior leads to social problems like bullying and revenge porn, and it undermines the authenticity of the platform. The pace at which content is generated online today is so high that monitoring everything manually is nearly impossible: on Facebook alone, 136,000 photos are uploaded, 510,000 comments are posted, and 293,000 statuses are updated every 60 seconds. At ParallelDots, we tackled this problem with Machine Learning by building algorithms that classify nude photos (nudity detection) or abusive content with very high accuracy.
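To make the text side of this concrete, the sketch below hand-rolls a tiny multinomial Naive Bayes abusive-comment classifier. This is a minimal illustration of the general technique, not ParallelDots' actual model; the training examples, labels, and function names are all hypothetical.

```python
from collections import Counter
import math

# Toy training data: comment text paired with a label
# (1 = abusive, 0 = clean). Examples are illustrative only.
TRAIN = [
    ("you are an idiot and a loser", 1),
    ("shut up you stupid fool", 1),
    ("go away idiot nobody likes you", 1),
    ("great article thanks for sharing", 0),
    ("really helpful post well written", 0),
    ("interesting read thanks a lot", 0),
]

def train_nb(data, alpha=1.0):
    """Fit a multinomial Naive Bayes model with Laplace smoothing."""
    counts = {0: Counter(), 1: Counter()}  # per-class word counts
    docs = Counter()                       # per-class document counts
    for text, label in data:
        docs[label] += 1
        counts[label].update(text.split())
    vocab = set(counts[0]) | set(counts[1])
    return {
        "vocab": vocab,
        "alpha": alpha,
        "counts": counts,
        "prior": {c: math.log(docs[c] / len(data)) for c in (0, 1)},
        "total": {c: sum(counts[c].values()) for c in (0, 1)},
    }

def predict(model, text):
    """Return 1 (abusive) or 0 (clean) for a comment."""
    vsize, a = len(model["vocab"]), model["alpha"]
    scores = {}
    for c in (0, 1):
        # Log prior plus smoothed log likelihood of each word
        score = model["prior"][c]
        for w in text.split():
            score += math.log(
                (model["counts"][c][w] + a) / (model["total"][c] + a * vsize)
            )
        scores[c] = score
    return max(scores, key=scores.get)

model = train_nb(TRAIN)
print(predict(model, "you stupid idiot"))            # → 1 (abusive)
print(predict(model, "thanks for the great article"))  # → 0 (clean)
```

In production, this bag-of-words approach would be replaced by a model trained on a large labeled corpus, and the image side (nudity detection) would typically use a convolutional neural network rather than word counts.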
