Yahoo’s NSFW Image-Blocking Algorithm Has A Dirty, Dirty Mind, According To A Research Paper

10.21.16


The internet is a bottomless repository not just of dirty pictures, but of pictures we find dirty because we, as a species, think with our genitals. Children’s toys, sports logos, produce, you name it, somebody has pointed out what it looks like. And now Yahoo! has accidentally taken on the noble work of making computers just as immature and dumb as us humans.

In a research paper posted to GitHub (warning: link is not safe for work), PhD candidate Gabriel Goh takes the rather simple step of pairing Yahoo!’s open-source dong-finding software with a neural network, similar to the one used to make robots dream, to figure out what, precisely, computers think is raunchy. And the answer is pretty much everything.

To explain, Yahoo!’s peen-blocker ranks every image on a scale from 0, meaning nothing raunchy, to 1, meaning maximally raunchy. By repeatedly tweaking images to push that score higher, Goh found that the algorithm began seeing raunch where there was no raunch to be found, and with a little fine-tuning of the code, he discovered he had basically given his neural network the sense of humor of a middle schooler. This isn’t just funny because watching a computer take a picture and twist it into a picture of a penis is hilarious, although it is; Goh points out that the network is experiencing a very real human phenomenon.
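The trick described above, nudging an image's pixels to drive a classifier's 0-to-1 score upward, is just gradient ascent on the input. Here's a minimal toy sketch of that idea in Python. The "classifier" below is a made-up stand-in (a logistic unit over a random filter), not Yahoo!'s actual open_nsfw network, and the filter, step size, and image size are all arbitrary assumptions for illustration:

```python
import numpy as np

# A hypothetical stand-in for a learned classifier: one logistic unit
# over a random 8x8 filter. Goh's real experiment used Yahoo!'s full
# open_nsfw convolutional network; this just shows the mechanism.
rng = np.random.default_rng(0)
filter_w = rng.normal(size=(8, 8))

def score(img):
    """Map an 8x8 image to a score in (0, 1), like the NSFW ranker."""
    return 1.0 / (1.0 + np.exp(-np.sum(filter_w * img)))

def score_gradient(img):
    """Gradient of the logistic score with respect to the pixels."""
    s = score(img)
    return s * (1.0 - s) * filter_w

img = np.zeros((8, 8))   # start from a blank image
before = score(img)      # 0.5: the model is undecided about a blank image
for _ in range(100):     # gradient ascent: nudge pixels to raise the score
    img += 0.5 * score_gradient(img)
after = score(img)       # close to 1: the model now "sees" raunch
```

Run against a real network, the same loop hallucinates whatever patterns the classifier associates with a high score, which is exactly how Goh's images were produced.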

The reason we see faces and other bits of anatomy in everything is called pareidolia. The human brain likes patterns and will find them in anything, even when nothing is actually there, because it’s better to get a false positive on a predator hiding in the jungle than to miss that it’s out there completely. So Goh has effectively spotted a human thought process in the mathematics of neural networks. Now let’s get the roboticists on that whole “point and snicker” toolset.

(via GitHub)
