
3 September 2012

The solution to porn on Instagram is more porn on Instagram

Scrubbing the Internet of filth isn't going to happen. So embrace it!

By Alex Hern

The Telegraph's Willard Foxton writes on the epidemic of smut slowly filling up Instagram:

A quick browse through tags like “handbra” brings up almost 500,000 images of young women using their hands to preserve their modesty – and that’s just the shallow, softcore end of the pool. Look deeper if you like, but I warn you, it gets pretty shocking, pretty fast, even to a hardened internet veteran like me. . .

So, what’s the problem with this? If like-minded adults want to swap nude pictures of one another, surely it’s no-one else’s concern? The trouble is, Instagram is hugely popular with the under-15s. Recently, an American parenting blog shared a chilling story about how easily children were being groomed through the service. . .

A spokesman for Instagram says: “We rely heavily on users to flag inappropriate content and we do our best to remove any media that we determine to be inappropriate.”

Foxton suggests that, without the manpower to properly moderate the site, Instagram may go the same way as Habbo Hotel – which was exposed by Channel 4 News as being, well, not the safest place for kids to hang out.

But Habbo Hotel wasn't stymied by a failure to moderate. As the Channel 4 report showed, it actually employed 225 people to do just that – a moderation team 15 times the size of Instagram's entire staff.

Habbo's failure was that it aimed itself at kids while being unable to guarantee them a safe space – but there's no reason why Instagram should meet the same fate. While the site may have a lot of younger users, that doesn't mean it has to completely sanitise itself, any more than TV or the internet itself should just because kids use them.


The problem Instagram has isn’t all the porn, but the fact that all that porn isn’t labelled – and it won’t ever be, because there’s no motivation for uploaders to tag porn as such. The only way to mark porn is to report it as “inappropriate” and get it kicked off the site.

The solution to porn is more porn. Let people posting pictures of their naked selves clearly mark them as such; let parents set filters on their children's accounts; and enforce stricter penalties against those who insist on uploading untagged porn. This is how Flickr dealt with adult content (before Yahoo killed it), offering "safe, moderate and restricted" as categories.

Besides, how could we live with ourselves if we scrubbed fake Polaroid photos of handbras from the internet?
