You can test it out for yourself. Pick up your iPhone, go to Photos, and tap the little magnifying glass. Type “Brassiere” – if you have any, you’ll soon realise that Apple has been cataloguing pictures of you/your loved ones/your mates in their bras.
Apple has been doing this for over a year, but most people only realised this week. A viral tweet on Monday and a follow-up on Tuesday mean thousands of people are now aware of the feature. But as celebrity model Chrissy Teigen asked when she first exposed the tech to her 8.25 million Twitter followers: why? Why does it exist?
It’s true. If u type in “brassiere” in the search of your iphotos, it has a category for every boob or cleavage pic you’ve ever taken. Why. pic.twitter.com/KWWmJoRneJ
— christine teigen (@chrissyteigen) October 31, 2017
Image categorisation has been a feature on iPhones since the introduction of iOS 10 in June last year. Searching your photos will reveal the thousands of categories Apple caters for, from the ordinary (cocktails, puppies, birthday cake) to the slightly more bizarre (dance palace, firearm, clock towers). There are nearly 4,500 categories in total, a number first documented by developer Kenny Yin over a year ago.
Is this creepy? Those who just found out about the feature seem to think so – but Apple reassures its users that its image recognition AI runs on the device itself rather than on Apple’s servers, maintaining the privacy of your pictures. Only you can see your brassiere shots, and no humans have been involved in picking them out and plonking them in an album for you. It’s also worth noting that the “brassiere” album is about as saucy as it gets – Apple doesn’t categorise pictures with labels like “underwear”, “nipples”, “nude”, “naked”, etc., etc.
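For the technically curious: on-device image classification of this sort is something developers can now experiment with themselves. The sketch below is an illustration only, using Apple’s public Vision framework (its VNClassifyImageRequest API arrived in iOS 13, after this feature shipped) – it approximates the idea of local categorisation, it is not the Photos app’s actual pipeline, and the function name and confidence threshold are arbitrary choices of mine:

```swift
import UIKit
import Vision

// Illustrative sketch: classify an image entirely on-device using Apple's
// public Vision framework. The Photos app's own categoriser is private, so
// VNClassifyImageRequest is only a stand-in for what it does.
func classifyOnDevice(_ image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    // Both the request handler and the underlying model run locally;
    // no pixels are sent to a server.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    let request = VNClassifyImageRequest()

    do {
        try handler.perform([request])
    } catch {
        print("Classification failed: \(error)")
        return
    }

    // Print only reasonably confident labels (0.7 is an arbitrary cutoff).
    for observation in request.results ?? [] where observation.confidence > 0.7 {
        print(observation.identifier, observation.confidence)
    }
}
```

The design point the sketch makes is the same one Apple makes: because the model ships with the operating system and runs locally, the photo itself never has to be uploaded for the label to be generated.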
Despite this, brassiere-gate has still exposed a serious privacy flaw.
It’s nothing to do with Apple, and it’s nothing to do with any bosoms that may or may not be in your camera roll. The privacy flaw exposed by this episode is simple: it’s the way we think about privacy itself.
It’s easy to freak out about how technology invades our private lives when nip pics are on the line. But image recognition similar to Apple’s is being used by other companies for far more nefarious ends. Do you want to live in a world where the police use facial recognition software to identify “troublemakers” at Notting Hill Carnival? Where Facebook can automatically tag pictures of you? Where the iPhone X’s facial recognition could be used by brands to check you really are watching their ads?
Even these aren’t the real privacy issues we face today. It’s much easier to view tech as dystopian when it’s literally looking at you and/or your underwear pics, but in reality our privacy is invaded in far more boring (but still disturbing) ways. Your personal data is being collected, bought, and sold every day when you use Google, Amazon, Apple, Facebook, Instagram, etc., etc.
The fact Apple’s image categorisation has existed for over a year and is only hitting headlines now illustrates how little we’re tuned in to the decisions companies make about our lives.
Headlines were made again this week when people suspected Facebook was using microphones to listen to their conversations and then target them with ads. Facebook said it wasn’t – but why do we repeatedly freak out about this and not the fact the social network uses practically everything but the microphone to serve you ads? Listening in is spooky, sure, but so is monitoring your emotional state, keeping track of your hobbies, handing over your personal details to the British government, and using your phone’s contacts to recommend you friends.
It’s even creepier that we often don’t know how Facebook has figured out stuff about us. I turned vegetarian around four months ago, and a search of Facebook’s Activity Log shows I haven’t used the word “vegetarian” on the social network since 16 March 2013. So why has the site’s ad preferences page suddenly started listing “vegetarianism” and “vegetarian cuisine” under my interests?
The fact that Apple’s “bra” album has made people think twice about privacy is still a good thing. Hopefully it will open up conversations about what we’re consenting to when we buy and use tech. But in the end, Apple’s AI hasn’t exposed anyone’s breasts – it has exposed our ignorance.