For those of us who have returned to live with boomer parents during lockdown, we are being faced with the fact that they are suddenly incredibly online. While Gen-Z and millennials may be spending their quarantine on TikTok, YouTube and Instagram, boomers are obsessively sharing Facebook memes, beefing on community pages and, above all, forwarding each other messages on WhatsApp. A Twitter keyword search yields thousands of complaints over the past month from adult children who have received misinformation from their parents on WhatsApp. The forwarded messages range from suggestions to microwave your money to rid it of coronavirus to clearly fake messages from medical staff that begin “this is a text from an NHS worker”.
WhatsApp, which is owned by Facebook, announced on Tuesday that it would be limiting the way in which users can forward messages amid the Covid-19 outbreak. “In recent weeks, people have also used WhatsApp to organise public moments of support for frontline health workers,” it said in a press release. “However, we’ve seen a significant increase in the amount of forwarding which users have told us can feel overwhelming and can contribute to the spread of misinformation. We believe it’s important to slow the spread of these messages down to keep WhatsApp a place for personal conversation.”
So, as of now, if you are sent a message on WhatsApp that has already been forwarded five times, you will only be able to forward it to one chat at a time (previously, users could forward such messages to five chats with a single click). The idea is to prevent misinformation from spreading rapidly, since individually forwarding a message to every chat you are in takes more effort than sending one mass forward. This rule follows previous limits on forwarding that WhatsApp rolled out globally in 2019, after misinformation spread via the app in India was linked to mob violence.
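The rule described above can be sketched in a few lines of code. This is purely illustrative: WhatsApp's real implementation is not public, so every name and number here is an assumption drawn only from the behaviour the company has described (a forward counter that travels with a message, and a per-action chat limit that tightens once the counter crosses a threshold).

```python
# Hypothetical sketch of WhatsApp's forwarding limit, assuming each
# message carries a forward counter in its metadata. Names and
# thresholds are illustrative, not WhatsApp's actual code.

FREQUENTLY_FORWARDED_THRESHOLD = 5  # forwards before a message counts as "highly forwarded"
NORMAL_FORWARD_LIMIT = 5            # chats reachable in a single forward action
RESTRICTED_FORWARD_LIMIT = 1        # limit once the threshold is crossed

def max_chats_per_forward(forward_count: int) -> int:
    """How many chats a message may be forwarded to in one action."""
    if forward_count >= FREQUENTLY_FORWARDED_THRESHOLD:
        return RESTRICTED_FORWARD_LIMIT
    return NORMAL_FORWARD_LIMIT

def forward(message: dict, chats: list) -> list:
    """Forward a message, enforcing the per-action chat limit."""
    limit = max_chats_per_forward(message["forward_count"])
    allowed = chats[:limit]            # truncate to the permitted number of chats
    message["forward_count"] += 1      # counter increments with each forward
    return allowed
```

Under these assumptions, a fresh message can still fan out to five chats per click, while a heavily forwarded one dribbles out one chat at a time, which is exactly the friction the policy relies on.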
The question you might be asking yourself now is: wait – WhatsApp already knew it had a forwarding problem? And after answering that question with a resounding “yes”, you might ask: is making it slightly more difficult to forward misinformation really the best that WhatsApp could do? This is where we have a problem. WhatsApp knows when messages are being forwarded. The fact that heavily forwarded messages now trip a “stop mass forwarding” switch means the platform has the capacity to flag messages that are being forwarded frequently, ie the ones that might contain misinformation. These messages could be sent automatically to fact-checkers or moderators, who could then take down those found to contain misinformation or conspiracy theories – such messages could even be deleted across the app.
It may seem baffling that WhatsApp isn’t doing this, but it would likely face a backlash if it did. As with all conspiracy theories, and misinformation more generally, people want to believe that the information is true – that microwaving your cash will save you from catching coronavirus, and that a reassuring (or even non-reassuring) message from an NHS worker will give you a greater sense of control over an out-of-control reality. It is inevitable that people would accuse WhatsApp of “silencing” the truth, and the platform could even be accused of dangerous behaviour for removing medical information amid a pandemic. WhatsApp also brands itself as ultra-secure thanks to its end-to-end encryption – something that would have to be sacrificed, at least temporarily, to vet and remove misinformation.
That said, WhatsApp should not be let off the hook for this light-touch approach. Now more than ever, heavy-handed, aggressive new policies are necessary to tackle misinformation that could put people at risk. WhatsApp already has the tools to start making itself a safer platform. Simply making it trickier to forward misinformation is far from the best it could do.
This piece was updated to highlight the issues with end-to-end encryption.