“Disinformation” Doesn’t Even Begin to Describe This
After all this time, we still aren't grasping the whole problem of online harms
Most of the time, I find the term disinformation inadequate. For the past couple of weeks, I’ve absolutely loathed the word. There’s been a lot written about what’s happening online around the horrors in Israel and Gaza, but Joan Donovan, writing in Time, has come the closest to capturing my own feelings on the scope and scale of what we’re dealing with:
[S]ocial media is a battleground of facts, lies, and deception, where governments, journalists, law enforcement, and activists are on an uneven playing field.
It is a massive understatement to use the term “fog of war” to describe what is happening in discussions of Hamas and Israel on social media. It’s a torrent of true horror, violent pronouncements, sadness, and disinformation.
I’ve had a few emails and press requests asking how to spot misinformation and disinformation online, or how to gently correct others who are sharing it. I’m always glad to point folks to methods for spotting false information generally, such as the SIFT Method or recent well-done topical guides like this one from Vox. But I also know that trying to sort truth from fiction isn’t enough; it doesn’t get to the heart of the problem, the game of media manipulation that’s constantly playing out online, or how it’s affecting all of us.
It’s a systemic problem, and of course, the tech platforms are failing. Last week, I covered Twitter specifically, but the other social media companies aren’t faring much better. The New York Times reports that even with the safety features turned on, children can easily access graphic images of the war on Snapchat, Instagram, and TikTok. And constant policy changes on all the platforms have made it more difficult for researchers to track disinformation and problematic content.
It’s maddening that after all this time, individuals are left to sort this out on their own. I’m all for media literacy and digital literacy, but I don’t think it’s reasonable that any one person should be expected to fend off disinformation and propaganda from warring governments, chaos agents, political organizations, PR firms, troll farms, and anyone else with an agenda, with little to no help from the tech platforms that host this constant madness. And yet, the most common question I’m asked is how individuals can spot disinformation or correct others who share it online.
Personally, I’m less worried about disinformation and more worried about content online that dehumanizes others and incites hate, harassment, and harm. Last week, I talked about increased threats towards Jews and Muslims around the world and how it made them less safe. Since writing that post, here in the States, a 6-year-old Palestinian American boy was murdered and his mother attacked by their landlord. In Germany, a synagogue was firebombed. As long as the war continues, as long as the propaganda efforts continue unchecked, the world is less safe for Jews and Muslims.
It’s also not good for anyone’s mental health, something else the term “disinformation” doesn’t really convey. The Center for Countering Digital Hate, in a recent guide to navigating disinformation and propaganda and building digital resilience, takes a more holistic approach that I really like. The guide asks people not to share violent or distressing images, both because the people featured in them were filmed without their consent and because such images are often released for propaganda purposes. CCDH also advises practicing self-care, potentially even seeking professional help, as a way to increase your resilience to what’s happening online.
I think that’s good advice. I’d add that it’s a good idea to show ourselves (and others) some grace. Emotions are running high, people are afraid, and they’re angry. People are going to screw up, sharing things they shouldn’t and saying things they’ll likely regret. It’s obviously not OK for anyone to knowingly cause others harm, but we’re all swimming upstream against a tidal wave of shit from bad actors, propagandists, and the tech platforms that give them safe harbor. It’s OK not to respond, to take a beat, or to log off for a while. Whatever you need to do to protect and care for yourself.
A worthwhile read, especially as we head into the holidays, the time when we break bread with our right-wing uncles and their differing political views.
Inside Discord’s Reform Movement for Banned Users (Platformer)
On the other hand, Discord, a platform where over half the users are between 13 and 24, has a unique opportunity to help teach young people how to interact and be in community online. The company just announced a new trust and safety warning system, and Casey Newton’s Platformer has the scoop. I’m keen to learn more about how this works out and what Discord learns along the way.
I was already looking forward to reading the book about the work of the Sedition Hunters, but this excerpt has me even more excited.
Including this link because you deserve it as a treat. It stars Rob Schneider as the father, and the kids are homeschooled. I find it amusing that the Daily Wire spends so much time coming for Disney while also copying its content (Bluey is distributed by Disney in America) so blatantly. As my patrons already know, the Daily Wire is also producing its own live-action Snow White. I’m sure the Daily Wire’s attempt to create MAGADisney will be as entertaining as it sounds.
Thanks so much to everyone who reached out in response to last week’s big news! It’s great to see that CARD readers are as enthusiastic about the new partnership as the Courier team and I are. As a reminder, you can always get in touch with questions or feedback by replying directly to this email. I read everything you write and respond to most messages.
That’s all for this week. I know *gestures* all of this is a lot right now, so please enjoy this GIF of several puppies and one annoyed cat. We’ll talk again next Sunday!