…And all of a sudden, Q is everywhere.
For people who don't know, QAnon is an incredibly stupid and crazy conspiracy theory, and it has at least hundreds of thousands, possibly millions, of adherents. Even the founders of the movement are surprised by its reach. Ben Collins writes on Twitter: “These aren't just boomer memes and 4chan shitposters anymore. They're moms who were radicalized in wellness Facebook groups they've never even heard of.”
How did it get so far? Apparently because social media engagement algorithms kept propagating it, finding and spreading the versions that were the most attention-getting. It is very much like the evolution of a virus: the most infectious strains that do not kill the host do best. As Jaron Lanier put it:
…behind the scenes there are these manipulation, behavior modification, and addiction algorithms that are running. And these addiction algorithms are blind. They’re just dumb algorithms. What they want to do is take whatever input people put into the system and find a way to turn it into the most engagement possible. And the most engagement comes from the startle emotions, like fear and anger and jealousy, because they tend to rise the fastest and then subside the slowest in people, and the algorithms are measuring people very rapidly, so they tend to pick up and amplify startle emotions over slower emotions like the building of trust or affection. – link
So the engagement algorithms of YouTube, Twitter, and, above all, Facebook find the versions of the Q story that provide the most fear and anger and, above all, attention and spread them around.
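The selection dynamic Lanier describes can be sketched as a toy model. This is a hypothetical illustration, not any platform's actual algorithm: "posts" carry an emotional spike score (startle emotions spike fast, trust and affection build slowly), and a feed repeatedly reallocates exposure toward whatever drew the most engagement in a short measurement window.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical emotional profiles: startle emotions spike fast in a
# short measurement window; slower emotions barely register in one.
VARIANTS = {
    "fear":      {"spike": 0.9},
    "anger":     {"spike": 0.8},
    "trust":     {"spike": 0.2},
    "affection": {"spike": 0.1},
}

def engagement(spike):
    """Engagement measured over a brief window favors fast-spiking emotions."""
    return spike + random.uniform(-0.05, 0.05)

def run_feed(rounds=20):
    # Every variant starts with equal exposure in the feed.
    shares = {name: 1.0 for name in VARIANTS}
    for _ in range(rounds):
        scores = {name: shares[name] * engagement(v["spike"])
                  for name, v in VARIANTS.items()}
        total = sum(scores.values())
        # The feed reallocates exposure proportional to measured engagement.
        shares = {name: 4.0 * s / total for name, s in scores.items()}
    return shares

final = run_feed()
```

After a few rounds of this blind reallocation, fear and anger dominate the feed and trust and affection are crowded out, even though every variant started with identical exposure. Nothing in the loop knows or cares what the content is; it only measures which emotions move fastest.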
This tells us something about the common fears of people (child kidnapping and rape), and it also tells us something about how publics can come to support genocide, because this is very like what has been said about every group that has been subjected to genocide. The slightly-whiffy, vaguely magical thinking we see in wellness promotions and the like turns out to have a darker side.
It also shows us something about for-profit social media. Facebook profited from genocide in Myanmar, is profiting from spreading hate in India right now, and played a role in radical-right victories in the UK and USA. Unchecked, existing social media engagement algorithms easily promote fascist thinking. This has to change. I don't see how people who use social media extensively can stay free when, in every conversation, the engagement algorithms are there, spreading fear and hate.
There is also the possibility of social media that works differently, that promotes cooperation and compassion rather than fear and hate. It would be just as reprehensible to use these without consent as it is to use the current engagement algorithms, but with enthusiastic consent, it might be possible to legitimately use social media algorithms to propagate positive social values. It would have to be non-profit, or at least less-profit social media, but it might be worth trying.
2 comments:
So far I have never seen such. My herd of friends is thinned down, and I don't hang out in places that are likely to post them. Old and lucky, I guess.
If algorithms can promote conspiracy theories then as you say, they ought to be able to promote positives instead.
I've seen a bit on YouTube – the YouTube iOS app is kinda scary and can send you places you never intended to go – but I don't hang out in the places where this is likely to show up.
People trust negatives more than positives, so there's more "engagement" to be had from negatives. Which is how we got here.