(no subject)
10 September 2020 11:45

The ongoing search for online spaces which will scratch that "feel distressed, must sit down and *browse* until I feel better" itch, without making it worse - so, no tumblr, no news sites, no sites which...
I've learnt a new concept this week: comparing memes to a virus, and the way that the "best" memes basically have an R above 1 - each share causes, on average, more than one new share - compelling you to reproduce and spread them. That's especially concerning when a meme triggers anger or fear; no matter what side of an issue you're on, engaging with it is a win for the platform, because it keeps your attention. Which is bad for digital futures, because then Facebook has a strong incentive to keep promoting hate speech to you. If it knows that putting something horrendous in trending topics "triggers a response" from you, it'll keep doing so, because the algorithm has no ethics or compassion.
It sort of confirms this sense I've had for a while that *nobody* should be talking about, like, TERFs. Period. Or Donald Trump. Blocking, and refusing to engage or even to snark about these topics in your private space, is a form of praxis: the goal is to reform our digital social spaces by teaching the algorithm that this content is NOT popular. Like, literally, it's social distancing with hate speech. If no one spreads it, it dies.
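The "social distancing with hate speech" idea can be made concrete with a toy branching-process model - the same maths as the epidemiological R number. This is purely an illustrative sketch, not anything from Facebook's actual systems; the `r` parameter and the generation counts are made-up numbers chosen to show the shape of the effect.

```python
# Toy model of meme spread as a branching process.
# `r` is the average number of new shares each existing share causes
# (the meme's "R number"). All values here are illustrative assumptions.

def expected_reach(r: float, generations: int) -> float:
    """Expected cumulative number of shares after `generations` rounds
    of sharing, starting from a single initial post: sum of r^g."""
    return sum(r ** g for g in range(generations + 1))

# R above 1: every share spawns more than one new share, so reach
# grows exponentially - this is the "viral" regime.
viral = expected_reach(1.5, 20)

# R below 1: collectively refusing to engage pushes the average share
# rate under 1, and total reach converges to a small fixed ceiling
# (the geometric-series limit, roughly 1 / (1 - r)).
starved = expected_reach(0.8, 20)

print(f"r=1.5 after 20 generations: ~{viral:.0f} expected shares")
print(f"r=0.8 after 20 generations: ~{starved:.1f} expected shares")
```

The point of the sketch is that the difference between r = 1.5 and r = 0.8 isn't a modest reduction - it's the difference between roughly ten thousand expected shares and about five. Small changes in how many people engage flip the content between "spreads" and "dies".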
I think this idea will be quite unpopular with younger folx who have grown up with internet activism being a thing, but that doesn't make me wrong - in that, for Facebook at least, a pro-BLM post and a pro-White-Supremacy post are *identical*: they spread the same virus, the same addictive clickiness, the same engagement, the same rage, the same more-people-spending-more-time-on-Facebook. And then once they have your attention, you're super vulnerable to whatever messages get pushed by bad-faith actors at pivotal moments.
(Apparently, a big part of the Russian troll-bot strategy was running groups for Black women which were 99% haircare, feminism, fatherhood, religion and family - and 1% "don't vote", slipped in among the rest. What does this mean, then, for how we go forward, knowing that even the positive messages we see could be laying the basis of trust for, like, a psychological sleeper agent who will seem normal until we're asleep?)
This way of thinking has been super helpful for reflecting on my own vulnerability to some of this noise and how it gets into my psyche; I think I'm going to give up my tumblr permanently. And I wish there were online places I could chill which weren't the equivalent of going to a concert during a pandemic, being exposed to this sheer scale of alien sound that clogs up my lungs. It's made me far more suspicious of even things like consuming "positive" content on these platforms, because ultimately that's still handing a degree of psychological sovereignty to, like, 28 tech bros in California.