Cosma and I wrote the article to push back against one version of the common claim that we can blame everything that is wrong and toxic with social media (and by extension, American democracy – this is a U.S.-centric piece) on engagement-maximizing algorithms and their cousins. Specifically, we don’t think that we can fully blame these algorithms for the kinds of belief polarization that we see online: people’s willingness, for example, to concoct elaborate justifications for their belief that Trump Really Won in 2020.
We do this by engaging in a kind of thought experiment. Would we see similar polarization of beliefs if we lived in a world where Facebook, Twitter et al. hadn’t started using these algorithms after 2012 or so? Our rough answer is that plausibly, yes: we would see lots of polarization. Following Mercier and Sperber, we assume that people are motivated reasoners – they more often look for evidence to support what they want to believe than to challenge their assumptions. And all they need to do this is a combination of simple search (Google like it used to be) and social media 2.0.
Toxicity and hygiene: the language of liberal fascism. [more comedy]
If Farrell were a method actor, he might ask "What's my motivation?", but being one of the "big children in university chairs", he imagines his own disinterest. Weber was honest and gave the answer: the preservation of the academy.
Cultural consensus in the field of education can be justified basically only on the condition of severe self-restraint in the observance of the canons of science and scholarship. If one desires this consensus, one must put aside the idea of any sort of instruction in ultimate values and beliefs; similarly the university teacher, especially in the confidentiality of his lecture hall—nowadays of such solicitude—is under the sternest obligation to avoid proposing his own position in the struggle of ideals. He must make his chair into a forum where the understanding of ultimate standpoints—alien to and divergent from his own—is fostered, rather than into an arena where he propagates his own ideals.
So the vicious circle is first, we demand content moderation, then we get some and we're unhappy with it (because human beings are fallible, inconsistent, etc.), then we demand *less* "censorship" or *more* "moderation" or *the end of all unfair censorship/moderation*.
— Mike Godwin (@sfmnemonic) September 13, 2018
We imagine that platforms can bring the whole sprawling chaos of human behavior into compliance with the law. Make our lives policeable, and policed, to a degree no govt in history could have imagined. Not only do we seem to think it's possible– we think it's a good idea. https://t.co/7FIZvt6Uw8
— Daphne Keller (@daphnehk) September 13, 2018
Google & Facebook are moving to finish off destroying the journalism industry w ad market monopoly, algorithm & UX tweaks. Don’t care at all about annoying woke digital media — VICE deserved to die — but we’re headed to almost no media left in virtually every rural & small city.
— Lee Fang (@lhfang) February 23, 2024
One reason to care about Google eliminating the entire media is soon you won't be able to complain about media conspiracies. https://t.co/Ixt7Gu0IzY
— Matt Stoller (@matthewstoller) February 23, 2024
"We imagine that platforms can bring the whole sprawling chaos of human behavior into compliance with the law. Make our lives policeable, and policed, to a degree no govt in history could have imagined."