Monday, January 10, 2022

"Myside bias" and "implicit bias". I remembered today that Leiter was once a fan of Dan Sperber. Maybe not anymore.
---

Leiter: "The Epistemology of the Internet and the Regulation of Speech in America", in October

And now

I've been gratified by the interest this paper has already attracted, and how useful many legal scholars, especially, have found the notion of epistemic authority.

As always: legal scholars and lawyers are two groups, not one. And "epistemic authorities" don't believe in free speech.

The paper is on SSRN.
So is this:

These platforms are now responsible for shaping and allowing participation in our new digital and democratic culture, yet they have little direct accountability to their users. Future intervention, if any, must take into account how and why these platforms regulate online speech in order to strike a balance between preserving the democratizing forces of the internet and protecting the generative power of our New Governors.

but not this

"I am fine with this," wrote Sheryl Sandberg, Facebook's No. 2 executive, in a one-sentence message to a team that reviewed the page. Three years later, YPG's photos and updates about the Turkish military's brutal attacks on the Kurdish minority in Syria still can't be viewed by Facebook users inside Turkey.

or this

Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic “like” thumbs-up: “love,” “haha,” “wow,” “sad” and “angry.”

Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business.

Facebook’s own researchers were quick to suspect a critical flaw. Favoring “controversial” posts — including those that make users angry — could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, “It’s possible.”

You'd think these idiots would at least read the news. But it wouldn't help. Call that "expertise for realists".
