One more time: Facebook is not a "platform"; it is a "publisher".
ABC: Facebook employees questioned apparent restrictions on Palestinian activist's account: Documents
Earlier this year, multiple Facebook employees questioned the apparent restrictions on well-known Palestinian activist Mohammed El-Kurd's Instagram account, according to internal Facebook documents shared with ABC News and a group of other news organizations.
The document, titled "Concerns with added restrictions/demotions on content pertaining to Palestine," shows concern among some employees over content moderation decisions during the May escalation of violence in Gaza and the West Bank.
Politico: Facebook staff complained for years about their lobbyists’ power
Facebook says it does not take the political winds of Washington into account when deciding what posts to take down or products to launch.
But a trove of internal documents shows that Facebook’s own employees are concerned that the company does just that — and that its Washington, D.C.-based policy office is deeply involved in these calls at a level not previously reported.
The lobbying and government relations shop, overseen by former Republican operative Joel Kaplan, regularly weighs in on speech-related issues, such as how to deal with prominent right-wing figures, misinformation, ads from former President Donald Trump and the aftermath of the George Floyd protests in June 2020, according to internal reports, posts from Facebook’s staff and interviews with former employees. The dynamic is so prevalent that employees argued internally that Facebook regularly ignored its own written policies to keep political figures happy, even overriding concerns about public safety.
“Facebook routinely makes exceptions for powerful actors when enforcing content policy,” a Facebook data scientist wrote in a December 2020 presentation titled “Political Influences on Content Policy.” It added: “The standard protocol for enforcement and policy involves consulting Public Policy on any significant changes, and their input regularly protects powerful constituencies.” The public policy team includes the company’s lobbyists.
WaPo: Five points for anger, one for a ‘like’: How Facebook’s formula fostered rage and misinformation
Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic “like” thumbs-up: “love,” “haha,” “wow,” “sad” and “angry.”
Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business.
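The weighting WaPo describes can be sketched as a simple scoring function. Everything here is a hypothetical illustration (the function, the reaction names as dictionary keys, the idea of summing counts); the only detail taken from the reporting is that emoji reactions counted five times as much as a "like."

```python
# Illustrative sketch of a ranking signal that weights emoji
# reactions 5x a "like", per WaPo's reporting. The names and the
# scoring formula are hypothetical, not Facebook's actual code.

LIKE_WEIGHT = 1
EMOJI_WEIGHT = 5  # applied to love/haha/wow/sad/angry

EMOJI_REACTIONS = ("love", "haha", "wow", "sad", "angry")

def engagement_score(reactions: dict) -> int:
    """Score a post from its per-reaction counts."""
    score = reactions.get("like", 0) * LIKE_WEIGHT
    for emoji in EMOJI_REACTIONS:
        score += reactions.get(emoji, 0) * EMOJI_WEIGHT
    return score

# Two posts with the same number of total reactions: the one
# drawing "angry" reactions outscores the one drawing "likes" 5:1,
# so it rises in a feed ranked by this signal.
calm_post = {"like": 100}
angry_post = {"angry": 100}
print(engagement_score(calm_post))   # 100
print(engagement_score(angry_post))  # 500
```

Under such a scheme, any content that reliably provokes a strong emotional reaction is amplified, which is exactly the spam/abuse/clickbait risk the staffer below flags.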
Facebook’s own researchers were quick to suspect a critical flaw. Favoring “controversial” posts — including those that make users angry — could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, “It’s possible.”
Free speech is for fascists, not monopolists.