Saturday, November 05, 2022

The Atlantic, April 2022, "Shadowbanning Is Big Tech’s Big Problem"

Social-media companies deny quietly suppressing content, but many users still believe it happens. The result is a lack of trust in the internet.

Zuckerberg, 2018 (last edited May 5, 2021), "A Blueprint for Content Governance and Enforcement"

Discouraging Borderline Content 

One of the biggest issues social networks face is that, when left unchecked, people will engage disproportionately with more sensationalist and provocative content. This is not a new phenomenon. It is widespread on cable news today and has been a staple of tabloids for more than a century. At scale it can undermine the quality of public discourse and lead to polarization. In our case, it can also degrade the quality of our services.

Our research suggests that no matter where we draw the lines for what is allowed, as a piece of content gets close to that line, people will engage with it more on average -- even when they tell us afterwards they don't like the content.

This is a basic incentive problem that we can address by penalizing borderline content so it gets less distribution and engagement. By making the distribution curve look like the graph below where distribution declines as content gets more sensational, people are disincentivized from creating provocative content that is as close to the line as possible.

This process for adjusting this curve is similar to what I described above for proactively identifying harmful content, but is now focused on identifying borderline content instead. We train AI systems to detect borderline content so we can distribute that content less. 
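The penalty curve Zuckerberg describes can be sketched in a few lines. This is a hypothetical illustration only, not Facebook's actual ranking code: all function names and numbers are invented, assuming a classifier emits a "borderline score" in [0, 1] where 1.0 is the policy line.

```python
# Hypothetical sketch of the "borderline content" penalty described above.
# Assumption: a trained classifier assigns each post a borderline_score in
# [0, 1], where 1.0 means the post sits right at the policy line.

def natural_engagement(borderline_score: float) -> float:
    """Unchecked engagement rises as content approaches the line."""
    return 1.0 + 4.0 * borderline_score  # monotonically increasing

def distribution_penalty(borderline_score: float) -> float:
    """Downranking multiplier that falls toward 0 near the line."""
    return (1.0 - borderline_score) ** 2

def effective_distribution(borderline_score: float) -> float:
    """Penalized distribution: now *declines* as content gets more borderline,
    removing the incentive to post as close to the line as possible."""
    return natural_engagement(borderline_score) * distribution_penalty(borderline_score)

for s in (0.0, 0.5, 0.9, 0.99):
    print(f"score={s:.2f}  natural={natural_engagement(s):.2f}  "
          f"effective={effective_distribution(s):.3f}")
```

With the quadratic penalty, effective distribution peaks well below the line and collapses near it, which is the shape of the curve the note describes.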

The Intercept, Fang and Klippenstein:

The Department of Homeland Security is quietly broadening its efforts to curb speech it considers dangerous, an investigation by The Intercept has found. Years of internal DHS memos, emails, and documents — obtained via leaks and an ongoing lawsuit, as well as public documents — illustrate an expansive effort by the agency to influence tech platforms.

The work, much of which remains unknown to the American public, came into clearer view earlier this year when DHS announced a new “Disinformation Governance Board”: a panel designed to police misinformation (false information spread unintentionally), disinformation (false information spread intentionally), and malinformation (factual information shared, typically out of context, with harmful intent) that allegedly threatens U.S. interests. While the board was widely ridiculed, immediately scaled back, and then shut down within a few months, other initiatives are underway as DHS pivots to monitoring social media now that its original mandate — the war on terror — has been wound down.

Repeats of repeats: the Smith-Mundt Act's ban on domestic dissemination of US government propaganda was lifted in 2013.

Masnick makes an appearance here.
