The Knight First Amendment Institute, again. It's an absolute disaster for free speech doctrine.
Views on First, Episode Four
Social media platforms make more decisions about free speech every minute than the Supreme Court has made in more than 200 years. So the values and systems adopted by these corporate speech overlords matter a lot. Guests Nicole Wong—former Google VP and Twitter exec—and Alex Stamos—director of the Stanford Internet Observatory and former Facebook chief security officer—tell the story of how Big Tech stumbled its way through developing systems of speech regulation, from the early laissez-faire days to the controversies of today. Turns out that speech regulation is hard.
Host Evelyn Douek, a "self-described content-moderation nerd"
For those of you who have been with us so far, you'll know that we've been thinking and talking a lot about how the First Amendment should adjust to the new challenges of the platform era.
Because adjusting the platforms to fit the First Amendment is unthinkable.
And that shouldn't be a surprise. We're a First Amendment Institute and so we spend a lot of time thinking about the First Amendment and legal doctrine. But most of our free speech debates happen outside the courts. They're about norms and rules that are not legal or constitutional issues, but social ones. And increasingly, they're corporate ones. Because alongside the story that we've been telling about how tech platforms have collided with the First Amendment, there's another story about how tech platforms have collided with different understandings of free speech. And that's an important story because tech platforms are perhaps the most important speech regulators in the world.
But the so-called "must carry" provisions are a little different in the two laws, and they're both long, convoluted, hard-to-read, poorly drafted laws. I don't think the lawmakers thought about a future of anybody actually complying; I think it was just really fun for them to draft these laws. So the Texas one, the key provision says that platforms can't discriminate on the basis of viewpoint when they remove or demote or demonetize or otherwise moderate content by their users. And I think what these legislators were thinking was, you can't discriminate against, you know, Tucker Carlson, stop censoring conservative voices. But what it really means to not discriminate on the basis of viewpoint, while the details are somewhat debated, is that if you are leaving up the anti-teen-anorexia videos then you have to also leave up the pro-teen-anorexia videos. If you're leaving up the anti-suicide videos you [can't] take down the pro-suicide videos. If you're leaving up the claims that the Holocaust is real, you also have to leave up the claims that the Holocaust is not real. So there's just this very grim array of consequences from that rule requiring viewpoint neutrality.
The"very grim array of consequences" is the definition of free speech. So these go here.
A German-Israeli singer Nirit Sommerfeld taking part in a Klezmer concert in Munich got a letter from authorities that if she utters the word BDS or says anything they deem anti-Semitic, they will stop the concert. She is also the child of Holocaust survivors h/t @BartalYossi
— Mairav Zonszein מרב זונשיין (@MairavZ) September 28, 2019
Amnesty International, September 29, 2022
Beginning in August 2017, the Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims. This report is based on an in-depth investigation into Meta (formerly Facebook)’s role in the serious human rights violations perpetrated against the Rohingya. Meta’s algorithms proactively amplified and promoted content which incited violence, hatred, and discrimination against the Rohingya – pouring fuel on the fire of long-standing discrimination and substantially increasing the risk of an outbreak of mass violence. The report concludes that Meta substantially contributed to adverse human rights impacts suffered by the Rohingya and has a responsibility to provide survivors with an effective remedy.
Defend free speech but not incitement promoted by a global monopoly. It's so fucking obvious. For FB, incitement is the model. It's the definition of engagement, and engagement is money. Balkin at this point just depresses me.
Two from Deborah Lipstadt. Recognize the name?
Donald Trump’s "Inadvertent Anti-Semitism"
Is Donald Trump a committed anti-Semite? I don’t think so. This is a man who is exceptionally proud of his daughter, a traditional Jew who is giving her children a solid Jewish education. His son-in-law, upon whose advice he increasingly leans, is an Orthodox Jew. This is not the profile of an anti-Semite. I also think that the comparisons between Trump and Hitler are way over the top.
"Jimmy Carter's Jewish Problem"
It is hard to criticize an icon. Jimmy Carter's humanitarian work has saved countless lives. Yet his life has also been shaped by the Bible, where the Hebrew prophets taught us to speak truth to power. So I write.
Lipstadt: "Anti-Zionism is anti-Semitism." Karl Popper was an anti-Zionist. Was he an anti-Semite?
A slew of internal communications and depositions taken by Dominion as part of its discovery process has left many legal experts warning that Fox could be on shaky legal footing. Dominion argues the vignettes contained in its court filings demonstrate how top hosts and executives at Fox knew the claims being pushed by Trump’s associates about Dominion were false but aired them anyway. “One just doesn’t see cases like this in defamation,” said Catherine Ross, a constitutional law professor at George Washington University who specializes in First Amendment issues. “Fox does not appear to have any plausible defense, particularly in light of what Dominion uncovered in discovery of real-time knowledge of falsity,” she said.
I hope the lawyers bring down Fox and Facebook, because the law professors ain't doin' shit. Get the joke?