Friday, May 24

What to know about the Supreme Court case on free speech on social media

Social media companies are preparing for Supreme Court arguments Monday that could fundamentally alter how they police their sites.

After Facebook, Twitter and YouTube barred President Donald J. Trump following the January 6, 2021, riot at the Capitol, Florida made it illegal for tech companies to ban a candidate for office in the state from their sites. Texas later passed its own law prohibiting platforms from removing political content.

Two technology industry groups, NetChoice and the Computer & Communications Industry Association, sued to stop the laws from taking effect. They argued that companies have the right to make decisions about their platforms under the First Amendment, just as a newspaper decides what to publish on its pages.

The Supreme Court’s decision in these cases – Moody v. NetChoice and NetChoice v. Paxton – is a major test of the power of social media companies, potentially reshaping millions of social media feeds and giving the government influence over what stays online.

“What’s at stake is whether they can be forced to publish content they don’t want,” said Daphne Keller, a professor at Stanford Law School who filed a brief with the Supreme Court supporting the tech groups’ challenge to the Texas and Florida legislation. “And, perhaps more to the point, whether the government can force them to post content they don’t want.”

If the Supreme Court were to say that the Texas and Florida laws are constitutional and let them take effect, some legal experts speculate that companies could create versions of their feeds specifically for those states. However, such a ruling could prompt similar laws in other states, and it is technically complicated to limit access to a website precisely by location.

Critics of the laws say feeds sent to the two states could include extremist content – from neo-Nazis, for example – that the platforms previously removed for violating their standards. Or, critics say, the platforms could ban discussion of anything remotely political by blocking posts on many controversial issues.

The Texas law prohibits social media platforms from removing content based on the viewpoint of the user or the viewpoint expressed in the post. The law gives individuals and the state’s attorney general the right to sue platforms for violations.

The Florida law fines platforms if they permanently ban a candidate for office in the state from their sites. It also prohibits platforms from removing content from a “journalistic enterprise” and requires companies to be upfront about their rules for moderating content.

Supporters of the Texas and Florida laws, passed in 2021, say they will protect conservatives from the liberal bias that they say pervades California-based platforms.

“People around the world use Facebook, YouTube and X (the social media platform formerly known as Twitter) to communicate with friends, family, politicians, journalists and the general public,” Texas Attorney General Ken Paxton said in a legal brief. “And like the telegraph companies of old, today’s social media giants use their control over the mechanisms of this ‘modern public square’ to direct – and often stifle – public discourse.”

Chase Sizemore, a spokesman for Florida’s attorney general, said the state “looks forward to defending our social media law that protects Floridians.” A spokesperson for the Texas attorney general had no comment.

For now, social media companies decide what stays online and what doesn’t.

Companies including Meta, the owner of Facebook and Instagram, as well as TikTok, Snap, YouTube and X have long policed themselves, setting their own rules for what users can say while the government has taken a hands-off approach.

In 1997, the Supreme Court ruled that a law regulating indecent speech online was unconstitutional, distinguishing the internet from media in which the government regulates content. The government, for example, imposes decency standards on television and radio broadcasts.

For years, bad actors have flooded social media with misleading information, hate speech and harassment, prompting companies to draft new rules in the last decade that include bans on false information about elections and the pandemic. Platforms have banned figures such as influencer Andrew Tate for violating their rules, including against hate speech.

But there has been a right-wing backlash to these measures, with some conservatives accusing the platforms of censoring their views. The criticism even prompted Elon Musk to say in 2022 that he wanted to buy Twitter to help ensure users’ free speech.

Thanks to a law known as Section 230 of the Communications Decency Act, social media platforms are not held liable for most content posted on their sites. They therefore face little legal pressure to remove problematic posts and users who violate their rules.

Technology groups say the First Amendment gives companies the right to remove content as they see fit, because it protects their ability to make editorial choices about the content of their products.

In their lawsuit challenging the Texas law, the groups argued that, just like a magazine’s publishing decision, “a platform’s decision about what content to host and what to exclude is intended to convey a message about the kind of community the platform hopes to promote.”

However, some legal scholars are concerned about the implications of granting social media companies unfettered power under the First Amendment, which is intended to protect both free speech and a free press.

“I worry about a world in which these companies invoke the First Amendment to protect what many of us believe is nonexpressive commercial activity and conduct,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to Lina Khan, the chair of the Federal Trade Commission.

The court will hear arguments from both sides on Monday. A decision is expected by June.

Legal experts say the court could rule that the laws are unconstitutional but provide a road map for how to fix them. Or it could fully back the companies’ First Amendment rights.

Carl Szabo, the general counsel of NetChoice, which represents companies including Google and Meta and lobbies against tech regulations, said that if the group’s challenge to the laws fails, “Americans across the country would be required to see lawful but awful” content that could be interpreted as political and therefore covered by the laws.

“There are a lot of things that are presented as political content,” he said. “Terrorist recruitment is undoubtedly political content.”

But if the Supreme Court rules that the laws violate the Constitution, it will reinforce the status quo: The platforms, and no one else, will determine what speech remains online.

Adam Liptak contributed reporting.