Supporters of the law in Texas, and of a similar law in Florida, said the legislation would prevent tech companies from engaging in censorship by prohibiting them from deleting posts featuring political viewpoints with which they disagree. But the wording of the Texas law effectively prohibits companies from moderating or blocking any content that isn’t already illegal, paving the way, experts say, for terrorist recruitment, white supremacist organizing, posts encouraging eating disorders, vaccine misinformation, and other harmful material that many websites currently ban.
Although both states’ laws are the product of conservative lawmakers, the Fifth Circuit’s ruling on the Texas law contradicts some long-standing Supreme Court views supporting First Amendment protections for businesses — views that conservatives have traditionally embraced. It also conflicts with a May ruling by the U.S. Court of Appeals for the 11th Circuit striking down a similar Florida law. The split means the law is likely to be reviewed by the U.S. Supreme Court, where conservative justices have repeatedly upheld companies’ First Amendment rights, as in Citizens United, a 2010 ruling that overturned long-standing limits on corporate campaign spending that the court said restricted corporations’ right to express themselves politically.
Despite their hope that the Supreme Court will eventually strike down the law, Silicon Valley companies are beginning to prepare for worst-case scenarios, gaming out responses in planning exercises called “sandboxing,” said Carl Szabo, vice president and general counsel of NetChoice, one of the tech industry groups that challenged the Texas law. Its members include Meta, TikTok, Google, Nextdoor and dozens of other services.
The planning falls into four broad areas, the most sweeping of which is the possibility of shutting down services entirely in Texas and in any other states where copycat bills have been introduced.
Tech companies could also create “pop-up screens” that greet users, warning them that the material they are about to see could be highly disturbing and giving them the option to opt in to a more moderated environment, said Daphne Keller, director of the platform regulation program at Stanford University’s Cyber Policy Center.
Companies have also explored the risky proposition of stopping all moderation, essentially complying with the law to the letter, and waiting for a mass public outcry or an exodus of users from their products. And some have floated the idea of “lobotomizing” the content on their services, making it so bland that there is no reason to remove anything, said Matt Schruers, president of the Computer & Communications Industry Association (CCIA), the other tech industry group fighting the law.
“The unifying factor of all of these options is utter confusion,” Schruers said.
Szabo said tech companies have “actually sat down and tried to figure out how to implement the Texas law,” but right now most of the options appear technically impossible, legally questionable, or likely to cost the companies tens of millions of users.
“Some of the greatest technical minds on the planet have come together, but they can’t make it work because what Texas and Florida are basically doing is asking the platforms to square the circle,” he said.
Experts have likened the law to forcing Barnes & Noble bookstores to stock copies of Adolf Hitler’s manifesto, Mein Kampf, or forcing newspapers such as The Washington Post to publish op-eds by self-proclaimed neo-Nazi candidates.
Tech companies built up their ability to remove, downgrade and moderate content on their services reluctantly, at first doing the bare minimum to comply with U.S. laws that prohibit services from hosting copyrighted material or child pornography, and with European laws that prohibit pro-Nazi speech. In its early years, Facebook tried to distinguish itself from its then-competitor Myspace by setting a higher bar for acceptable content — banning outright nudity and language calling for violence, for example — and hiring a small number of moderators to enforce its rules.
But the company soon ran into the complexities of content moderation when it mistakenly deleted a famous Vietnam War photo of a naked girl fleeing napalm bombs dropped by South Vietnamese planes. After protests, the company restored the photo and added a newsworthiness exception to its no-nudity policies.
In 2017, Silicon Valley social media companies were dragged before Congress to account for revelations that Russian operatives had spread misinformation widely on their services during the previous year’s presidential election. In response, companies such as Facebook and Google-owned YouTube hired tens of thousands of moderators, essentially spawning a content moderation industry overnight. With each new rule, tech companies hired more moderators and developed software to filter out potentially problematic content.
The pandemic led to more rules and more takedowns, by both people and algorithms, as companies banned pandemic-related misinformation such as posts opposing masks or promoting fake cures.
The content moderation boom reached an inflection point after the Jan. 6, 2021, riot at the U.S. Capitol, when tech companies banned former president Donald Trump’s social media accounts. Trump’s ban provoked a conservative backlash, leading to the laws in Florida and Texas.
Concerns that social media sites were too slow to tackle misinformation and calls for violence have also prompted liberal legislative responses. A California law passed last month requires platforms to file reports twice a year with the state attorney general setting out their content moderation policies regarding hate speech, misinformation and extremism.
There are no similar federal laws.
Because the Texas law applies to any tech service with more than 50 million users, experts say it would also cover companies that have nothing to do with political speech, such as Pinterest, Etsy and Yelp. Those companies are in an even tougher position than the big platforms because they lack the financial means to withstand all the challenges they might face under the law, said Alex Feerst, the former head of legal at the social media site Medium and a consultant to technology companies on content moderation issues.
In theory, he said, the law could prevent a company like Etsy from removing pro-Nazi statements posted as part of a listing for a personalized crib. It also allows any individual to sue on the grounds that they have been discriminated against, subjecting midsize businesses to a wave of litigation that could be crippling.
“It’s a headache for smaller companies because they don’t have the resources the big companies have, but they could still be sued by anyone,” Feerst said.
Keller said some of the options tech companies are considering would be a minefield to navigate — technically, legally and in terms of impact on a company’s business.
The strategy of shutting down a service in one state could be technically difficult and would be extremely expensive, since Texas is the second-most-populous state in the country (Florida is the third). It would also be hard for companies to detect whether a Texas resident was logging in from another state.
The pop-up option might not be legally enforceable because Texas officials might argue that users aren’t really giving consent to moderation, Szabo said.
Removing all political material from a social media service probably wouldn’t work because just about anything could be construed as a political point of view, Schruers said.
Experts said the assumption that the Supreme Court would strike down the law is also risky in the wake of the Dobbs decision, which overturned the landmark abortion ruling Roe v. Wade. Even a decision striking down some aspects of the law while allowing other parts to take effect would send shock waves through Silicon Valley.
Keller said an outcome that leaves parts of the law intact would dramatically change the way technology and media companies do business, perhaps forcing them to rewrite the algorithms that serve content, fire thousands of moderators, and upend their approach to policing speech.
“There’s a very turbulent legal landscape ahead,” she said. “It’s like Dobbs in the sense that everyone feels the law is up for grabs, that judges will act according to their political convictions and would be willing to disregard precedent.”