By now we are all starting to accept that the volatility of 2020 unfortunately did not end with the turn of the calendar. Nothing has demonstrated this more than the attack on the United States Capitol, and since then it seems that battles are raging in just about every corner, even between Robinhood and Wall Street. Social media, often a battleground in itself, became the subject of a heated new debate when Twitter and Facebook suspended Donald Trump's accounts, and Apple, Google and Amazon Web Services then refused to distribute or host sites such as Parler on their services. It gave new life to the big question: platform or publisher? The distinction is important because it has far-reaching global implications.

While the aforementioned social media companies seek to define themselves as mere platforms, by moderating and removing content (and the people who post it) they are actively taking control of the content and content providers on their websites and making editorial decisions, just like a newspaper or online news source. In other words, they act as publishers.

In life, as in law, it is not only what you say that matters; it is also what you do. And at the end of the day, what social media companies are doing, even if it differs from what they say, means they are likely to become subject to regulation where until now they have largely been left to their own devices.

Regulation of this sector has been slow in coming, but it is clear that demands for increased regulation are accelerating from governments and corporations around the world. Initially, there will undoubtedly be a patchwork of different types of regulation, covering different activities and applying a variety of standards. Overall, we predict that these will dictate how Big Tech should apply its editorial power fairly and consistently, and they will determine the safeguards on that power and what people can do when they feel it is being used inappropriately. They should also address what happens when an individual or institution has been harmed or wronged by content posted by these publishers.

Currently, for platforms based in the United States, Section 230 of the Communications Decency Act generally grants website publishers immunity from liability arising from third-party content. While issues of jurisdiction and enforcement have so far kept that protection largely intact, the world is changing and getting smaller, and Big Tech cannot hide behind it forever.

The UK's Online Harms Bill, which is currently due to be finalized and come into force later this year, will focus on imposing a duty of care on social media platforms to protect users. Platforms will be required to assess what content might be harmful to users, even if it is not illegal, and to take steps to protect users from such harmful content. This will apply to businesses around the world whose content is accessed by UK users. The bill will include the power to impose significant fines and to block businesses' services from being accessed in the UK. Meanwhile, in Europe, the bloc is looking to push forward the Digital Services Act and the Digital Markets Act, which focus on content moderation and self-preferencing respectively. Both proposed laws provide for substantial fines.

This is where the debate gets interesting. There is an editorial spectrum. At one end is the telephone: essentially, just a platform. Anyone can say something harmful to someone over the phone, and the phone company could never be held liable because it did not edit or control that content; it merely carried the conversation. At the other end is a fully edited newspaper, where every word is written by an employee of the company and is completely under its editorial control. ISPs sit closer to phone operators, and social media companies' claims that they are just platforms have historically placed them closer to the phone companies than to the newspapers on this spectrum. But when Big Tech makes significant editorial decisions, such as removing the US President and the Iranian leader from their platforms while leaving the leaders of other countries active, it shows that they actually sit much further along the spectrum, much closer to the newspapers. Society needs clarity on how these editorial decisions are made, as well as consistent guidelines and avenues to challenge such decisions.

So why would Big Tech choose to be a publisher and therefore become subject to publisher regulation? In short, to protect their reputations, and because they have no choice if they want to maintain viable businesses that most of society wants to be a part of. Although they would face regulation, that is still better than being shut down completely or becoming marginal platforms. And public opinion is shifting, starting to demand that they take responsibility for their actions. It is undoubtedly better to have democratically elected governments regulate these publishers after debate, public consultation and input from Big Tech itself, rather than letting a few billionaires decide what the rules are, how those rules are applied and when those rules must be changed.

As the ancient Chinese curse says: may you live in interesting times. Well, we certainly do, with all the conflict and complexity that the internet, Big Tech and social media bring. Whether you work in tech or law, or are a social media commentator of some sort, things are certainly getting interesting, and that will only continue as new leaders and new countries step in and new legislation begins to be adopted and applied. We will be watching closely and will be there to advise, guide and keep you informed every step of the way.

