Facebook’s independent oversight board rules Wednesday on the platform’s ban of former US President Donald Trump, in a case that could set a precedent for how social media handles harmful content from world leaders.
The ruling, to be released at 1300 GMT on Wednesday, is likely to be a defining moment for the leading social network’s so-called “supreme court,” which founder Mark Zuckerberg created to make thorny decisions about what to allow or remove from Facebook.
The board, whose decisions are binding on Facebook and cannot be appealed, will decide whether to uphold the Trump ban or allow him back on the platform. It can also make related recommendations to the California-based social media giant.
“This is a huge decision, it gets a lot of attention and deserves it,” said Daniel Kreiss, a professor and researcher at the University of North Carolina who specializes in politics and social media.
“This is significant for the global precedent it will create. If they uphold the decision, I think you will see more robust enforcement around the world.”
The oversight panel, made up of lawyers, policy experts, journalists and others from around the world, will hand down perhaps its most important decision at a time when social platforms are struggling to remain open to political discourse while filtering out incitements to violence, misinformation and abuse.
“This Facebook oversight decision is an important litmus test of whether technology companies can effectively regulate themselves,” said Lindsay Gorman, an emerging technologies expert at the non-profit Alliance for Securing Democracy.
The oversight board is Facebook’s “best attempt to ward off the threat of government regulation,” Gorman added.
Trump was suspended from Facebook after posting a video during the deadly January 6 rampage by his supporters at the US Capitol, in which he told them: “We love you, you’re very special.”
The US leader was permanently banned by Facebook the following day and was removed from other platforms including Twitter and YouTube.
Some analysts said Facebook and other social networks should have acted against Trump earlier, after years of granting him exemptions from rules on hateful content because of his “newsworthiness” as a political leader.
“If anything, the decision to initially ban … Trump should have come much earlier,” said Samuel Woolley, a professor at the University of Texas who specializes in computational propaganda.
“He used Facebook and other platforms to actively spread blatantly false content about election processes – very effectively undermining US democracy.”
Trump has remained undeterred by the bans, often sending out statements via email and, on Tuesday, launching a blog-style website he described as “a beacon of freedom” and a “place to speak freely and safely.”
Visitors to the site can like Trump’s posts and share them on Facebook and Twitter.
Facebook itself referred Trump’s case to the oversight board, in line with its position that company executives should not be the ones making major decisions about content and political speech. The panel has received more than 9,000 comments on the matter.
But the move by Facebook and others has also drawn a stream of criticism from Trump supporters, who claim that major tech platforms are biased and stifle opposing views.
The ban has raised concerns from others as well, including German Chancellor Angela Merkel, who called Facebook’s moves “problematic,” and from civil liberties activists.
Jameel Jaffer, executive director of Columbia University’s Knight First Amendment Institute, said the issue is more complex than just evaluating Trump’s comments.
“I hope the board will use this case as an opportunity to focus on Facebook’s decisions about the design of its platform,” Jaffer said.
“These design decisions are often invisible, but they determine which content spreads on the platform, how fast it spreads, who sees it and in what context they see it.”
In its submission to the board, the institute said Facebook should conduct “an independent study of how its platform may have contributed to the events of January 6,” and that the panel should rule on Trump’s case “only after the company has given it the results of that study.”
Elizabeth Renieris, head of the Notre Dame-IBM Technology Ethics Lab, said the ruling is unlikely to end the controversy over content moderation.
“The board’s analysis and reasoning in this case could very well help shape Facebook and other digital platform policies regarding how to treat political leaders and other public figures in the future,” she said.
“Regardless of the decision, we should be concerned that decisions of this kind are made by unelected, unaccountable companies and their self-appointed assessors.”