Wednesday, December 8, 2021

Facebook watched Trump incite hatred, according to leaked documents

COLUMBUS, Ohio (AP) – Reports of hateful and violent Facebook posts began rolling in on the night of May 28 last year, shortly after then-President Donald Trump posted a warning on social media that looters in Minneapolis would be shot.

It had been three days since Minneapolis police officer Derek Chauvin knelt on George Floyd’s neck for more than eight minutes, until the 46-year-old Black man lost consciousness and showed no signs of life. Video filmed by bystanders was viewed millions of times online. Protests gripped Minnesota’s largest city and soon spread to cities across America.

But it wasn’t until Trump posted about the unrest that reports of violence and hate speech increased “rapidly” on Facebook across the country, according to the company’s internal analysis of the ex-president’s social media post.

“These THUGS are dishonoring the memory of George Floyd, and I won’t let that happen,” Trump posted at 9:53 am on May 28 from his Twitter and Facebook accounts. “Any difficulty and we will assume control but, when the looting starts, the shooting starts!”


The former president has since been banned from both Twitter and Facebook.

Leaked Facebook documents provide a first-hand look at how Trump’s social media posts stoked further anger in an already deeply divided country, anger that ultimately showed up in reports of hate speech and violence across the platform. Facebook’s own internal automated controls, designed to flag posts that break the rules, predicted with almost 90% certainty that Trump’s post violated the tech company’s rules against inciting violence.

However, the tech giant took no action in response to Trump’s message.

The next day, offline protests, some of which escalated into violence, gripped nearly every city in the United States, large and small.

“When people look back at the role Facebook played, they won’t say Facebook caused it, but Facebook was certainly a megaphone,” said Lanier Holt, a communications professor at Ohio State University. “I don’t think there’s any way they can escape the claim that they made the situation worse.”

Meanwhile, social media rival Twitter quickly responded by hiding Trump’s tweet behind a warning label and barring users from sharing it further.

The internal Facebook discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel of former Facebook employee-turned-whistleblower Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The Wall Street Journal previously reported that Trump was among many high-profile users, including politicians and celebrities, exempted from some or all of the company’s normal enforcement policies.

Documents show that, in the days after Floyd’s death, reports of hate speech and violence were largely confined to the Minneapolis region.

“However, after Trump’s May 28 post, the situation really escalated across the country,” said an internal memo dated June 5 of last year.

Internal analysis shows that reports of violence on Facebook quadrupled and complaints of hate speech tripled in the days after Trump’s post. Reports of false news on the platform doubled. Trump’s post drew “a significant amount of hateful and violent comments,” many of which Facebook sought to remove. Some of those comments included calls to “start shooting those thugs” and “f … whites.”


By June 2, “we can clearly see that the entire country was mostly on fire,” a Facebook employee wrote in a June 5 memo on the rise in hate speech and reports of violence.


Facebook says it is impossible to separate how many reports of hate speech were driven by Trump’s post itself and how many by the broader outcry over Floyd’s death.

“This spike in user reports resulted from a critical moment in the history of the racial justice movement, not from any single Donald Trump post about it,” a Facebook spokesman said in a statement. “Facebook often reflects what is happening in society, and the only way to prevent spikes in user reports at moments like these would be to not allow them to be discussed on our platform at all, which we would never do.”

But the internal findings also raise questions about public statements Facebook CEO Mark Zuckerberg made last year as he defended his decision to leave Trump’s post untouched.

On May 29, for example, Zuckerberg said the company took a close look at whether Trump’s words violated any of its policies and concluded that they did not. Zuckerberg also said he left the post up because it warned people about Trump’s plan to deploy troops.

“I know many people are upset that we’ve left the president’s posts up, but our position is that we should enable as much expression as possible unless it will cause imminent risk of the specific harms or dangers spelled out in clear policies,” Zuckerberg wrote on his Facebook account on the night of May 29, as protests erupted across the country.

However, Facebook’s own automated controls determined that the post was likely in violation of the rules.

“Our violence and incitement classifier was almost 90% certain that this [Trump’s] post violated … Facebook policies,” the June 5 analysis said.

This contradicts conversations Zuckerberg had with civil rights leaders last year to allay fears that Trump’s post posed a specific threat to Black people protesting Floyd’s death, said Rashad Robinson, president of the civil rights group Color of Change. The group also helped lead a Facebook boycott in the weeks after Trump’s post.

“To be clear, I had a direct argument with Zuckerberg days after the post, in which he gaslit me and specifically pushed back on any notion that it violated their rules,” Robinson told the AP last week.

A Facebook spokesman said the company’s internal controls do not always correctly predict when a post breaks the rules, and that human review, such as the review applied to Trump’s post, is more accurate.


To limit the ex-president’s ability to incite hatred on its platform, Facebook employees last year suggested that the company restrict re-shares of similar posts that might violate Facebook’s policies in the future.

But Trump continued to use his Facebook account, with more than 32 million followers, to rally his supporters for much of the remainder of his presidency. In the days leading up to the deadly Jan. 6 siege in Washington, Trump regularly promoted false claims that massive electoral fraud had cost him the White House, prompting hundreds of his supporters to storm the U.S. Capitol and demand that the results of a fair election be overturned.

It was only after the Capitol riot, as Trump was leaving the White House, that Facebook pulled him off the platform in January, announcing that his account would be suspended until at least 2023.

There’s a reason Facebook waited so long to take action, said Jennifer Mercieca, a professor at Texas A&M University who has closely studied the former president’s rhetoric.

“Facebook really benefited from Trump and Trump’s ability to attract attention and engagement through outrage,” Mercieca said. “They wanted Trump to keep going.”
