Facebook is launching a campaign to help people detect fake news amid a growing advertising boycott that puts pressure on the company to tackle misinformation and hate speech.
Steve Hatch, Facebook’s vice president for Northern Europe, says the media literacy campaign, developed with the fact-checking charity Full Fact, is proof that the company is ‘listening and adapting’.
However, some experts and critics say the efforts in the UK, Europe, Africa, and the Middle East are ‘too little, too late’.
The campaign will direct people to the StampOutFalseNews.com website and ask users three key questions about what they see online: “Where does it come from?” “What’s missing?” and “How did you feel?”
In an exclusive interview with the BBC, Mr. Hatch said “financial considerations” were not behind the new ads.
In recent days, more than 150 companies – including Coca-Cola, Starbucks, and Unilever – have announced a temporary halt to ad purchases on Facebook as a result of the #StopHateForProfit campaign.
“Night and day”
Misinformation, or viral ‘fake news’, has been a persistent issue on the social network for years, and it flared up dramatically with the onset of the Covid-19 pandemic.
In May, a BBC investigation found links between coronavirus misinformation and assaults, arson and deaths, with potential – and possibly much greater – indirect damage caused by rumors, conspiracy theories, and poor health advice.
Mr. Hatch says Facebook employees worked “night and day” to tackle false claims during the pandemic.
“If people shared information that could cause real harm, we would remove it. We did that in hundreds of thousands of cases,” he says.
But the media literacy push is ‘too little, too late’, says Chloe Colliver, head of the digital research unit at the Institute for Strategic Dialogue, a counter-extremism think tank.
“We’ve seen Facebook try to take reactive and often fairly small steps to stem the flood of disinformation on the platform,” Ms. Colliver said. “But they have not been able to proactively develop policies that protect users from seeing disinformation, fake identities, fake accounts, and fake popularity on their platforms.” Facebook also owns Instagram and WhatsApp.
Facebook and other social media outlets have also come under pressure over misleading information or comments that are likely to incite violence, particularly posts by US President Donald Trump.
After widespread protests following the death of George Floyd, the president warned: “Any difficulty and we will assume control but, when the looting starts, the shooting starts.”
Twitter hid the message for ‘glorifying violence’, but it remained on Facebook.
Mr. Hatch says the US president’s posts ‘come under high scrutiny’ by Facebook bosses. Mark Zuckerberg, the chief executive, had earlier said the post did not violate Facebook’s rules, stating that the company interpreted it as a reference to the possible deployment of National Guard troops.
“Whether you are a political figure or anyone else on the platform,” says Mr. Hatch, you will be held to account for sharing posts that could cause real harm.