For most companies, the publication of a 6,600-word internal memo penned by a recently fired employee, in which they alleged their job had left them with “blood on their hands”, would mark a catastrophic day at the office.
But for social media platform Facebook, the incident was just one more in a catalogue of troubling stories to play out across 2020: from its ruling that a post by President Donald Trump including the remarks “when the looting starts, the shooting starts” did not violate its rules against encouraging or glorifying violence, to a month-long advertising boycott of the site.
Facebook is a massive fundraising tool for charities, with some spending hundreds of thousands of pounds a month on advertising on the platform. But as evidence of its allegedly dodgy dealings grows, do charities have any real chance of avoiding it while still reaching the same number of supporters and service users?
Charities Against Hate
US-based campaign group Stop Hate For Profit organised a month-long boycott of Facebook in July, to protest against the platform’s perceived failure to remove or moderate hate speech and misinformation published on the site. The protest swelled to become the biggest advertiser boycott in Facebook’s history, with more than 1,100 companies, including major corporations Disney, Unilever and Volkswagen, pulling their adspend from the platform.
As news of the boycott spread, many column inches were spent questioning whether this time would be different from previous actions, or could even kill the platform off completely. But when the boycott came to an end at the end of July, some of Facebook’s more public-spirited critics called for it to be extended, warning that otherwise it would look like nothing more than a “PR stunt”.
In response to the US campaign, 37 UK charities – including Barnardo’s, Mind and Parkinson’s UK – banded together to form a coalition called Charities Against Hate. Each charity agreed to scale back its paid Facebook advertising where possible. The coalition also formed working groups to review how participating charities communicate via Facebook and other social media platforms, how the site affects their fundraising efforts and their ability to support service users, and to recommend changes to social media owners.
Sarah Clarke, head of memberships at the comms professionals’ network CharityComms and an organiser of the Charities Against Hate campaign, says the platforms need to take action to be more inclusive and to become places for connection and debate, not hate.
“As charities, we recognise that these platforms have a role to play in allowing us to connect with supporters and beneficiaries from all backgrounds. But we also know that not enough is being done to stop posts which incite hate and violence being made visible,” she says.
“No one should have to see these messages in their day-to-day lives, especially not when trying to access ongoing information and support.”
Human rights charity Freedom from Torture paused all of its paid advertising on Facebook and associated platforms in July.
“We cannot ignore how Facebook algorithms have monetised and amplified hate speech, and failed to take responsibility for the racism, discrimination and bias that is disseminated across its platforms,” says Sam Afhim, the charity’s director of fundraising and comms.
“The unprecedented reach of fake news and dangerous, inflammatory far-right content has a direct impact on the survivors we work with.”
However, he adds, there are also trade-offs between taking a stand against online hate and fundraising, and amplifying campaigns and highlighting injustice.
Mental health charity Mind joined the boycott because of the clear links between racism – and other forms of discrimination – and mental ill-health. Aside from the general impact of such content on wellbeing, the charity says, people from marginalised groups are more likely to experience mental health problems and more likely to be failed by mental health services, which don’t always cater to their needs.
For Aisling Green, an organiser of the Charities Against Hate campaign, the hate speech and misinformation that abounds on social media is not just a problem for service users, but can affect the mental health and wellbeing of charity staff. It is something she has witnessed in her full-time job at Parkinson’s UK, which participated in the Facebook boycott.
“There are times when we witness hate speech, especially during the most recent Black Lives Matter movement,” she says. “We have to ensure staff have access to talking therapy – and it’s not just Facebook, but all social media.”
Working together, the Charities Against Hate coalition hopes to press Facebook and its associated platforms to drive meaningful, positive change and crack down on hate speech and misinformation.
“Together, we’ll review our own ethical marketing policies, and see how these align with those of our partners,” Clarke says.
“And where there is difference, we’ll be taking combined recommendations to the social media platforms to show them how they can do better for those we support.”
‘These advertisers will be back’
As the dust settled on the boycott, its impact was unclear. Estimates vary, but analysts suggest up to $200m may have been lost in pulled ads – yet Facebook’s share price remained resilient, with its Q2 earnings statement reporting an 11 per cent year-on-year rise in revenue.
Although that period doesn’t include the boycott, the company is bullish about Q3, which does; Facebook stated that ad revenue for the first three weeks of July’s boycott “tracked with the rest of the year”. Market analysts predict a similarly strong performance, with a further increase in revenue of around 10 per cent.
Facebook founder and chief executive Mark Zuckerberg remained outwardly unconcerned about the boycott. “We’re not gonna change our policies or approach [to] anything because of a threat to a small per cent of our revenue, or to any per cent of our revenue,” he told staff, according to a transcript leaked online. “My guess is that all these advertisers will be back on the platform soon enough.”
Whether or not Zuckerberg agrees to further tighten the site’s carefully crafted rules boils down to a more fundamental question: does Facebook need big-brand advertisers – and charities – more than they need Facebook?
A quick scan of the first page of the platform’s Ad Library, a tool that was set up to help improve transparency, is revealing: 12 of the 20 biggest spenders across a 90-day period that spanned the Charities Against Hate boycott were charities.
This is not really surprising: for many charities Facebook has become a vital tool for fundraising and reaching service users, building relationships and awareness, supporting their services, and providing vital and trusted information.
The platform claims to host the biggest online community of volunteers, donors and activists in the world. Its fundraising tools, which are now available in more than 20 countries, include the ability to have a donate button on Facebook posts and Instagram Stories, as well as a donate option on live videos.
And as the coronavirus pandemic has forced even more charities to pivot their services and fundraising efforts online, the platform’s value is only likely to increase.
“We use Facebook and Instagram to engage with people looking for information and support on mental health problems, as well as to deepen engagement with our supporters by showing the impact of our work, and asking them to get involved with our membership, campaigning and fundraising activity,” explains Emma Dalby Bowler, head of digital engagement at Mind.
“We find these platforms to be really useful tools to encourage sign-ups to fundraising events, as well as one-off and regular donations, which are vital to support our work.”
Other charities are wary of the potential impact on fundraising if they were to leave the platform.
“Advertising on the social network helps us raise vital funds to drive forward our campaigns against climate and nature breakdown,” Friends of the Earth told Third Sector in a statement.
“So, while financially we can’t stop all advertising with immediate effect, we have committed to working with the sector to force longer-term change, and will take further action if they fail to improve.”
The fundraising clout of the platform is undeniable. Since January 2020, people have pledged more than $65m to coronavirus-related fundraisers on Facebook and Instagram, and in February the company announced that people had raised more than $3bn for personal fundraisers and non-profit causes on Facebook. More than 45 million people have started, or donated to, a fundraiser on the platform.
Yet, as the platform’s catalogue of dubious choices grows, charities may have to start squaring tricky ethical questions about using it.
In May, Facebook left up a post from President Donald Trump that included the remarks “when the looting starts, the shooting starts”, determining that it did not violate its rules against inciting violence. Twitter, by contrast, did not remove the same post but hid it behind a warning in Trump’s timeline, flagging it with a brief explanation of the decision.
In August, Facebook failed to remove a Kenosha Guard militia event that called for violence, posted a week before a fatal shooting at a Wisconsin racial justice protest.
After the shooting, the company took down a page for the group, saying it hadn’t acted sooner because of an “operational mistake”. Two lawyers have since filed a case against Facebook over the incident, citing the platform’s negligence after 450 people flagged the event before the killing took place. It could leave the company facing a federal judgement requiring it to enforce its own standards.
And in the months since the advertising boycott, several whistleblowers have come forward to publicly criticise the company for still not doing enough to combat hate speech.
Former Facebook engineer Ashok Chandwaney said in a post shared publicly on the platform and internally with co-workers: “I’m quitting because I can no longer stomach contributing to an organisation that is profiting off hate in the US and globally.”
And in mid-September, an explosive memo by recently fired Facebook employee Sophie Zhang alleged the company had ignored or been slow to act on evidence that fake accounts were undermining elections and political affairs around the world. The memo included the dramatic line: “I’ve got blood on my hands.”
Part of the dilemma for charities is that there are no competitors offering a chance for organisations to take their adspend elsewhere and still reach the same number of service users. Parkinson’s UK usually has an average organic Facebook reach of about 35,000 users a month, Green says – but this rose sharply during the pandemic, to 305,000 monthly users.
“If they were a supplier and we could go somewhere else for some of the services, then yes, I would,” she adds.
Third Sector reached out to the three biggest Facebook spenders identified by the platform’s Ad Library – the International Rescue Committee, the British Red Cross and WWF UK – to discuss how important the site is to their organisations, and why they didn’t take part in the boycott.
All three declined an interview, although WWF said it did in fact take part in the boycott, without explaining how. Housing and homelessness charity Shelter did not participate, but said it shares the values of the campaign and admires the organisations that did.
“It is not acceptable for the content of any social media platform to drive hatred towards people based on age, disability, race, religion, beliefs, sexual orientation or anything else,” a Shelter spokesperson said in a statement.
“We urge all social media platforms to make sure racism and other hate speech is eradicated, so they are a safer place for the communities that use them. As organisations seeking to be anti-racist in all we do, we recognise that we still have much to learn and to do.”
Shelter also questioned whether private companies that took part in the boycott could reinvest their budgets into the work of non-profit organisations either working with or aligned with the values of #StopHateForProfit.
“As organisations that don’t ‘profit’ from Facebook but raise vital funds… we show our solidarity by calling out hate on Facebook and highlighting the damage it causes to the groups we work with,” the spokesperson said.
Beyond a boycott
The reality is that the prolonged abstention of charities alone – or even the combined efforts of the many businesses and not-for-profits that boycotted the platform during Stop Hate For Profit – would barely make a dent in a company with a reported market value of $740bn. Any discussion on tackling hate speech online therefore inevitably leads to questions on the need for more regulation of social platforms.
Jonathan Chevallier, the chief executive of Charity Digital, says the government could do more. “We need to spotlight and encourage more action – newspapers and other publishers are regulated, and this is a form of media that is arguably more pervasive,” he argues. “We don’t allow subliminal advertising, so why do we allow content that is often targeted at people who may be more susceptible to misinformation and hate speech to go unchecked?”
And yet, greater regulation has so far eluded legislators on both sides of the Atlantic. Zuckerberg’s 2018 appearance before the Senate to give evidence about the Cambridge Analytica data-sharing scandal revealed that many senators failed to understand Facebook’s business model.
During the grilling, Senator Patrick Leahy produced printouts of various Facebook groups and asked if they were Russian propaganda groups, as if Zuckerberg reads every single post on the platform himself, while Senator Orrin Hatch asked how Facebook could sustain a business model while running as a free service. Zuckerberg seemed barely able to keep a straight face as he responded: “Senator, we run ads.”
And while UK MPs appeared to have a much better grasp of the platform’s business model, they made no secret of the fact that they found it difficult dealing with Facebook during a DCMS select committee inquiry into the platform.
Then-chair Damian Collins had strong words for the social network and its founder: “We believe that in its evidence to the committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions,” he said.
The committee’s report concluded that Zuckerberg failed to show “leadership or personal responsibility” over fake news, and called for a compulsory code of ethics for tech firms, overseen by an independent regulator with powers to launch legal action against companies that breach the code.
Among the report’s recommendations is that social media companies be forced to take down known sources of harmful content, including proven sources of disinformation.
Facebook welcomed the digital select committee’s report and said it would be open to “meaningful regulation”. But while the government agreed with most of the recommendations in the report, progress toward implementation has been slow, and disinformation continues to spread rapidly.
A Covid-19 Infodemic Report published in July details evidence on a range of harms, from dangerous hoax treatments to conspiracy theories that led to attacks on 5G engineers, which were largely spread on social media.
The spread of misinformation is an issue charity digital expert Zoe Amar has noticed over the past few months. “It needs some consistency, and more of it,” she says.
“There is lots of anti-Covid-19 stuff on the platform at the moment, but even though they say they can’t remove it, this sort of fake news is a public health problem, so there are huge implications from my perspective.”
Amar believes regulation is inevitable if Facebook and other platforms can’t get their house in order, but she is unsure how it would be enforced – particularly when government priorities and resources are being focused elsewhere.
In the meantime, she says, charities need to measure and manage their online presence like any other risk. It’s an issue she finds trustees are becoming increasingly concerned about.
“If you’re a health charity and there is some anti-vaccine debate on your page, what can you do about it?” she asks. “If it can’t be removed, you can set out your charity’s position at least.”
In a statement to Third Sector, Facebook said: “We take these matters very seriously and respect the feedback from the charities who use our platforms. We’ve made significant investments in technology and people to help keep hate speech off our platform – we don’t want or benefit from this kind of content.
“We find 95 per cent of hate speech we action before users report it to us, and between April and June 2020 we acted on 22.5 million pieces of content.”
It has been more than a year since Facebook announced it was forming an independent oversight board to tackle its misinformation problem, pledging $130m to fund the effort. But in the middle of a pandemic, and on the cusp of one of the most divisive elections in US history, the board has yet to meet.
As misinformation continues to flourish on Facebook, even when it violates the platform’s policies, academics, politicians and activists have had enough. They have formed what they are calling the ‘Real Facebook Oversight Board’, with former DCMS select committee chair Collins among those looking to take matters into their own hands.
“This isn’t just about one election or one democratic system in particular,” explains Collins. “I’ve joined the Real Oversight Board to keep calling out the company on these issues, and to campaign for change as we wait for legislation like the Online Harms Bill.”
In the meantime, charities find themselves in a sort of dance with the devil – reliant on a platform they can’t trust, yet unable to abandon it. Still, CharityComms’ Clarke maintains hope that the proposals put forward by Charities Against Hate in the coming months will show Facebook and others how they can improve their practices in return for the sector’s support.
“Everyone is in agreement that they don’t want to see this sort of [hate speech] content – but it’s really about how we become more effective in stopping it,” she says.