Facebook has failed on civil rights.

On Wednesday, after two years of work, the social media giant finally released the results of its independent audit, a wide-ranging report on the state of civil rights on Facebook, from hate speech to advertising to algorithmic bias. The auditors found that the company simply hasn’t done enough to combat hate and abuse on its platform.

Following up on two previous updates in December 2018 and June 2019, the audit concludes that the company’s handling of civil rights issues is “too reactive and piecemeal,” and ultimately raises doubts about whether Facebook is actually committed to addressing its myriad problems.

That’s especially concerning given that the November 2020 election is just months away.

Former ACLU director Laura W. Murphy, who led the report along with civil rights attorney Megan Cacace, compared Facebook’s work to climbing Mount Everest. She noted that though the social media company had made some progress, Facebook still hadn’t invested enough resources or moved quickly enough to address its many civil rights challenges, creating “legitimate questions about Facebook’s full-throated commitment to reaching the summit.”

The audit, which Facebook commissioned at the urging of civil rights leaders and politicians, comes amid a growing advertiser boycott of the platform called Stop Hate for Profit. The boycott is led by civil rights groups including the NAACP, the Anti-Defamation League, and Color of Change, none of which seem to have any plans to halt the campaign. More than 1,000 companies have now signed on, despite CEO Mark Zuckerberg dismissing its impact.

For these leaders of the boycott, who have long tried to work alongside Facebook, the findings of the audit confirm much of what they’ve previously said about the company: that it isn’t taking issues around hate speech, bias, polarization, and diversity seriously enough.

“Ridding the platform of hate and misinformation against Black people only became a priority when there was a PR crisis to endure,” said Rashad Robinson, the president of Color of Change, who hinted that Congress may have a role in protecting civil rights on the ever-embattled platform.

The report is an important one for Facebook’s reputation, but it isn’t binding. Facebook can choose to implement the recommendations in the report or to dismiss them — which is what some advocates like Robinson fear. In a blog post announcing the report’s release on Wednesday, Facebook COO Sheryl Sandberg said that the company “won’t make every change they [auditors] call for,” but that it “will put more of their proposals into practice.”

Regardless of what the company ends up doing, the audit serves as a thorough examination of Facebook’s longstanding struggle to reconcile its stated values around free speech with the history of harm caused by unchecked vitriol and discrimination on its platform. With that overarching theme in mind, here are five key takeaways about Facebook and civil rights from the 89-page report.

1) Holding Trump to a different standard sets a troubling precedent

Facebook has failed to penalize Trump for violating its community guidelines, the auditors say, which stands “to gut policies” that had represented progress for civil rights on the platform. The report specifically highlights a group of Trump’s posts making misleading claims about voting, as well as the president’s infamous “looting … shooting” post about protesters. Echoing previous concerns from civil rights groups, the auditors say these posts clearly violate Facebook’s community guidelines and that leaving them up establishes a concerning precedent for Trump and other politicians.

The voting-related posts by Trump referenced in the report include false claims about mail-in ballots in California, Michigan, and Nevada. Facebook ultimately decided that these posts did not violate its guidelines, arguing in the case of Michigan and Nevada that the language in the posts was merely “challenging the legality of officials.” The auditors explain that they “vehemently expressed” their view that the posts violated policy but were “not afforded an opportunity to speak directly to decision-makers” until after the final decision was made.

Facebook’s decisions, they said, constitute a “tremendous setback for all of the policies that attempt to ban voter suppression on Facebook.”

Trump’s “looting … shooting” post represents a similar pattern of self-justified inaction. In that post, the president appeared to threaten violence against Black Lives Matter protesters, using language that echoed civil rights-era white segregationists. Though Facebook executives called the White House requesting that Trump change or delete the post, the company ultimately did nothing about it. By contrast, Twitter chose to label an identical post by President Trump on its platform for violating its rules about glorifying violence.

Facebook defended its decision by arguing that threats of state action are allowed on the platform. The auditors say that logic ignored “how such statements, especially when made by those in power and targeted toward an identifiable, minority community, condone vigilantism and legitimize violence against that community.” They added, “Random shooting is not a legitimate state use of force.” Again, the auditors say they were not included in the decision-making process in time. Facebook’s decision about the “looting … shooting” post, which Mark Zuckerberg later defended on a call with employees, prompted criticism from company executives and a virtual employee walkout. It was one of the incidents that inspired the Stop Hate for Profit boycott.

In June, Facebook announced it would label posts that violate its community guidelines but are left up because they’re deemed newsworthy, meaning their public interest value eclipses the harm they cause. That exception appears to be rare: the audit revealed that over the past year, the company applied it to politicians only 15 times, and only once in the United States. It was not immediately clear what those instances were.

Meanwhile, the company still hasn’t taken any action against Trump’s past posts, and the auditors concluded that for many civil rights advocates, “the damage has already been done.” Even if Facebook has policies supporting civil rights, they argued, its refusal to enforce them against Trump has eroded trust in the company and leaves room for other politicians to follow in his footsteps.

2) Valuing free speech above all else creates problems

While Facebook’s leadership has repeatedly emphasized the company’s commitment to free expression, the auditors found that this commitment comes at a cost: Facebook systematically prioritizes the speech of politicians over clamping down on harmful and hateful rhetoric, to the detriment of its users. Several times in the report, the auditors cite Zuckerberg’s 2019 speech at Georgetown as a “turning point,” in which he reaffirmed the company’s commitment to free expression as “a governing principle of the platform.”

Facebook’s choice not to fact-check politicians, and to allow them to sometimes break Facebook’s own rules against posting harmful content on the grounds that what politicians say is inherently newsworthy, represents another problem. Both policies have significantly hurt the company’s civil rights efforts, the auditors said. Allowing politicians to spread misinformation about voting, which Zuckerberg in his Georgetown speech defended as a form of free expression, particularly undermines Facebook’s commitment to its values. The auditors said they found Facebook’s prioritization of free speech over other values, like nondiscrimination and equality, “deeply troubling.”

By forming exemptions for politicians’ content, they argue, a “hierarchy of speech is created that privileges certain voices over less powerful voices.”

The report concludes that Facebook has so far failed to address the tension between its civil rights promises and its monolithic commitment to free expression. Instead of treating free expression as paramount, the auditors argue, the company should work to develop a more comprehensive understanding of free speech, one that accounts for how typical users actually experience the platform.

“For a 21st century American corporation, and for Facebook, a social media company that has so much influence over our daily lives, the lack of clarity about the relationship between those two values is devastating,” lead auditor Laura W. Murphy wrote in the report’s introduction. “It will require hard balancing, but that kind of balancing of rights and interests has been part of the American dialogue since its founding and there is no reason that Facebook cannot harmonize those values, if it really wants to do so.”

3) Hate speech is still a problem for Facebook, and we don’t know how bad it really is…
