Facebook ignored or was slow to act on evidence that fake accounts on its platform have been undermining elections and political affairs around the world, according to an explosive memo sent by a recently fired Facebook employee and obtained by BuzzFeed News.
The 6,600-word memo, written by former Facebook data scientist Sophie Zhang, is filled with concrete examples of heads of government and political parties in Azerbaijan and Honduras using fake accounts or misrepresenting themselves to sway public opinion. In countries including India, Ukraine, Spain, Brazil, Bolivia, and Ecuador, she found evidence of coordinated campaigns of varying sizes to boost or hinder political candidates or outcomes, though she did not always conclude who was behind them.
“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” wrote Zhang, who declined to talk to BuzzFeed News. Her LinkedIn profile said she “worked as the data scientist for the Facebook Site Integrity fake engagement team” and dealt with “bots influencing elections and the like.”
“I have personally made decisions that affected national presidents without oversight, and taken action to enforce against so many prominent politicians globally that I’ve lost count,” she wrote.
The memo is a damning account of Facebook’s failures. It’s the story of Facebook abdicating responsibility for malign activities on its platform that could affect the political fate of nations outside the United States or Western Europe. It’s also the story of a junior employee wielding extraordinary moderation powers that affected millions of people without any real institutional support, and the personal torment that followed.
“I know that I have blood on my hands by now,” Zhang wrote.
These are some of the biggest revelations in Zhang’s memo:
- It took Facebook’s leaders nine months to act on a coordinated campaign “that used thousands of inauthentic assets to boost President Juan Orlando Hernandez of Honduras on a massive scale to mislead the Honduran people.” Two weeks after Facebook took action against the perpetrators in July, they returned, leading to a game of “whack-a-mole” between Zhang and the operatives behind the fake accounts, which are still active.
- In Azerbaijan, Zhang discovered the ruling political party “utilized thousands of inauthentic assets… to harass the opposition en masse.” Facebook began looking into the issue a year after Zhang reported it. The investigation is ongoing.
- Zhang and her colleagues removed “10.5 million fake reactions and fans from high-profile politicians in Brazil and the US in the 2018 elections.”
- In February 2019, a NATO researcher informed Facebook that “he’d obtained Russian inauthentic activity on a high-profile U.S. political figure that we didn’t catch.” Zhang removed the activity, “dousing the immediate fire,” she wrote.
- In Ukraine, Zhang “found inauthentic scripted activity” supporting both former prime minister Yulia Tymoshenko, a pro–European Union politician and former presidential candidate, as well as Volodymyr Groysman, a former prime minister and ally of former president Petro Poroshenko. “Volodymyr Zelensky and his faction was the only major group not affected,” Zhang said of the current Ukrainian president.
- Zhang discovered inauthentic activity — a Facebook term for engagement from bot accounts and coordinated manual accounts — in Bolivia and Ecuador but chose “not to prioritize it,” due to her workload. The amount of power she had as a mid-level employee to make decisions about a country’s political outcomes took a toll on her health.
- After becoming aware of coordinated manipulation on the Spanish Health Ministry’s Facebook page during the COVID-19 pandemic, Zhang helped find and remove 672,000 fake accounts “acting on similar targets globally” including in the US.
- In India, she worked to remove “a politically-sophisticated network of more than a thousand actors working to influence” the local elections taking place in Delhi in February. Facebook never publicly disclosed this network or that it had taken it down.
“We’ve built specialized teams, working with leading experts, to stop bad actors from abusing our systems, resulting in the removal of more than 100 networks for coordinated inauthentic behavior,” Facebook spokesperson Liz Bourgeois said in a statement. “It’s highly involved work that these teams do as their full-time remit. Working against coordinated inauthentic behavior is our priority, but we’re also addressing the problems of spam and fake engagement. We investigate each issue carefully, including those that Ms. Zhang raises, before we take action or go out and make claims publicly as a company.”
BuzzFeed News is not publishing Zhang’s full memo because it contains personal information. This story includes full excerpts when possible to provide appropriate context.
In her post, Zhang said she did not want it to go public for fear of disrupting Facebook’s efforts to prevent problems around the upcoming 2020 US presidential election, and due to concerns about her own safety. BuzzFeed News is publishing parts of her memo that are clearly in the public interest.
“I consider myself to have been put in an impossible spot – caught between my loyalties to the company and my loyalties to the world as a whole,” she said. “The last thing I want to do is distract from our efforts for the upcoming U.S. elections, yet I know this post will likely do so internally.”
Zhang said she turned down a $64,000 severance package from the company to avoid signing a nondisparagement agreement. Doing so allowed her to speak out internally, and she used that freedom to reckon with the power that she had to police political speech.
“There was so much violating behavior worldwide that it was left to my personal assessment of which cases to further investigate, to file tasks, and escalate for prioritization afterwards,” she wrote.
That power contrasted with what she said seemed to be a lack of desire from senior leadership to protect democratic processes in smaller countries. Facebook, Zhang said, prioritized regions including the US and Western Europe, and often only acted when she repeatedly pressed the issue publicly in comments on Workplace, the company’s internal, employee-only message board.
“With no oversight whatsoever, I was left in a situation where I was trusted with immense influence in my spare time,” she wrote. “A manager on Strategic Response mused to myself that most of the world outside the West was effectively the Wild West with myself as the part-time dictator – he meant the statement as a compliment, but it illustrated the immense pressures upon me.”
A former Facebook engineer who knew her told BuzzFeed News that Zhang was skilled at discovering fake account networks on the platform.
“She’s the only person in this entire field at Facebook that I ever trusted to be earnest about this work,” said the engineer, who had seen a copy of Zhang’s post and asked not to be named because they no longer work at the company.
“A lot of what I learned from that post was shocking even to me as someone who’s often been disappointed at how the company treats its best people,” they said.
Zhang’s memo said the lack of institutional support and heavy stakes left her unable to sleep. She often felt responsible when civil unrest took hold in places she didn’t prioritize for investigation and action.
“I have made countless decisions in this vein – from Iraq to Indonesia, from Italy to El Salvador,” she wrote. “Individually, the impact was likely small in each case, but the world is a vast place.”
Still, she did not believe that the failures she observed during her two and a half years at the company were the result of bad intent by Facebook’s employees or leadership. It was a lack of resources, Zhang wrote, and the company’s tendency to focus on global activity that posed public relations risks, as opposed to electoral or civic harm.
“Facebook projects an image of strength and competence to the outside world that can lend itself to such theories, but the reality is that many of our actions are slapdash and haphazard accidents,” she wrote.
“We simply didn’t care enough to stop them”
Zhang wrote that she was just six months into the job when she found coordinated inauthentic behavior — Facebook’s internal term for the use of multiple fake accounts to boost engagement or spread content — benefiting Honduran President Juan Orlando Hernández.
The connection to the Honduran leader was made, Zhang said, because an administrator for the president’s Facebook page had been “happily running hundreds of these fake assets without any obfuscation whatsoever in a show of extreme chutzpah.” The data scientist said she reported the operation, which involved thousands of fake accounts, to Facebook’s threat intelligence and policy review teams, both of which took months to act.
“Local policy teams confirmed that President JOH’s marketing team had openly admitted to organizing the activity on his behalf,” she wrote. “Yet despite the blatantly violating nature of this activity, it took me almost a year to take down his operation.”
That takedown was announced by Facebook in July 2019, but it proved futile. The operation was soon back up and running, a fact Facebook has never disclosed.
“They had returned within two weeks of our takedown and were back in a similar volume of users,” Zhang wrote, adding that she did a final sweep for the fake accounts on her last day at Facebook. “A year after our takedown, the activity is still live and well.”
In Azerbaijan, she found a large network of inauthentic accounts used to attack opponents of President Ilham Aliyev and his ruling New Azerbaijan Party, which uses the acronym YAP. Facebook still has not disclosed the influence campaign, according to Zhang.
The operation detailed in the memo is reminiscent of those of Russia’s Internet Research Agency, a private troll farm that tried to influence the 2016 US elections, because it involved “dedicated employees who worked 9-6 Monday-Friday work weeks to create millions of comments” targeting members of the opposition and media reports seen as negative to Aliyev.
“Multiple official accounts for district-level divisions of the ruling YAP political party directly controlled numerous of these fake assets without any obfuscation whatsoever in another display of arrogance,” she wrote. “Perhaps they thought they were clever; the truth was, we simply didn’t care enough to stop them.”
Katy Pearce, an associate professor at the University of Washington who studies social media and communication technology in Azerbaijan, told BuzzFeed News that fake Facebook accounts have been used to undermine the opposition and independent media in the country for years.
“One of the big tools of authoritarian regimes is to humiliate the opposition in the mind of the public so that they’re not viewed as a credible or legitimate alternative,” she told BuzzFeed News. “There’s a chilling effect. Why would I post something if I know that I’m going to deal with thousands or hundreds of these comments, that I’m going to be targeted?”
Pearce said Zhang’s comment in the memo that Facebook “didn’t care enough to stop” the fake accounts and trolling aligns with her experience. “They have bigger fish to fry,” she said.
A person who managed social media accounts for news organizations in Azerbaijan told BuzzFeed News that their pages were inundated with inauthentic Facebook comments.
“We used to delete and ban them because we didn’t want people who came to our page to be discouraged and not react or comment,” said the person, who asked not to be named because they were not authorized to speak for their employer. “But since [the trolls] are employees, it’s easy for them to open new accounts.”
They said Facebook has at times made things worse by removing the accounts or pages of human rights activists and other people after trolls report them. “We tried to tell Facebook that this is a real person who does important work,” they said, but it took weeks for the page to be restored.
Zhang wrote that a Facebook investigation into fake accounts and trolling in Azerbaijan is now underway, more than a year after she first reported the issue. On the day of her departure, she called it her “greatest unfinished business” to stop the fake behavior in the country.
“Many others would think nothing of myself devoting this attention to the United States, but are shocked to see myself fighting for these small countries,” she wrote. “To put it simply, my methodologies were systematic globally, and I fought for Honduras and Azerbaijan because that was where I saw the most ongoing harm.”
“I have blood on my hands”
In other examples, Zhang revealed new information about a large-scale fake account network used to amplify and manipulate information about COVID-19, as well as a political influence operation that used fake accounts to influence 2018 elections in the US and Brazil. Some of these details were not previously disclosed by Facebook, suggesting the company’s regular takedown announcements remain selective and incomplete.
Zhang said Facebook removed 672,000 “low-quality fake accounts” after press reports in April that some of the accounts had been engaging with COVID-19 content on the Spanish Health Ministry’s page. She said accounts in that network also engaged with content on US pages. Facebook did not disclose how many accounts it removed, or that those accounts engaged with content in other countries, including the US.
Zhang also shared new details about the scale of inauthentic activity during the 2018 midterm elections in the US, and from Brazilian politicians that same year. “We ended up removing 10.5 million fake reactions and fans from high-profile politicians in Brazil and the U.S. in the 2018 elections – major politicians of all persuasions in Brazil, and a number of lower-level politicians in the United States,” she wrote.
A September 2018 briefing about Facebook’s election work in the US and Brazil disclosed that it had acted against a network in Brazil that used “fake accounts to sow division and share disinformation,” as well as a set of groups, pages, and accounts that were “falsely amplifying engagement for financial gain.” It did not disclose the full scale of the fake engagement that Zhang described.
The scale of this activity — 672,000 fake accounts in one network, 10.5 million fake reactions and fans in others — indicates that fake accounts remain a global problem, used to manipulate elections and public debate around the world.
As one of the few people looking for and identifying fake accounts impacting civic activity outside of “priority” regions, Zhang struggled with the power she had been handed.
“We focus upon harm and priority regions like the United States and Western Europe,” Zhang wrote, adding that “it became impossible to read the news and monitor world events without feeling the weight of my own responsibility.”
In Bolivia, Zhang said she found “inauthentic activity supporting the opposition presidential candidate in 2019” and chose not to prioritize it. Months later, Bolivian politics fell into turmoil, leading to the resignation of President Evo Morales and “mass protests leading to dozens of deaths.”
The same happened in Ecuador, according to Zhang, who “found inauthentic activity supporting the ruling government… and made the decision not to prioritize it.” The former Facebook employee then wondered how her decision led to downstream effects on how Ecuador’s government handled the COVID-19 pandemic — which has devastated the country — and if that would have been different if she’d acted.
“I have made countless decisions in this vein – from Iraq to Indonesia, from Italy to El Salvador. Individually, the impact was likely small in each case, but the world is a vast place. Although I made the best decision I could based on the knowledge available at the time, ultimately I was the one who made the decision not to push more or prioritize further in each case, and I know that I have blood on my hands by now.”
Zhang also uncovered issues in India, Facebook’s largest market, in the lead up to the local Delhi elections in February 2020. “I worked through sickness to take down a politically-sophisticated network of more than a thousand actors working to influence the election,” she wrote.
Last month, Facebook’s Indian operation came under scrutiny after reports in the Wall Street Journal revealed a top policy executive in the country had stopped local staffers from applying the company’s hate speech policies to ruling party politicians who posted anti-Muslim hate speech.
“Haphazard Accidents”
In her “spare time” in 2019, Zhang…