Russia and Iran were the top sources of influence operations on Facebook, the company said in a threat report released this week.
The 44-page report, which examined activity occurring between 2017 and 2020, found that the operators usually try to mimic domestic, authentic audiences “so they can more credibly exploit contentious political and societal issues in a given country.”
The United States was one of the most-targeted countries.
Facebook has previously described the Russia-based Internet Research Agency (IRA) as being behind “much of the abuse” surrounding the 2016 election on the social media platform. It removed a network linked to the group in July 2018.
The group not only runs influence operations, it also brags about running operations that turn out to be much smaller than claimed, a phenomenon Facebook labels “perception hacking.”
The IRA claimed it was running thousands of fake accounts with the capacity to sway results in the waning hours of the 2018 midterm elections, but a Facebook probe found only a small network of accounts originating in Russia, ultimately deeming the claim false.
Russia, Iran, and the United States each had five networks that were removed by Facebook in the year leading up to the 2020 election, including one network linked to Russian military intelligence. A separate effort, linked to the IRA, “tricked” freelance journalists, including some in America, into writing on hot-button social and political issues to target Americans on the right and the left by purporting to be a legitimate news site, according to Facebook.
Iranian networks were associated with the country’s government and its state broadcaster, IRIB. In one instance, Iranian actors conducted a campaign primarily through email posing as the Proud Boys, a libertarian-minded group that has been involved in brawls with leftist groups in major U.S. cities.
U.S. intelligence officials said in March that they found Russia sought to denigrate President Joe Biden and boost former President Donald Trump in the 2020 election while the Chinese Communist Party (CCP) “did not deploy interference efforts and considered but did not deploy influence efforts intended to change the outcome of the U.S. presidential election.”
Operations originating in China manifested differently from other foreign actors, mostly as strategic communication using state-affiliated channels like official diplomatic accounts “or large-scale spam activity that included primarily lifestyle or celebrity clickbait and also some news and political content,” according to the Facebook threat report.
“These spam clusters operated across multiple platforms, gained nearly no authentic traction on Facebook, and were consistently taken down by automation,” it said.
The U.S.-based operations included an attempt funded by liberal billionaire Reid Hoffman to divide conservatives in Alabama in 2017 and lead some to vote for a write-in candidate, benefiting the Democratic nominee. They also included a network operated by marketing firm Rally Forge, which was working on behalf of clients including Turning Point USA, a conservative group.
Facebook has used a combination of automated detection and manual investigations to remove influence operations. The team leading the effort has grown to over 200 people.
“In the United States, we’ve seen these types of activities linked to marketing firms, linked to sort of political actors, and linked to fringe groups or conspiracy theory groups,” Nathaniel Gleicher, Facebook’s head of cybersecurity policy, said on “CBS This Morning.”
“Imagine a PR firm or a marketing firm that hires a thousand people in the United States to use their social media accounts or maybe create fake social media accounts to advocate for particular issues or on behalf of a particular candidate.”
Actors have increasingly pivoted from widespread campaigns to smaller, more targeted operations because of the company’s crackdown on such operations, according to the report. Sophisticated actors have also worked to obscure their identities, using technological methods and employing proxies.