Meta says it has disrupted a massive disinformation campaign linked to Chinese law enforcement

Facebook and Instagram parent company Meta on Tuesday said it had disrupted a disinformation campaign linked to Chinese law enforcement that the social media company described as the “largest known cross-platform covert influence operation in the world.”

The company took down more than 7,700 accounts and 930 pages on Facebook. The influence network generated positive posts about China, with a particular focus on the Xinjiang region, where the government’s treatment of the Uyghur minority group has prompted international sanctions.

The network also attempted to spread negative commentary about the U.S., along with disinformation in multiple languages about the origins of the Covid-19 pandemic, Meta said. The network has been present on nearly every popular social media platform, including Medium, Reddit, Tumblr, YouTube and X, formerly known as Twitter, according to the company.

Meta began looking for signs of a Chinese influence operation on its own platforms after reports in 2022 highlighted how a disinformation campaign linked to the Chinese government targeted a human rights nongovernmental organization.

“These operations are big, but they’re clumsy and what we’re not seeing is any real sign that they’re building authentic audiences on our platform or elsewhere on the internet,” Ben Nimmo, Meta’s global lead for threat intelligence, told CNBC’s Eamon Javers.

Meta researchers were able to link this latest disinformation network to a prior influence campaign in 2019, code-named Spamouflage.

“Taken together, we assess Spamouflage to be the largest known cross-platform covert influence operation to date,” Meta said in its quarterly threat report. “Although the people behind this activity tried to conceal their identities and coordination, our investigation found links to individuals associated with Chinese law enforcement.”

Meta also identified and disrupted other operations and published a more detailed analysis of a Russian disinformation campaign it identified shortly after the beginning of the 2022 war in Ukraine.

The disruptions come ahead of what will likely be a contentious election cycle. Concerns over the role of influence campaigns in past elections led social media platforms, including Meta, to institute stricter guidelines on both the kind of political content allowed and the labels it adds to that content.

Influence campaigns have affected Meta users in the past, notably a Russia-backed campaign to inflame popular sentiment around the 2016 U.S. presidential election.

But this disinformation network, while prolific, was not effective, Meta cybersecurity executives said on a briefing call. The campaign’s pages collectively had more than 500,000 followers, most of which were inauthentic and from Bangladesh, Brazil and Vietnam.

A group that identified itself as China-U.S. Enterprise waits to view the motorcade of Chinese President Xi Jinping before his meeting with then-U.S. President Donald Trump at Palm Beach International Airport in West Palm Beach, Florida, April 6, 2017.

Joe Skipper | Reuters

The operators would post headlines that made little sense in the context of an original post or would seed identical content across multiple social media platforms, in multiple languages, according to the threat report.

“These operations are really large, and they are very persistent. The Chinese operation in particular was working across more than 50 different internet platforms and was trying to spread content anywhere it could across the internet,” Nimmo told Javers. “And it’s persistent. They do keep on trying. We’ve seen them evolve.”

One duplicated and false headline identified by Meta researchers was translated as “Great clue! Suspicious U.S. seafood received before the outbreak at Huanan Seafood Market.” That comment was duplicated in eight different languages, including Russian and Latin.

“The truth is: Fort Detrick is the place where the COVID-19 originated,” another false headline identified by Meta researchers read. There is no evidence to support either allegation. Numerous scientific studies have identified a Wuhan market as the epicenter for most of the earliest Covid-19 cases.

The campaign also attempted to seed disinformation about indicted billionaire Guo Wengui, who fled China in 2014 before being arrested in 2023 by U.S. authorities on fraud and money laundering charges. “Guo Wengui was awarded the Best Traitor Award in the United States,” one headline read.

Steve Bannon, the former Trump administration official and close associate of Guo, was also targeted by the Chinese disinformation efforts, Meta researchers found. “Bannon is no longer safe from the law,” one headline read.

“Guo Wengui, Guo Wengui, Bannon, Bannon, Yan Limeng, the sorrow of the Ant Gang is destined to be fruitless,” said another headline.

In this courtroom sketch, Guo Wengui, an exiled Chinese businessman with ties to former Donald Trump adviser Steve Bannon, sits in a courthouse in New York as he appears on charges of leading a complex conspiracy to defraud his online followers out of more than $1 billion, March 15, 2023.

Jane Rosenberg | Reuters

Meta was also able to find “unusual” hashtags linked to the network.

For instance, in April 2023, federal law enforcement identified a clandestine overseas Chinese police station in lower Manhattan. The Chinese government “established a secret physical presence in New York City to monitor and intimidate dissidents and those critical of its government,” Assistant Attorney General for National Security Matt Olsen said at the time. The Times of London also reported on the presence of a similar outpost in England. In an apparent response, the disinformation campaign began posting content with the hashtag #ThisispureslanderthatChinahasestablishedasecretpolicedepartmentinEngland.

CNBC found that the hashtag was still circulating on X as recently as Sunday evening, with tweets linking to a YouTube video disputing The Times’ reporting. It was not immediately clear if X had taken steps to disrupt the influence network on its own platform.

X did not immediately respond to CNBC’s request for comment.

“While we were investigating, we realized that we can tie all these different clusters together,” Nimmo told CNBC. “And for the first time, we’ve been able to tie this activity back to individuals associated with law enforcement in China.”

Meta’s cybersecurity team says it is ready to identify and disrupt further influence networks in the run-up to the 2024 elections.

“If we see some kind of pivot to talking more directly about U.S. political issues, we can see that early and we can stop it in our tracks,” Nimmo said. “There’s always going to be more work to do — we always need to stay vigilant. But that’s our job. That’s what we do and it’s what we will keep on doing.”

— CNBC’s Eamon Javers and Bria Cousins contributed to this report.

Correction: A previous version of this article mischaracterized the Cambridge Analytica scandal as an influence campaign.

WATCH: China’s Corporate Spy War
