Facebook bans some, but not all, QAnon groups, accounts

Facebook says it will restrict the right-wing conspiracy movement QAnon and will no longer recommend that users join groups supporting it, although the company isn't banning it outright.

Facebook said Wednesday it is banning groups and accounts associated with QAnon and a variety of U.S.-based militia and anarchist groups that support violence. But the company will continue to allow people to post material that supports these groups — so long as they do not otherwise violate policies against hate speech, abuse and other provocations.

QAnon groups have flourished on Facebook in recent years. Twitter announced a similar crackdown recently.

The QAnon conspiracy theory is centered on the baseless belief that President Donald Trump is waging a secret campaign against enemies in the “deep state” and a child sex trafficking ring run by satanic pedophiles and cannibals. For more than two years, followers have pored over tangled clues purportedly posted online by a high-ranking government official known only as “Q.”

The conspiracy theory emerged in a dark corner of the internet but has recently crept into mainstream politics. Trump has retweeted QAnon-promoting accounts and its followers flock to his rallies wearing clothes and hats with QAnon symbols and slogans.

Last week, Marjorie Taylor Greene, a House candidate who openly supports QAnon, won her Republican primary in Georgia. She is part of a growing list of candidates who have expressed support for the movement; another, Lauren Boebert, recently upset a five-term congressman in a Republican primary in Colorado.

Facebook said it will remove groups and accounts outright only if they discuss potential violence, including in veiled language.

“We will continue studying specific terminology and symbolism used by supporters to identify the language used by these groups and movements indicating violence and take action accordingly,” the company said.

Facebook will still restrict the material it doesn't remove, initially by no longer recommending it. For instance, when people join a QAnon group, Facebook will not recommend similar groups to join. Nor will it suggest QAnon references in searches or, in the near future, allow them in ads.

As a result of the policy changes, Facebook says it has removed over 790 groups, 100 pages and 1,500 ads tied to QAnon on Facebook and has blocked over 300 hashtags across Facebook and Instagram. There are 1,950 other groups and 440 pages Facebook says it has identified that remain on the platform but face restrictions, along with 10,000 accounts on Instagram.

For militia organizations and those encouraging riots, including some who may identify as antifa, the company said it has removed over 980 groups, 520 pages and 160 ads from Facebook.

Facebook said it is not banning QAnon outright because the group does not meet the criteria necessary for the platform to designate it a “dangerous organization.” But it is expanding this policy to address the movement because it has “demonstrated significant risks to public safety.”
