The Taliban Are Banned by Facebook. So Why Are They Allowed to Post?

Just one more example of Facebook’s selective community standards policy.
Rimal Farrukh
Islamabad, PK
Taliban fighters pose for a selfie in Kabul on Sept. 11. Photo: Wakil KOHSAR / AFP 

For years, Facebook has prohibited the Taliban from using its platform under its counterterrorism and hate-group community standards. Although the Taliban remains blocklisted, recently revealed internal memos show that the company has allowed certain exceptions, mostly for the armed group’s Ministry of Interior, following its takeover as the de facto government of Afghanistan. 


According to The Intercept, Facebook contradicted its own community standards under its “Dangerous Individuals and Organizations” policy in late September. The community standard, which blocklists around 4,000 groups and individuals and removes content by alleged terrorists and organized hate groups, has been criticized for its vague and allegedly discriminatory implementation, especially among marginalized groups.

“Any community guideline or standard that a company has, such as the [one on] ‘Dangerous Individuals and Organizations’, needs to be very transparent around how exactly they have set this standard, about what kind of data they are using, who the dangerous individuals are and who are not,” Facebook’s Oversight Board member Nighat Dad told VICE World News. 

The memo obtained by The Intercept showed Facebook granted the Taliban government’s Ministry of Interior permission to post “important information about new traffic regulations” at the end of September. The memo went on to state that the public value of the posted content outweighed its potential for harm. However, the exemption went beyond just traffic updates, according to the report.

The Intercept noted that a second exception, covering two posts from the Ministry of Health containing COVID-19-related information, was about to be added. However, despite these exceptions, the Ministry of Interior’s Facebook page was deleted in late October, and the Ministry of Health has not posted any content since Oct. 2. 


Internal documents also revealed that for 12 days in August, Facebook allowed certain government figures to share posts acknowledging the Taliban as the official government of Afghanistan. From the end of August to Sept. 3, local users were able to post content that contained public statements made by the Taliban. 

Facebook said the Taliban remains banned from its platform under its Dangerous Individuals and Organizations policy, and exceptions were made in the interest of relaying necessary information to the public.

“We continue to review content and Pages against our policies and last month removed several pages including those from the Ministry of Interior, Ministry of Finance and Ministry of Public Works,” Facebook spokesperson Sally Aldous told The Intercept. “However, we’ve allowed some content about the provision of essential public services in Afghanistan, including, for example, two posts in August on the Afghan Health Page.”

Facebook has not yet publicly revealed its decision-making process behind the exceptions and has been criticized for its lack of transparency around them.

Despite Facebook’s relatively harsh stance on banned groups and individuals, the recently leaked Facebook Papers showcase deep flaws in the platform’s hate-speech content moderation. Internal company documents reveal that the company’s detection of hate speech in Afghanistan, particularly in the local languages of Dari and Pashto, is weak and needs major improvement. 


“The hate speech reporting process in the Afghanistan market (in local languages) is defective and needs significant amelioration,” the documents state. “Greater emphasis needs to be made on the Dari language classifiers, as the overwhelming majority of the hate speech contents are in Dari.”

Some critics have also highlighted the questionable power dynamics of a U.S.-based social media company controlling content related to Afghanistan’s current government. 

“Honestly, I think it's a bit strange that we've reached a point where we're relying on a corporation to make these decisions in the first place,” Jillian C. York, the director of International Freedom of Expression at Electronic Frontier Foundation, told VICE World News. 

“The decision itself reflects a nuanced perspective of the situation on the ground, and I can appreciate that, but for me, it also exposes a need for a more global and inclusive standard.”

Follow Rimal Farrukh on Twitter.