'No Topless Women' Rule Causes Facebook Problems

By John Lister

Facebook's oversight board says it should rethink its rules on topless images. Tech experts believe the current rules can't be effectively enforced with automated moderation anyway.

The ruling comes from the independent body that looks into cases where people believe Facebook has wrongly moderated content. The idea is to concentrate on cases where Facebook's rules may need clarifying.

In this situation, the board looked at two connected cases. Both involved posts on Instagram, which is owned by the same company as Facebook. The two services share content rules set by parent company Meta.

The posts included images of a couple "who identify as transgender and non-binary" in which they were "bare-chested with the nipples covered." The caption noted that the couple were fundraising for "surgery to create a flatter chest."

Topless Female Ban

Although no nipples were visible in the images, they were flagged up by automated moderation and sent for human review for potentially breaching a ban on "images containing female nipples other than in specified circumstances, such as breastfeeding and gender confirmation surgery." The human reviewers then removed the posts for violating a ban on sexual solicitation.

The oversight board overturned the decision to remove the images. It said the rules on exceptions to the "no female nipples" rules were "convoluted and poorly designed." While the case will no doubt provoke heated debate on trans issues, both the oversight board and independent experts have noted it's as much an issue of practicality as principle. The board went as far as to call the current policy "unworkable in practice." (Source: theguardian.com)

Automation Inadequate

That's because automated systems cannot reliably rule on whether an image breaks the content guidelines, while the time it takes for a human to review each case and judge the context "is not practical when moderating content at scale."
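The triage described above can be sketched as a simple confidence-threshold router: an automated classifier scores a post, high-confidence violations are acted on automatically, and the uncertain middle band goes to a human queue. Everything below (names, scores, thresholds) is an illustrative assumption, not Meta's actual system; the article's point is that context-dependent rules make that middle band too large to review by hand at scale.

```python
# Hypothetical sketch of an automated moderation triage pipeline.
# All class names, fields, and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    violation_score: float  # classifier's estimated probability of a policy breach


def triage(post: Post, auto_remove_at: float = 0.95, review_at: float = 0.60) -> str:
    """Route a post based on classifier confidence.

    High-confidence violations are removed automatically, mid-range
    scores are queued for human review, and low scores are allowed.
    """
    if post.violation_score >= auto_remove_at:
        return "auto_remove"
    if post.violation_score >= review_at:
        return "human_review"
    return "allow"
```

For example, `triage(Post("p1", 0.70))` lands in the human-review band. The harder the rule is to judge from pixels alone, the wider that band becomes, which is the practicality problem the board highlights.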

The board says the only practical solution is Meta "defining clear criteria to govern the Adult Nudity and Sexual Activity policy, which ensure all users are treated in a manner consistent with human rights standards." (Source: oversightboard.com)

What's Your Opinion?

Is automated moderation reliable enough to deal with such cases? Should Facebook change its policies as the board suggests? Should sites like Facebook ban topless female images at all?



Comment from Unrecognised:

"Human rights standards" are as slippery and hard to pin down as nipple rules. In fact, nipple rules will probably in future be regarded as simple to administer compared to human rights standards in nudity, if/when they set the bar further back.

Just one example; what do you do with all the self-images posted by people who have no understanding of, or regard for, their own health, welfare or rights? Hit them with penalties for self-sabotage, while prurient voyeurs without respect or compassion consume their fill and demand moar?

There is no degree of regulation that will not be a minefield, and companies must factor that into their operating costs at all our peril.