A Facebook bug showed moderators to the terror groups they monitored

Facebook's moderators were exposed to suspected terrorist networks.

Image: Camus/AP/REX/Shutterstock

A glitch in Facebook’s content moderation tools exposed its moderators’ personal profiles to members suspected of having ties to radicalism and terror, potentially putting the moderators at risk and causing one to upend his life in fear of retaliation.

The security flaw was discovered in November 2016, according to a report by The Guardian, after Facebook moderators started receiving friend requests from profiles affiliated with the radical groups they were suppressing. When a moderator banned a group administrator for posting inappropriate material, like explicitly sexual or violent images, the moderator's personal profile became visible in the group's admin activity log. Any remaining admins of that group could then open the log and see exactly who had taken the action.

In response to the flaw, Facebook created a task force of data scientists, community operations staff, and security investigators, according to company emails obtained by The Guardian, and reportedly alerted the employees and contracted staff it believed were potentially at risk. The flaw was reportedly active for nearly a month, and it also retroactively exposed moderators' activity on accounts stretching back to August 2016.

The bug reportedly affected over one thousand of Facebook’s content moderators across 22 departments, most notably a 40-person unit dedicated to counter-terrorism moderation operating out of the company’s European HQ in Dublin. An internal investigation found that six members of the unit had profiles that were likely viewed by “potential terrorists,” and Facebook tabbed the moderators as being “high priority” risks.

Facebook reportedly offered the six “high priority” moderators new home security systems, company transport to and from work, and counseling through its employee assistance program. For one of the six, however, that wasn’t enough for his peace of mind.

The moderator, an Iraqi-born Irish citizen contracted to work for Facebook by staffing firm Cpl Recruitment, spoke to The Guardian about his experience. He fled to Eastern Europe for five months and went into hiding following the investigation, and has now returned to Ireland and filed a legal claim against Facebook and Cpl with the country’s Injuries Board.

The moderator said he went into hiding after learning that seven members of a pro-Hamas and ISIS-sympathizing group he banned from Facebook viewed his profile. He also claimed that his fellow high risk moderators’ profiles were seen by accounts with ties to ISIS, Hezbollah, and the Kurdistan Workers Party.

Facebook's head of global investigations, Craig D'Souza, was reportedly in direct contact with the moderator, offering support, but the moderator who fled wasn't convinced that he and his family would be safe; violence at the hands of terrorists in Iraq drove them to Ireland in the first place.

“I'm not waiting for a pipe bomb to be mailed to my address until Facebook does something about it,” the moderator reportedly told D'Souza before he left Ireland.

Facebook’s response

When reached for comment about the report, a Facebook spokesperson acknowledged the security flaw and subsequent investigation, but asserted the company feels it took the necessary actions to keep its contractors safe.

“Last year, we learned that the names of certain people who work for Facebook to enforce our policies could have been viewed by a specific set of Group admins within their admin activity log,” the spokesperson wrote. “As soon as we learned about this issue, we fixed it and began a thorough investigation to learn as much as possible about what happened.”

The spokesperson said the company assessed the level of risk for each affected moderator and contacted each of them individually to offer support, which didn't end after the investigation wrapped. “We have continued to share details with them about a series of technical and process improvements we’ve made to our internal tools to better detect and prevent these types of issues from occurring,” they wrote.

Facebook also said that there was never any evidence that the six “high priority” moderators or their families were being targeted for retaliation, and that the investigation into the flaw didn't find any profile views by accounts tied to suspected members of ISIS. Facebook claims that in most of the groups, the moderators' pages were never viewed, because no direct notifications alerted group admins that the edits had taken place.

The network adjusted its policies to prevent a flaw like this from popping up again by changing its infrastructure to make it much harder for its workers’ information to become available externally, and is running tests with new administrative accounts so moderators won’t have to use their personal profiles at work.

These steps, along with Facebook's new plans to combat terrorism on the network using algorithms, are moves in the right direction toward making content moderation safer and more efficient. But there are still major questions about how the company can keep its massive platform safe and free from the worst the world has to offer, especially in light of last month's leak of internal documents outlining Facebook's content moderation policies.

The task falls to the moderators, who are forced to sift through the worst of the muck to keep everyone else's internet experience (relatively) pleasant. Facebook says it offers psychological support and wellness resources, and pledges to do more in the future. But in the case of the Iraqi-born Irish moderator, that was too little, too late.

“They never warned us that something like this could happen,” he told The Guardian. Thanks to his efforts to go public with his story, however, that warning is now out there for everyone else.

Read more: http://mashable.com/2017/06/16/facebook-moderators-exposed-to-terrorists/
