
Facebook's Dirty Laundry

Facebook's Ethically Inappropriate Approach to Moderating Online Content

On March 2, 2012, The Telegraph brought attention to one of the worst jobs in the digital world. While the Facebook content most users see is innocuous, this is not by chance. Facebook is richer and more populous than many countries, and keeping it sanitized requires a small army of content moderators to strip the site of the darker side of humanity.

With around four billion pieces of content shared daily, Facebook has come to rely on a system whereby photos or posts may be flagged by regular users as unsuitable. Gawker notes that this results in a deluge of “porn, gore, racism, cyberbullying, and so on” that must be reviewed for legality in the relevant jurisdictions and for compliance with the site's terms of use. While removing unsavory content from a site that does not claim to be a forum for unrestricted speech may not seem problematic, the way in which it is being done should be a source of concern.

With a market capitalization upwards of $75B, established digital prowess, and a legion of strategists and legal advisers, one would expect Facebook to have a sound process in place for dealing with the unpalatable content that comes to its attention, yet this does not seem to be the case. Despite the relatively small number of moderators required to police its content, estimated at around 800 to 1,000, Facebook uses the outsourcing company oDesk to have this work haphazardly carried out by laborers in developing countries for a miserly $1 an hour.

oDesk is a company that runs a global job marketplace and offers a set of tools aimed at businesses that hire and manage remote workers. Moderators follow guidelines to determine whether flagged content should be ignored, deleted, or escalated, which means sending it to a Facebook employee in the U.S. who will report it to the authorities if necessary.

Even though someone must do the dirty laundry, the brazen manner in which Facebook conducts its moderation process is worrying. New moderators must be proficient in English and receive basic training from oDesk, but they work on personal, unsecured computers at home, no criminal background checks appear to be in place, and they are given nearly unfettered access to user information in order to determine the “context” of flagged content. This leads to a horrifying situation: unscreened laborers are flooded with grotesque material that can erode their mental health.

For example, Amine Derkaoui, a 21-year-old man living in Morocco, went public with his experience as a Facebook moderator in February. He found the job humiliating and exploitative, adding that “the job itself was very upsetting – no one likes to see a human cut into pieces every day.” His experience is hardly unique; educated workers in Asia, Africa, and Central America hired through oDesk describe being shaken by the frequent exposure to pedophilia, necrophilia, beheadings, suicides, animal or child abuse, fighting, racism, violent threats, and other troubling content.

Moreover, disgruntled moderators have all the information they need to blackmail users over content they posted (even privately), can easily leak intimate material on the web, may harvest illegal content such as child pornography for redistribution, and can even frame innocent users by escalating falsified content. As if this weren’t bad enough, the New York Times reports that moderators in the U.S. have struggled to get mental health safeguards, so it is unlikely that those being paid a pittance are offered counseling as part of the job.

One of my ethics students provided the following ethical analysis of Facebook’s current moderation practices using utilitarianism and “Rights Theory.” A table exploring the key benefits, harms, and rights for the primary stakeholders is presented below:

Facebook

Benefits:
- Moderation preserves a clean outward image.
- Removing and reporting certain content helps ensure legal compliance in some jurisdictions.

Harms:
- Inadequate oversight of the program could pose a significant legal liability.
- Greater public awareness of the program’s details could tarnish the corporate image.
- Knowingly transferring content already identified as illegal, such as child pornography, across borders from the moderators back to Facebook almost certainly conflicts with laws, treaties, and other sources of binding regulation.

Rights & Duties:
- To operate its business efficiently, provided applicable laws are followed.

Facebook Users

Benefits:
- A minimum of filth provides a sanctuary from darker areas of the internet.
- Human review of flagged content helps avoid unwarranted content removal.

Harms:
- Facebook's secretive moderation practices leave most users in the dark.
- Even content that has been posted “privately” is vulnerable to being leaked on the internet or used for blackmail, and users can easily be framed by the moderators.

Rights & Duties:
- Regardless of the terms of service, to have their privacy respected when they have a reasonable expectation of it, such as after marking content as “private.”
- To know what happens with their self-generated content, including who has access to it.
- To not have their content arbitrarily deleted on the strength of baseless flagging by other users.
- To have personal information protected through reasonable security policies.

Moderators

Benefits:
- Income is offered in parts of the world where the pay is competitive, even given the nature of the job.
- The flexible working arrangement suits students.

Harms:
- Having illegal content such as child pornography on their personal computers places them in a very precarious legal position.
- Their mental health may deteriorate from exposure to an endless stream of unsettling content, and no counseling or mental health monitoring policies appear to exist.

Rights:
- To not be taken advantage of by being placed in a legally vulnerable situation.
- To not have their mental health strained without adequate safeguards.

Following the utilitarian approach and weighing the harms against the benefits, at first glance the current moderation program appears to have clear benefits despite the significant harms. Closer examination, however, reveals that every one of the harms identified could be mitigated through a more responsible moderation methodology, which ultimately renders Facebook’s current practices deplorable. It is unlikely that it would cost more than $5M annually to run a domestic moderation program in which all activity is carried out on secure company computers and is carefully audited and logged for inappropriate moderator behavior, all employees are screened for criminal histories, and counseling as well as periodic mental health assessments are provided. Keeping the questionable content on company computers would also avoid nearly all of the potentially illegal aspects of the current approach.

Facebook does not fare any better under the “Rights Theory” approach. The current program may operate outside the law, in which case Facebook has no right to use it, and many of the rights of Facebook’s users and moderators are not respected. Facebook also has obligations to the moderators, as agents of the company, to provide a safe working environment.

In conclusion, Facebook appears to have chased profitability while sliding down the proverbial ethical slippery slope.

Blog posted by Steven Mintz, aka Ethics Sage, on March 19, 2012
