The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
The campuses of the tech industry are famous for their lavish cafeterias, cushy shuttles, and on-site laundry services. But on a muggy February afternoon, some of these companies' most important work is being done 7,000 miles away, on the second floor of a former elementary school at the end of a row of auto mechanics' stalls in Bacoor, a gritty Filipino town 13 miles southwest of Manila. When I climb the building's narrow stairwell, I need to press against the wall to slide by workers heading down for a smoke break. Up one flight, a drowsy security guard staffs what passes for a front desk: a wooden table in a dark hallway overflowing with file folders.
Past the guard, in a large room packed with workers manning PCs on long tables, I meet Michael Baybayan, an enthusiastic 21-year-old with a jaunty pouf of reddish-brown hair. If the space does not resemble a typical startup's office, the image on Baybayan's screen does not resemble typical startup work: It appears (explicit description removed), before he disappears it with a casual flick of his mouse.
Baybayan is part of a massive labor force that handles content moderation, the removal of offensive material, for US social-networking sites. As social media connects more people more intimately than ever before, companies have been confronted with the Grandma Problem: Now that grandparents routinely use services like Facebook to connect with their kids and grandkids, they are potentially exposed to the Internet's panoply of jerks, racists, creeps, criminals, and bullies. They won't continue to log on if they find their family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video. Social media's growth into a multibillion-dollar industry, and its lasting mainstream appeal, has depended in large part on companies' ability to police the borders of their user-generated content, to ensure that Grandma never has to see images like the one Baybayan just nuked.
So companies like Facebook and Twitter rely on an army of workers employed to soak up the worst of humanity in order to protect the rest of us. And there are legions of them: a vast, invisible pool of human labor. Hemanshu Nigam, the former chief security officer of MySpace who now runs the online safety consultancy SSP Blue, estimates that the number of content moderators scrubbing the world's social media sites, mobile apps, and cloud storage services runs to well over 100,000; that is, about twice the total head count of Google and nearly 14 times that of Facebook.
http://www.wired.com/2014/10/content-moderation/
(Warning: Story at link does describe some of the banned pictures...not too graphic, but could be a trigger)
4 replies, 1630 views
Rec (5)
The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed (Original Post) by jeff47, Oct 2014
1. jberryhill (62,444 posts): "family photos sandwiched between a gruesome Russian highway accident and a hardcore porn video"

Author obviously never met my family.

2. sharp_stick (14,400 posts): HA HA HA HA

Beautiful response, I love it.

3. GeorgeGist (25,320 posts): $1.21/hr?

4. jakeXT (10,575 posts): Careful with those elbows