These people keep your social media feeds from becoming a sleaze pit

You've not seen the Internet the way they have.

Joshua Lee | April 21, 2017, 11:50 AM

More than 1.23 billion people use Facebook every day, yet despite its massive member base, it's not easy to come across a pornographic image on your newsfeed.

It is thanks to the unceasing work of Internet moderators that your newsfeed is not overrun by sleaze.

Media company Field of Vision released a documentary titled The Moderators on April 14, giving us an insight into the people who work behind the scenes to ensure we have a safe online space.

Screengrab via Vimeo.

Directed by Adrian Chen and Ciarán Cassidy, The Moderators features a content moderation team based in India. These content moderation services are utilised by social media networks, such as Facebook, YouTube, and Pinterest, to ensure that user-submitted content does not cross the boundaries of decency - as one of them puts it, 'nowadays everyone has access to [the] internet, and if it's not controlled well, it becomes a porn factory'.

Be mentally prepared

It's quite telling when the training facilitator warns the new employees to 'be mentally prepared for your job'. At first, he shows them relatively harmless stuff - a woman in a skimpy outfit, a vulgar finger gesture - eliciting giggles from the trainees.

Screengrab via Vimeo.

However, the laughter quickly fades into horrified stares as the pictures become more gruesome. A decapitated head, an abused child covered in cuts and blood.

Screengrab via Vimeo.

According to the documentary, content moderators encounter shocking pictures such as these daily. The teams sift through close to one million images each day, and about 20% of them fail the guidelines.

Content moderation takes many forms and levels - each complementing the other in order to keep dark content to a minimum on social platforms. Outsourced centres (in countries like the Philippines or India) like the one featured in The Moderators do the more basic moderation while in-house departments of social media platforms handle more nuanced cases. These are also complemented by algorithms and automation that are specially designed to identify, track, and over time learn what would pass muster.

While it would be ideal to depend on machines to help us filter the web, that scenario is still a long way off. In the meantime, content moderation still requires the human mind to throw out the nasty stuff online - and man, is there a whole lot of nasty on the Internet.

A legion of scrubbers

A 2014 Wired article estimated that there are more than 100,000 content moderators worldwide working to scrub our Internet clean. As you can imagine, the work takes a toll on the people who take in all the depravity that the human mind can dream up.

In the documentary, a content moderator describes how the content he comes across can be very creepy and vulgar. The facilitator advises his trainees to adopt a detached mind when dealing with their work - "Sometimes, it will be so, so...offensive you will feel like...how to look (at) those images...but it's our job".

According to the same Wired article, most content moderators don't stay long in their jobs because of the toll the work takes on their mental and emotional health - the article even compares the effects to Post-Traumatic Stress Disorder (PTSD).

Screengrab via Vimeo.

As we watch the fresh faces in The Moderators take on the immense task of moderating the Internet, perhaps we should take a moment to appreciate that there are people who guard this virtual world we live in.

Check out The Moderators below:

 

Top screengrab from The Moderators via Vimeo.

If you like what you read, follow us on Facebook and Twitter to get the latest updates.