
Time limit 'protects YouTube moderators' from disturbing content

Image copyright: Getty Images
Image caption: YouTube plans to employ 10,000 more moderators to address violent and unsuitable content

Workers employed to sift through and remove disturbing content on YouTube will be limited to doing so for four hours per day.

YouTube made the change to protect the workers' mental health, chief executive Susan Wojcicki told the South by Southwest conference in Austin, Texas.

It comes as the video-sharing platform faces scrutiny over how it deals with unsuitable and violent content.

It has pledged to hire 10,000 extra people to address the issue.

Ms Wojcicki said: “This is a real issue, and I myself have spent a lot of time looking at this content over the past year. It is really hard.”

The workers would also receive “wellness benefits”, she said.

Ms Wojcicki also announced YouTube would introduce “information cues” to debunk videos promoting conspiracy theories, with links to fact-based articles on Wikipedia.

It comes in the wake of criticism levelled at YouTube for showing a hoax video about David Hogg, one of the pupils who survived the Parkland school shooting in Florida.

Psychological toll

The workers used by YouTube and other social media platforms to moderate content are often employed on a contract basis and paid relatively low wages.

Many leave within a year of starting the job, partly due to the psychological toll it takes on them.

Videos can include images of child sexual abuse, violence to animals, murder and suicide.

Some have spoken publicly about how viewing illegal videos and posts has affected their mental health.

A woman employed as a moderator by Facebook told the Wall Street Journal in December that she had reviewed as many as 8,000 posts a day, with little training on how to handle the distress.

YouTube is increasingly turning to machine-learning algorithms to help root out such content.

In December it said it had removed 150,000 videos for violent extremism since June, and that 98% of them had been flagged by algorithms.

It did not clarify whether humans had viewed the videos after they had been flagged.

The BBC contacted YouTube to clarify other issues, including:

  • how many moderators it employed in total
  • whether all of them would benefit from the new contract limits
  • whether this would affect current pay
  • what the wellness benefits would include

YouTube responded, saying it had “nothing more to share at this time”.

