Facebook hires 3,000 to monitor violent content

“We’re going to make it simpler to report problems”, Mark Zuckerberg says. Since 2016, at least 50 criminal or violent incidents have been broadcast over Facebook Live.

Evangelical Focus

Agencies, Reuters · CALIFORNIA · 04 MAY 2017 · 15:08 CET

Photo: William Iven (Unsplash).

Facebook will hire 3,000 more people over the next year to speed up the removal of videos showing murder, suicide and other violent acts.

They will be added “to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly”, Chief Executive Mark Zuckerberg announced in a post on his Facebook profile.

 

ARTIFICIAL INTELLIGENCE

The world's largest social network has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material.

In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

But AI techniques would take “a period of years ... to really reach the quality level that we want”, Zuckerberg told investors after the company's earnings late on Wednesday.

“Given the importance of this, how quickly live video is growing, we wanted to make sure that we double down on this and make sure that we provide as safe of an experience for the community as we can”, he added.

 

EASIER AND FASTER TO REPORT

In the same post, Zuckerberg said the company would develop new tools to manage the millions of content reports it receives every week.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help”, he pointed out.

Facebook is also working “with local community groups who are in the best position to help someone if they need it, either because they’re about to harm themselves, or because they’re in danger from someone else”.

Facebook says every person reviewing its content is offered psychological support and wellness resources through an established support program.

 

“RAW AND VISCERAL COMMUNICATION”

The problem has become more pressing since the launch last year of Facebook Live, a service that lets any of Facebook’s 1.9 billion monthly users broadcast video and that has been marred by violent scenes.

When Facebook launched its live service in April 2016, Zuckerberg described it as a place for “raw and visceral communication”.

 

Photo: Facebook CEO Mark Zuckerberg.

“Because it’s live, there is no way it can be curated. And because of that it frees people up to be themselves. It’s live; it can’t possibly be perfectly planned out ahead of time”, Zuckerberg told BuzzFeed News in an interview at the time.

Since then, at least 50 criminal or violent incidents have been broadcast over Facebook Live, including assault, murder and suicide, The Wall Street Journal reported in March.

 

MURDER, SUICIDE, ABUSE

The move follows cases of murder and suicide being broadcast live on the social network.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video.

Last month, a man in Cleveland, Ohio, was accused of shooting another man on a sidewalk and uploading a video of the murder to Facebook, where it remained for about two hours. The suspect later fatally shot himself.

In January, four African-Americans in Chicago were accused of attacking an 18-year-old disabled man on Facebook Live while making anti-white racial taunts. They have pleaded not guilty.

“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate”, Zuckerberg wrote in his post.

 

LAWMAKERS THREATEN FACEBOOK

UK lawmakers this week accused social media companies including Facebook of doing a “shameful” job removing child abuse and other potentially illegal material.

In Germany, the company has been under pressure to be quicker and more accurate in removing illegal hate speech and to clamp down on so-called fake news.

German lawmakers have threatened fines if the company cannot remove at least 70 percent of offending posts within 24 hours.
