YouTube has said it is making every effort to ensure that offending videos get down-ranked and prevented from generating ad revenue, while legitimate ones are allowed to flourish. Among the videos YouTube says it is targeting are those containing terrorism-related content, hate speech, or material unsuitable for children.
However, while the company says it has artificial intelligence software in place to automatically identify videos that violate its standards, there has also been considerable dissent from content creators who claim the software has not proved foolproof so far. The software has flagged hundreds of thousands of videos that their creators claim are fully compliant with YouTube's terms.
YouTube CEO Susan Wojcicki said the company has removed 150,000 videos since the crackdown began in June, an exercise that would otherwise have required 180,000 people working 40-hour weeks. Similarly, ads were removed from about 2 million videos after they were identified as unsafe for children's viewing.
Addressing content creators' concerns, YouTube says it is redefining its approach to identifying videos and channels that carry inappropriate content. To that end, the company is also drawing on the large volume of data it receives from its human moderators to further refine the AI software.
The company is also looking to expand its video review team to speed up the resolution of flagged content. Wojcicki said the target is to have more than 10,000 people on board in 2018 tasked with identifying videos that fail to comply with YouTube's terms and policies, a 25 percent increase in the workforce dedicated to this task.
Wojcicki further added that there will be a new comment moderation tool to identify offensive comments; as the most drastic action, comments on a video might be shut down entirely. She said one of the company's goals is to ensure legitimate videos are not demonetized while preventing inappropriate videos from slipping through the net.