YouTube says it will hire more than 10,000 people in part to address a disturbing trend among its videos.

The new moderators will try to stop the spread of bizarre, potentially damaging videos across the site. In recent weeks, YouTube has been increasingly criticized for hosting posts that appear to target children but in fact contain graphic, extremist and violent content.

The videos often pose as footage from children's TV shows, feature well-known characters, or explicitly claim to show content aimed at kids. But when viewers click through, they find videos that might not be suitable even for adults – such as Peppa Pig swinging a chainsaw, or children being forced to pretend to be sick.

YouTube is adding more human moderators and improving its machine learning, CEO Susan Wojcicki said in a blog post. In 2018, the company will expand its content moderation workforce to more than 10,000 employees.