YouTube’s recommendations are directing young children to videos about school shootings and other gun-related content, according to a new report. The non-profit watchdog group Tech Transparency Project (TTP) found that YouTube’s recommendation algorithm steers boys interested in video games toward scenes from school shootings, instructions on how to use and modify weapons, and other gun-centric content.
The researchers behind the report created four new YouTube accounts posing as two nine-year-old boys and two 14-year-old boys. All of the accounts watched playlists of content about popular video games, including Roblox, Lego Star Wars, Halo and Grand Theft Auto. The researchers then tracked the accounts’ recommendations for 30 days last November.
“Our study found that YouTube pushed content on shootings and weapons to all of the gamer accounts, but at a much higher volume to the users who clicked on the YouTube-recommended videos,” the TTP wrote. “These videos included scenes depicting school shootings and other mass shootings; graphic demonstrations of how much damage guns can inflict on a human body; and how-to guides for converting guns into automatic weapons.”
As the report points out, some of the recommended videos appear to violate YouTube’s own policies. The recommendations included a video of a young girl firing a gun, as well as tutorials on converting handguns into “fully automatic” weapons and other modifications. Some of these videos were also monetized with ads.
In a statement, a YouTube spokesperson pointed to the YouTube Kids app and its in-app tools that “create a safer experience for teens” on the company’s platform.
“We welcome research on our recommendations and are exploring additional ways to invite academic researchers to study our systems,” the spokesperson said. “But in reviewing this report’s methodology, it’s difficult for us to draw strong conclusions. The report doesn’t provide any insight into how the test accounts were set up, including whether YouTube’s Supervised Experiences tool was applied.”
The TTP report isn’t the first time researchers have raised questions about YouTube’s recommendation algorithm. The company has also spent years working to reduce so-called borderline content — videos that don’t outright break its rules but may not be suitable for wide distribution — from appearing in recommendations. And the company said last year that it was considering disabling sharing of some of that content outright.