YouTube will no longer recommend conspiracy or medically inaccurate videos

San Francisco: Google-owned video sharing platform YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos. The video sharing platform will no longer recommend videos “claiming the earth is flat or making blatantly false claims about historic events like 9/11”, NBC News reported on Monday.

The original blog post from YouTube, published on January 25, said videos the site recommends, usually after a user has viewed one, would no longer lead to similar videos and instead would “pull in recommendations from a wider set of topics”. YouTube said in the post that the action is meant to “reduce the spread of content that comes close to – but doesn’t quite cross the line of – violating” its community policies.

The change will not affect the videos’ availability. Users who have subscribed to a channel that, for instance, produces conspiracy content, or who search for such content, will still see related recommendations, the company said. On Saturday, Guillaume Chaslot, a former engineer at Google – YouTube’s parent company – hailed the move as a “historic victory”.

According to the report, Chaslot helped build the Artificial Intelligence (AI) system used to curate recommended videos. “It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable,” he said.

Source: Times Now
