YouTube says it will soon recommend fewer conspiracy theory videos on its platform (GOOG, GOOGL)
- YouTube announced in a company blog post on Friday that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.
- Examples of videos YouTube hopes to promote less often include ones that claim that the Earth is flat or promote phony cures for serious illnesses.
- "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.
- YouTube has long struggled with its recommendation algorithm, drawing backlash for promoting conspiracy theories and leading users to more extreme corners of the internet.
The Earth is not flat, and soon you should start seeing fewer YouTube videos claiming that it is.
On Friday, YouTube announced in a company blog post that it would recommend less "borderline" content, or videos that are untruthful in potentially harmful ways.
Essentially, YouTube, which is owned by Google, thinks it has created a better solution for stopping the spread of conspiracy theory videos on its platform.
Examples of videos YouTube hopes to promote less often include those claiming the Earth is flat, promoting phony cures for serious illnesses, or making blatantly false claims about historical events like 9/11.
Many of these "borderline" videos don't necessarily violate YouTube's Community Guidelines, but the company says that limiting their reach will provide a better experience for its users. "We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in its blog post.
These videos will not be removed entirely from the platform, and they may still appear in search results or recommendations if a user follows certain channels, the company explained.
YouTube also provided a bit of insight into how its recommendation model works, which involves "human evaluators and experts from all over the US" reviewing videos and using that feedback to train its machine learning systems.
YouTube has long struggled with its recommendation algorithm, drawing backlash for promoting conspiracy theories and criticism for leading users to more extreme corners of the internet.
"It's just another step in an ongoing process," the company said in its blog post on Friday. "But it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube."
Got a tip? Contact this reporter via Signal at +1 (209) 730-3387, email at nbastone@businessinsider.com, or Twitter DM at @nickbastone.
Contributor: Tech Insider https://read.bi/2CNzovL