YouTube is continuing its fight against questionable content by launching ‘information cues’, a feature that presents viewers with alternate viewpoints alongside videos containing conspiracy-related content.
Details and Implications:
After the Parkland, Florida shooting in February, YouTube’s top trending video was a conspiracy theory surrounding the event, which claimed that one of the survivors was an actor. YouTube eventually took down the viral video, which had garnered over 200,000 views. The company was heavily criticized for auto-play algorithms that promote radical content in an effort to keep viewers on the platform longer. The new feature, dubbed ‘information cues’, is YouTube’s answer to the spread of misinformation, and will start to roll out in the coming weeks according to Susan Wojcicki, YouTube’s CEO, who announced the strategy at SXSW.
How does it work? Surprisingly enough, not via an algorithm, but through text boxes. Text boxes will be displayed on videos promoting alternate facts, providing a Wikipedia description of the topic. For example, a video that questions whether humans landed on the moon would carry a text box with Wikipedia’s description of the Apollo 11 lunar landing mission.
An official partnership with Wikipedia hasn’t been created. It’s important to note that Wikipedia is an openly editable platform where users can add, edit, and remove content. Though Wikipedia may not always be the most reliable source, the text boxes will act as guidance, allowing viewers to form better-informed opinions.
This isn’t the Alphabet company’s first foray into fighting the spread of fake news. In 2017, Google removed 3.2 billion ‘bad ads’ and blocked over 300,000 publishers, 90,000 sites, and 700,000 mobile apps, for reasons including violations of policies on harmful, misleading, and offensive content. In addition, Google added 28 new policies for advertisers and 20 for publishers as part of its battle against misleading information.
YouTube’s battle against misleading content continues with the announcement of its newest feature, which will display Wikipedia-sourced ‘information cues’ alongside flagged videos.
The video platform’s plan currently covers only videos about internet conspiracy theories. It will be interesting to see how YouTube’s conspiracy-debunking feature plays out, how it determines what counts as questionable content, and whether it will expand to a wider variety of videos.