YouTube will accompany conspiracy theory videos with links to Wikipedia to better inform viewers, YouTube CEO Susan Wojcicki announced at the South by Southwest (SXSW) conference on Tuesday in Austin, Texas.
“If there is an important news event, we want to be delivering the right information,” Wojcicki said on stage. She qualified that by saying, “we are not a news organization.”
The feature will roll out in the coming months. The Wikipedia links will not appear solely on conspiracy-related videos, but on topics and events that have inspired significant debate. A YouTube spokesperson cited videos about the moon landing — a historical event surrounded by many conspiracy theories — as an example, noting that such videos would appear with Wikipedia links below to provide additional information, regardless of whether the video was a documentary or one alleging the landing was staged.
The spokesperson told BuzzFeed News that the new information from Wikipedia, which the company has dubbed “information cues,” is not meant to be seen as a full-scale solution to a complex problem. Instead, the company suggested that this is just a first small step in a series of announcements to come over the next year about the company’s efforts to provide more information about videos on its platform.
But in a statement posted on its Twitter account on Wednesday, the Wikimedia Foundation, which runs Wikipedia, said it had not been given advance notice of YouTube’s announcement. “We are always happy to see people, companies, and organizations recognize Wikipedia’s value as a repository of free knowledge. In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube,” the foundation wrote.
Wikipedia is a crowdsourced digital encyclopedia — anyone can edit it — and editors sometimes engage in fierce partisan battles over divisive topics. It remains unclear how YouTube will ensure factual accuracy of suggested pages. The reliability of Wikipedia’s information has been disputed over the years, as detailed on the encyclopedia’s page about its own reliability and its catalogue of hoaxes that have appeared there.
Similarly unclear is how “information cues” might work for breaking news events, where the subjects involved may not have a complete or even partial Wikipedia presence.
YouTube has struggled with how to handle conspiracy videos on its platform. Just yesterday, YouTube surfaced conspiracy theorist Alex Jones’ video on a search for “Austin explosions” in relation to the exploding packages that killed two people in the Texas capital.
And in February, a video claiming that a survivor of the Marjory Stoneman Douglas shooting was a “crisis actor” topped YouTube’s Trending chart. The video received 200,000 views and spawned copies before YouTube removed it. That same month, a researcher found a network of thousands of conspiracy theory videos on the platform.
“It’s already tipped in favor of the conspiracists, I think,” the researcher, Jonathan Albright, told BuzzFeed News in February.
When asked at SXSW why YouTube can’t decide what is true or false but can decide what is hateful, Wojcicki said, “Hatefulness is more clear than if something is true or if something is false.”
Ryan Mac is a senior technology reporter for BuzzFeed News and is based in San Francisco. He reports on the intersection of money, technology and power.
Charlie Warzel is a senior writer for BuzzFeed News and is based in New York. Warzel reports on and writes about the intersection of tech and culture.