YouTube revamps health videos as Biden scolds social media
YouTube will start adding fact-checking panels and revamp its health video ranking system, with help from a nonprofit group, just days after President Joe Biden accused social media of “killing people” by spreading lies about vaccines.
YouTube, part of Alphabet Inc.’s Google, announced the changes Monday morning. Below some health videos, the company will add information panels, like the ones currently found beneath clips on popular conspiracy theory topics such as the moon landing. YouTube will also start showing some videos more prominently on the site when people search for health terms, in the same way it now handles some hot-button topics.
For the material on those panels and its ranking system, YouTube said it would use a recent set of guidelines for verifying online health information from the National Academy of Medicine, a non-governmental organization.
“These principles for health sources are the first of their kind,” wrote Garth Graham, director of health care for YouTube, in a blog post. “We hope other companies will also examine and consider how these principles might help inform their work with their own products and platforms.”
Graham, a former health insurance executive, was hired by YouTube earlier this year to lead a new effort to highlight and produce these videos.
Like the social networks Facebook Inc. and Twitter Inc., YouTube has worked to better moderate its flow of user-generated media to deal with misinformation about Covid-19 and vaccines. The platform has removed thousands of videos for violating its misinformation rules since the start of the pandemic, which has drawn criticism, especially from the political right, that it censors too heavily.
Still, the Biden administration has gone on the offensive against tech companies in an attempt to get more Americans vaccinated, and the surgeon general has released a report on health misinformation. And on Friday, Biden slammed social media for their role in spreading anti-vaccination material.
Democratic Senator Amy Klobuchar said on Sunday that vaccine misinformation on social media adds urgency to her call to change the accountability standards for what is posted on tech platforms.
“YouTube removes content in accordance with our Covid-19 disinformation policies,” the company said. “We’re also demoting borderline videos and highlighting authoritative content for Covid-19-related search results, recommendations, and pop-up panels.”
The company said it will continue to work with health organizations and other medical experts to prevent the spread of misinformation.
Twitter also said it would do its part to “elevate authoritative health information,” while Facebook extended its defense over the weekend with a blog post arguing that it can’t be blamed for the country missing its vaccination target.
“At a time when cases of Covid-19 are on the rise in America, the Biden administration has chosen to blame a handful of American social media companies,” Facebook vice president of integrity Guy Rosen said in the post. “Facebook is not the reason this goal was missed.”
YouTube also argued that it doesn’t work the same way as social media because it doesn’t rely on the same type of viral sharing. But its recommendation algorithm drives the majority of video views on the site.
The Surgeon General’s report didn’t mention YouTube by name, but it did cite a recent academic study that looked at vaccine information on YouTube. As of late 2019, pro-vaccine videos appeared higher in YouTube’s search rankings, according to the research. But once viewers watch an anti-vaccine video, YouTube’s system ends up exposing them to more of the same, the researchers found.
A YouTube spokesperson said the company has been working on these changes since February. They are reaching viewers in the United States first, and the company plans to expand them to more countries.
This story was posted from an agency feed with no text editing.