As Covid-19 Continues to Spread, So Does Misinformation About It - The New York Times

Doctors are exasperated by the persistence of false and misleading claims about the virus.

Nearly three years into the pandemic, Covid-19 remains stubbornly persistent. So, too, does misinformation about the virus.

As Covid cases, hospitalizations and deaths rise in parts of the country, myths and misleading narratives continue to evolve and spread, exasperating overburdened doctors and evading content moderators.

What began in 2020 as rumors that cast doubt on the existence or seriousness of Covid quickly evolved into often outlandish claims about dangerous technology lurking in masks and the supposed miracle cures from unproven drugs, like ivermectin. Last year’s vaccine rollout fueled another wave of unfounded alarm. Now, in addition to all the claims still being bandied about, there are conspiracy theories about the long-term effects of the treatments, researchers say.

The ideas still thrive on social media platforms, and the constant barrage, now a yearslong accumulation, has made it increasingly difficult for accurate advice to break through, misinformation researchers say. That leaves people already suffering from pandemic fatigue to become further inured to Covid’s continuing dangers and susceptible to other harmful medical content.

“It’s easy to forget that health misinformation, including about Covid, can still contribute to people not getting vaccinated or creating stigmas,” said Megan Marrelli, the editorial director of Meedan, a nonprofit focused on digital literacy and information access. “We know for a fact that health misinformation contributes to the spread of real-world disease.”

Twitter is of particular concern for researchers. The company recently gutted the teams responsible for keeping dangerous or inaccurate material in check on the platform, stopped enforcing its Covid misinformation policy and began basing some content moderation decisions on public polls posted by its new owner and chief executive, the billionaire Elon Musk.

From Nov. 1 to Dec. 5, Australian researchers collected more than half a million conspiratorial and misleading English-language tweets about Covid, using terms such as “deep state,” “hoax” and “bioweapon.” The tweets drew more than 1.6 million likes and 580,000 retweets.

The researchers said the volume of toxic material surged late last month with the release of a film that included baseless claims that Covid vaccines set off “the greatest orchestrated die-off in the history of the world.”

Naomi Smith, a sociologist at Federation University Australia who helped conduct the research with Timothy Graham, a digital media expert at Queensland University of Technology, said Twitter’s misinformation policies helped tamp down anti-vaccination content that had been common on the platform in 2015 and 2016. From January 2020 to September 2022, Twitter suspended more than 11,000 accounts over violations of its Covid misinformation policy.

Now, Dr. Smith said, the protective barriers are “falling over in real time, which is both interesting as an academic and absolutely terrifying.”

“Pre-Covid, people who believed in medical misinformation were generally just talking to each other, contained within their own little bubble, and you had to go and do a bit of work to find that bubble,” she said. “But now, you don’t have to do any work to find that information — it is presented in your feed with any other types of information.”

Several prominent Twitter accounts that had been suspended for spreading unfounded claims about Covid were reinstated in recent weeks, including those of Representative Marjorie Taylor Greene, a Georgia Republican, and Robert Malone, a vaccine skeptic.

Mr. Musk himself has used Twitter to weigh in on the pandemic, predicting in March 2020 that the United States was likely to have “close to zero new cases” by the end of that April. (More than 100,000 positive tests were reported to the Centers for Disease Control and Prevention in the last week of the month.) This month, he took aim at Dr. Anthony S. Fauci, who will soon step down as President Biden’s top medical adviser and the longtime director of the National Institute of Allergy and Infectious Diseases. Mr. Musk said Dr. Fauci should be prosecuted.

Twitter did not respond to a request for comment. Other major social platforms, including TikTok and YouTube, said last week that they remained committed to combating Covid misinformation.

YouTube prohibits content — including videos, comments and links — about vaccines and Covid-19 that contradicts recommendations from local health authorities or the World Health Organization. Facebook’s policy on Covid-19 content is more than 4,500 words long. TikTok said it had removed more than 250,000 videos for Covid misinformation and worked with partners such as its content advisory council to develop its policies and enforcement strategies. (Mr. Musk disbanded Twitter’s advisory council this month.)

But the platforms have struggled to enforce their Covid rules.

Newsguard, an organization that tracks online misinformation, found this fall that typing “covid vaccine” into TikTok caused it to suggest searches for “covid vaccine injury” and “covid vaccine warning,” while the same query on Google led to recommendations for “walk-in covid vaccine” and “types of covid vaccines.” One search on TikTok for “mRNA vaccine” brought up five videos containing false claims within the first 10 results, according to researchers. TikTok said in a statement that its community guidelines “make clear that we do not allow harmful misinformation, including medical misinformation, and we will remove it from the platform.”

Dr. Anish Agarwal, an emergency physician in Philadelphia, said some patients continued to believe “crazy” claims about Covid-19 vaccines. (Michelle Gustafson for The New York Times)

In years past, people would get medical advice from neighbors, or try to self-diagnose via Google search, said Dr. Anish Agarwal, an emergency physician in Philadelphia. Now, years into the pandemic, he still gets patients who believe “crazy” claims on social media that Covid vaccines will insert robots into their arms.

“We battle that every single day,” said Dr. Agarwal, who teaches at the University of Pennsylvania’s Perelman School of Medicine and serves as deputy director of Penn Medicine’s Center for Digital Health.

Online and offline discussions of the coronavirus are constantly shifting, with patients bringing him questions lately about booster shots and long Covid, Dr. Agarwal said. He has a grant from the National Institutes of Health to study the Covid-related social media habits of different populations.

“Moving forward, understanding our behaviors and thoughts around Covid will probably also shine light on how individuals interact with other health information on social media, how we can actually use social media to combat misinformation,” he said.

Years of lies and rumors about Covid have had a contagion effect, damaging public acceptance of all vaccines, said Heidi J. Larson, the director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine.

“The Covid rumors are not going to go away — they’re going to get repurposed, and they’re going to adapt,” she said. “We can’t delete this. No one company can fix this.”

Some efforts to slow the spread of misinformation about the virus have bumped up against First Amendment concerns.

A law that California passed several months ago, and that is set to take effect next month, would punish doctors for spreading false information about Covid vaccines. It already faces legal challenges from plaintiffs who describe the regulation as an unconstitutional infringement of free speech. Tech companies including Meta, Google and Twitter have faced lawsuits this year from people who were barred over Covid misinformation and claim that the companies overreached in their content moderation efforts, while other suits have accused the platforms of not doing enough to rein in misleading narratives about the pandemic.

Dr. Graham Walker, an emergency physician in San Francisco, quit Twitter this month over frustrations with Covid misinformation. (Jason Henry for The New York Times)

Dr. Graham Walker, an emergency physician in San Francisco, said the rumors spreading online about the pandemic drove him and many of his colleagues to social media to try to correct inaccuracies. He has posted several Twitter threads with more than a hundred evidence-packed tweets trying to debunk misinformation about the coronavirus.

But this year, he said he felt increasingly defeated by the onslaught of toxic content about a variety of medical issues. He left Twitter after the company abandoned its Covid misinformation policy.

“I began to think that this was not a winning battle,” he said. “It doesn’t feel like a fair fight.”

Now, Dr. Walker said, he is watching as a “tripledemic” of Covid-19, R.S.V. and influenza bombards the health care system, causing emergency room waits in some hospitals to surge from less than an hour to six hours. Misinformation about easily available treatments is at least partly responsible, he said.

“If we had a larger uptick in vaccinations with the most recent vaccines, we probably would have a smaller number of people getting extremely ill with Covid, and that’s certainly going to make a dent in hospitalization numbers,” he said. “Honestly, at this point, we will take any dent we can get.”
