When Fighting Disinformation, Even the Best Tools Are Not Enough

https://www.nytimes.com/2024/07/11/technology/disinformation-tools.html

Ruth Quint volunteers as the webmaster for the League of Women Voters of Greater Pittsburgh. “It’s really hard to get through to anybody,” she said of holding the line against misinformation and disinformation online. Credit: Ross Mantle for The New York Times

Researchers have learned plenty about misinformation and how it spreads. But they’re still struggling to figure out how to stop it.

To fight disinformation in a chaotic election year, Ruth Quint, a volunteer for a nonpartisan civic group in Pennsylvania, is relying on tactics both extensively studied and frequently deployed. Many of them, however, may also be futile.

She has posted online tutorials for identifying fake social media accounts, created videos debunking conspiracy theories, flagged toxic content to a collaborative nationwide database and even participated in a pilot project that responded to misleading narratives by using artificial intelligence.

The problem: “I don’t have any idea if it’s working or not working,” said Ms. Quint, the co-president and webmaster of the League of Women Voters of Greater Pittsburgh, her home of five decades. “I just know this is what I feel like I should be doing.”

Holding the line against misinformation and disinformation is demoralizing and sometimes dangerous work, requiring an unusual degree of optimism and doggedness. Increasingly, however, even the most committed warriors are feeling overwhelmed by the onslaught of false and misleading content online.

Researchers have learned a great deal about the misinformation problem over the past decade: They know what types of toxic content are most common, the motivations and mechanisms that help it spread, and whom it often targets. The question that remains is how to stop it.

A critical mass of research now suggests that tools such as fact checks, warning labels, prebunking and media literacy are less effective and less far-reaching than imagined, especially as they move from pristine academic experiments into the messy, fast-changing public sphere.

