
Experts discuss misinformation, artificial intelligence, grassroots solutions at panel

Panel hosted by Information Futures Lab featured various media, communication experts

Speakers discussed social media, algorithms, and AI as a part of the Tuesday panel.

Misinformation experts discussed social media, algorithms and artificial intelligence at a Tuesday panel hosted by The Information Futures Lab. 

Titled "Everything We Know (And Don't Know) About Tackling Rumors and Conspiracies," the panel was moderated by Claire Wardle, a co-director of the IFL and a professor of the practice of health services, policy and practice.

Despite its societal impact, research on media misinformation remains "a young field," according to Stefanie Friedhoff, another co-director of the IFL and an associate professor of the practice of health services, policy and practice.

Having worked as a senior policy advisor on the White House COVID-19 Response Team, she later contributed to research on pandemic misinformation interventions, a topic she discussed at the panel.


"We're significantly understudying this," Friedhoff said, citing a lack of longitudinal research on non-American and video-based misinformation. "We don't have a lot of useful evidence to apply in the field, and we need to work on that."

Evelyn Pérez-Verdia, founder of a strategic consulting firm, spoke about her work to combat misinformation at the panel. She aims to empower Spanish-speaking diasporas in South Florida through community-based trust-building. Recently, she has worked with the IFL as a fellow to conduct a survey of information needs in Florida.

According to Pérez-Verdia, non-English-speaking and immigrant communities are prone to misinformation because of language and cultural barriers. When people are offered accessible resources, she argues, communities become empowered and less susceptible to misinformation. "People are hungry for information," she said.

Abbie Richards, another panelist and senior video producer at Media Matters for America, a watchdog journalism organization, identified social media algorithms as an exacerbating factor. In a video shown during the panel, Richards highlighted the proliferation of misleading or inaccurate content on platforms like TikTok. As a video producer, she looks to distill research and discourse on this topic for "audiences who wouldn't necessarily read research papers," she said.

She has researched AI-generated content on social media, which is often designed to take advantage of various platforms' monetization policies. "There's a monetization aspect behind this content," Richards elaborated.

Algorithms are "designed to show (users) what they want to see and what they'll engage with," she said. When viewers "feel disempowered … it makes it really easy to gravitate towards misinformation."

When discussing AI-generated misinformation that is designed to be entertaining, Friedhoff noted that only "some of us have the luxury to laugh" at misinformation.

"But from the perspective of somebody behind the paywall, who doesn't necessarily speak English, factual information becomes increasingly difficult to access," she added. She described this as "misinformation inequities," which all the speakers acknowledged exist in their own projects.

In an interview with The Herald, Friedhoff and Wardle emphasized how the "online information ecosystem" connects different types of misinformation. Vaccine skepticism, Wardle said, is a slippery slope towards climate change denial: "We have to understand as researchers and practitioners that we can't think in silos."

Many of the speakers agreed that misinformation spreads in part because people tend to prioritize relationships, both real-life and parasocial, over fact. "There's nothing more powerful than someone you trust and close to you," Pérez-Verdia said.


Richards said emotional literacy is the backbone of navigating both AI and misinformation. This includes "teaching people how to recognize (confirmation bias) within themselves" and understanding common misinformation techniques.

When asked about potential solutions, the speakers offered a range of responses. Richards suggested a "marketing campaign for federal agencies" to promote governmental literacy and help all citizens understand how the government functions. Pérez-Verdia also identified diverse and culturally conscientious government messaging as key, while Friedhoff recommended creating "community conversations" to explore perspectives rather than further polarize them.

Audience member Benjy Renton, a research associate at the School of Public Health, was "inspired by" community-based approaches like Pérez-Verdia's work: "It was great to see the diverse range of perspectives on misinformation."

The speakers told The Herald that they found each other's perspectives enlightening. "I'm somebody that people feel like they can go to because I've spent years talking about (misinformation)," Richards said in an interview with The Herald after the event. "But the idea of how you measure (trust) is fully beyond me."


Pérez-Verdia ended the discussion by reiterating that the fight against misinformation is founded on teamwork: "When you look at all of these pieces, the women here, a collaboration where we all have our individual gifts … that's exactly what needs to be done on a larger spectrum."


Megan Chan

Megan is a Senior Staff Writer covering community and activism in Providence. Born and raised in Hong Kong, she spends her free time drinking coffee and wishing she was Meg Ryan in a Nora Ephron movie.
