Caught in a social media echo chamber? AI can help you out, new study shows
Binghamton University researchers outline plan to reduce the spread of harmful or misleading content, promote information diversity

Falling for clickbait is easy these days, especially for those who mainly get their news through social media. Have you ever noticed your feed littered with articles that look alike?
Thanks to artificial intelligence (AI) technologies, mass-produced, contextually relevant articles and comment-laden social media posts have become so commonplace that the same message can appear to be coming from many different information sources. The resulting “echo chamber” effect can reinforce a person’s existing perspectives, regardless of whether that information is accurate.
A new study involving Binghamton University researchers offers a promising solution: an AI system that maps the interactions between content and algorithms on digital platforms to reduce the spread of potentially harmful or misleading content. The study noted that such content can be amplified by engagement-focused algorithms and can fuel conspiracy theories, especially when it is emotionally charged or polarizing.
Researchers believe their proposed AI framework would counter this by allowing users and social media platform operators (Meta or X, for example) to pinpoint sources of potential misinformation and remove them if necessary. Just as important, it would make it easier for those platforms to promote diverse information sources to their audiences.
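To make the idea concrete, here is a minimal Python sketch of what source-level flagging and diversity-aware re-ranking could look like. The study does not publish code, so the data model, scoring weights and function names below (`Post`, `flag_suspect_sources`, `rerank_for_diversity`) are illustrative assumptions, not the authors’ actual framework:

```python
# Illustrative sketch only: the data model, thresholds and weights are
# assumptions made for explanation, not the study's published system.
from dataclasses import dataclass

@dataclass
class Post:
    source: str        # account or outlet that published the post
    engagement: float  # likes/shares/comments, normalized to 0..1
    credibility: float # external fact-check signal, 0 (false) .. 1 (reliable)

def flag_suspect_sources(posts, min_gap=0.5):
    """Flag sources whose engagement consistently outruns their credibility,
    a crude proxy for 'amplified but likely misleading' content."""
    by_source = {}
    for p in posts:
        by_source.setdefault(p.source, []).append(p.engagement - p.credibility)
    return {s for s, gaps in by_source.items()
            if sum(gaps) / len(gaps) > min_gap}

def rerank_for_diversity(posts, flagged, diversity_bonus=0.3):
    """Demote flagged sources and reward the first post from each
    source not yet shown, widening the range of viewpoints in a feed."""
    seen, ranked = set(), []
    for p in sorted(posts, key=lambda p: p.engagement, reverse=True):
        score = p.engagement
        if p.source in flagged:
            score -= 1.0              # push likely misinformation down
        if p.source not in seen:
            score += diversity_bonus  # promote a source the user hasn't seen
        seen.add(p.source)
        ranked.append((score, p))
    return [p for _, p in sorted(ranked, key=lambda t: t[0], reverse=True)]

feed = [
    Post("viral_rumors", 0.9, 0.1),
    Post("viral_rumors", 0.8, 0.2),
    Post("local_news", 0.4, 0.9),
    Post("health_agency", 0.3, 0.95),
]
flagged = flag_suspect_sources(feed)
for post in rerank_for_diversity(feed, flagged):
    print(post.source, post.engagement)
```

Note the design choice in this sketch: the re-ranker does not merely suppress flagged sources, it explicitly rewards the first post from each source not yet shown, which is one simple way to operationalize “promoting information diversity.”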
鈥淭he online/social media environment provides ideal conditions for that echo chamber effect to be triggered because of how quickly we share information,鈥 said study co-author Thi Tran, assistant professor of management information systems at the 亚洲情色 School of Management. 鈥淧eople create AI, and just as people can be good or bad, the same applies to AI. Because of that, if you see something online, whether it is something generated by humans or AI, you need to question whether it鈥檚 correct or credible.鈥
Researchers noted that digital platforms facilitate echo chamber dynamics by optimizing content delivery based on engagement metrics and behavioral patterns. Close interaction with like-minded people on social media can amplify a person’s tendency to cherry-pick which messages to react to, so diverse perspectives get filtered out.
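That feedback loop is easy to demonstrate. In the toy simulation below, the update rule and the numbers are invented for illustration; it shows how a feed that optimizes only for predicted engagement gradually collapses a neutral user’s exposure onto a single viewpoint:

```python
# Toy simulation of the engagement feedback loop described above.
# The 0.05 learning step and 50-round horizon are illustrative
# assumptions, not measurements from the study.
import random

random.seed(7)
viewpoints = ["A", "B", "C", "D"]          # four competing perspectives
affinity = {v: 0.25 for v in viewpoints}   # the user starts out neutral

for _ in range(50):
    # Engagement-optimized delivery: show whichever viewpoint the
    # user is currently most likely to engage with.
    shown = max(affinity, key=affinity.get)
    if random.random() < affinity[shown]:  # the user engages
        affinity[shown] += 0.05            # the platform learns the preference
        total = sum(affinity.values())     # renormalize to probabilities
        affinity = {v: a / total for v, a in affinity.items()}

print({v: round(a, 2) for v, a in affinity.items()})
# Exposure concentrates on one viewpoint even though the user began neutral.
```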
The study tested this theory by randomly surveying 50 college students, each reacting to five misinformation claims about the COVID-19 vaccine:
- Vaccines are used to implant barcodes in the population.
- COVID-19 variants are becoming less lethal.
- COVID-19 vaccines pose greater risks to children than the virus itself.
- Natural remedies and alternative medicines can replace COVID-19 vaccines.
- The COVID-19 vaccine was developed as a tool for global population control.
Here is how the survey’s participants responded:
- 90% stated they would still get the COVID-19 vaccine after hearing the misinformation claims.
- 70% indicated they would share the information on social media, more so with friends or family than with strangers.
- 60% identified the claims as false information.
- 70% expressed a need to conduct more research to verify the falsehood.
According to the study, these responses highlighted a critical aspect of misinformation dynamics: many people could recognize false claims, yet still felt compelled to seek more evidence before dismissing them outright.
“We all want information transparency, but the more you are exposed to certain information, the more you’re going to believe it’s true, even if it’s inaccurate,” Tran said. “With this research, instead of asking a fact-checker to verify each piece of content, we can use the same generative AI that the ‘bad guys’ are using to spread misinformation on a larger scale to reinforce the type of content people can rely on.”
The research paper was presented at a conference organized by the Society of Photo-Optical Instrumentation Engineers (SPIE). It was also authored by Binghamton University’s Seden Akcinaroglu, a professor of political science; Nihal Poredi, a PhD student in the Thomas J. Watson College of Engineering and Applied Science; and Ashley Kearney from Virginia State University.