UNIVERSITY PARK, Pa. — Every day, social media users are exposed to fake news and politically polarizing content. What makes people vulnerable to believing false information they find online?
According to researchers at the Penn State College of Information Sciences and Technology, users can easily fall into an echo chamber — a sort of online rabbit hole in which users consume only one-sided news and political arguments, eventually distrusting any opposing views. To combat this phenomenon, the researchers have developed a new tool that applies psychological concepts to help individuals become more aware of echo chambers and more resistant to their effects.
The tool, a theory-based game named ChamberBreaker, enables a player to test their own awareness of content that could create echo chambers and to observe how echo chambers are accelerated by the spread of fake news. The researchers' goal is to help players resist echo chambers in the future and ultimately reduce the rate of fake news dissemination.
“Since people who fall into an echo chamber tend to consume the information they want to see, whether and how much the information is the same as their belief is usually more important than how credible the information is,” said Kyungsik Han, associate professor at Hanyang University in Korea, who earned his doctorate at the Penn State College of IST and is the corresponding author of the research paper. “This indicates a necessity to conduct research on how to help people fundamentally understand an echo chamber and experience its negative consequences.”
“We all tend to conform to and agree with the group opinion. Hence, people naturally get together with others who hold the same opinion,” said Dongwon Lee, professor of information sciences and technology at Penn State and one of the paper’s authors. “But if you’re not careful and not thinking critically, there is a high risk for someone to fall into an echo chamber. Hopefully, in the future, this kind of tool helps people learn a sort of online hygiene — similar to washing your hands to protect yourself from illness.”
Added Lee, “Ultimately, the success of fake news research is based on how people will perceive information and how they will change their behavior accordingly. No matter how accurate AI-based fake news detectors are, ultimately, if users do not accept and change their behavior, then nothing is going to work.”
In ChamberBreaker, a player is tasked with trying to misinform the audience in hopes of having community members fall into an echo chamber. To begin, the player is randomly assigned a scenario that focuses on a health, political or environmental issue and is presented with six tweets on that topic. The player then selects tweets that could cause the other members to fall into an echo chamber while simultaneously maintaining their trust.
The player can monitor their efforts through two gauges on ChamberBreaker’s interface — one that tracks the community’s echo chamber effect and one that measures the player’s reliability or credibility. The objective is for the player to keep both gauges above a certain threshold. If successful, the community members will fall into an echo chamber and the player will witness the resulting negative effects on the community. The player then receives a score after each scenario.
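The round structure described above can be summarized in a short sketch. The following Python snippet is a minimal, hypothetical illustration of that loop — two gauges updated by each selected tweet, a threshold check, and a score. The class names, gauge values, and thresholds are assumptions made for illustration, not ChamberBreaker's actual implementation.

```python
# Hypothetical sketch of a ChamberBreaker-style round.
# All names, gauge values, and thresholds are illustrative assumptions.
import random
from dataclasses import dataclass

@dataclass
class Tweet:
    text: str
    echo_effect: float       # how strongly the tweet pushes the community toward one-sided views
    credibility_cost: float  # how much selecting it erodes the player's perceived reliability

def play_round(tweets, picks, echo_threshold=0.6, credibility_threshold=0.5):
    """Simulate one scenario: the player picks from a pool of six tweets,
    trying to raise the community's echo-chamber gauge while keeping
    their own credibility gauge above a threshold."""
    echo_gauge = 0.0
    credibility_gauge = 1.0
    for i in picks:
        tweet = tweets[i]
        echo_gauge = min(1.0, echo_gauge + tweet.echo_effect)
        credibility_gauge = max(0.0, credibility_gauge - tweet.credibility_cost)
    success = echo_gauge >= echo_threshold and credibility_gauge >= credibility_threshold
    score = round(100 * echo_gauge * credibility_gauge) if success else 0
    return echo_gauge, credibility_gauge, success, score

if __name__ == "__main__":
    random.seed(0)
    pool = [Tweet(f"tweet {k}", random.uniform(0.1, 0.4), random.uniform(0.05, 0.3))
            for k in range(6)]
    echo, cred, success, score = play_round(pool, picks=[0, 2, 5])
    print(f"echo gauge: {echo:.2f}, credibility gauge: {cred:.2f}, "
          f"success: {success}, score: {score}")
```

In this reading of the mechanics, the tension comes from the trade-off the article describes: tweets that push the echo-chamber gauge up also tend to cost credibility, so the player only scores if both gauges clear their thresholds at the end of the scenario.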