Beibei Dong and Eric Fang say AI in your newsfeed is making you lazy.
By Jack Croft
Image by iStock/Brankospejs
Your online news is being fed to you in one of two ways: user subscription or artificial intelligence algorithms.
The question is, are there fundamental differences between these two models of delivering the news that affect the way you navigate news and process information?
The answer is yes, says Beibei Dong, associate professor in marketing, and Eric Fang, professor in marketing and Iacocca Chair at Lehigh, based on their research in progress.
User subscription means your news comes to you chronologically from sources you’ve selected by clicking a “Subscribe” button. AI algorithms offer you news based on your viewing behavior in that moment, your previous viewing history, your location, your device, and the time of day and day of the week, among other factors.
Google News presents these two options via tabs called “Following” and “For You” respectively.
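The “For You” model can be pictured as a scoring function over those behavioral signals. The sketch below is a toy illustration only, not any platform’s real system; the feature names and weights are invented for the example.

```python
# Toy sketch of algorithmic feed ranking -- NOT any platform's actual algorithm.
# Feature names and weights are invented for illustration.

def score_story(story, user):
    """Score one story for one user from a few behavioral signals."""
    s = 0.0
    s += 3.0 * len(story["topics"] & user["recent_topics"])    # this session's interest
    s += 2.0 * len(story["topics"] & user["history_topics"])   # long-term viewing history
    s += 1.0 if story["region"] == user["location"] else 0.0   # location match
    s += 0.5 if story["format"] == user["device_pref"] else 0.0  # device fit
    return s

def rank_feed(stories, user):
    """Order stories by descending personalized score."""
    return sorted(stories, key=lambda st: score_story(st, user), reverse=True)

user = {
    "recent_topics": {"vaccines"},
    "history_topics": {"health", "politics"},
    "location": "US",
    "device_pref": "mobile",
}
stories = [
    {"id": "a", "topics": {"sports"}, "region": "US", "format": "desktop"},
    {"id": "b", "topics": {"vaccines", "health"}, "region": "US", "format": "mobile"},
    {"id": "c", "topics": {"politics"}, "region": "UK", "format": "mobile"},
]
feed = rank_feed(stories, user)
print([st["id"] for st in feed])  # ['b', 'c', 'a'] -- the vaccine story ranks first
```

The point of the toy is only that the reader never chooses a source; every input to the ranking is inferred from behavior.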
Most platforms, like Google and Facebook, offer a blend of features. “That makes it difficult to study the effectiveness of Subscribe over AI,” says Dong. “Our research does not mix the two models, and that gives us the opportunity to precisely differentiate and compare their distinct impacts.”
The Elaboration Likelihood Model is a classic theory, according to Dong, that outlines two ways people process information. Central processing is when you devote substantial cognitive resources to digesting information in a systematic, rigorous, and comprehensive way. Peripheral processing is when you care more about efficiency and tend to use heuristics to form conclusions quickly. In other words: you read the headlines and look at the pictures.
Four factors determine how you navigate and process news information. The first is source uncertainty: you’re more likely to engage in central processing when you know the source of the news. “You are extra motivated to process news rigorously from your friends or a media source you are familiar with,” says Dong.
AI is really good at collecting news on the same topics and viewpoints from a broad variety of news outlets, Dong explains, and many of those outlets are actually unknown to the readers. One example she gives: if you do a Google search for “negative side effects of COVID vaccine,” then over the next few days the “For You” tab will present you with dozens of stories reporting on the negative impact of the vaccine from a wide variety of media sources.
The researchers say that in such a case, you are less motivated to read the news from those unknown and unfamiliar sources, and peripheral processing is more likely to take place. “As AI continuously personalizes the content, it gradually narrows and then limits the topics and viewpoints delivered to the readers,” says Dong. “AI is wrapping you in a filter bubble.” This is the second factor, content relevance.
The researchers say that digesting information consistent with your expectations demands less mental power and cognitive resources. As such, readers become lazy and adopt a low-effort mode where they skip through the content—peripheral processing—rather than doing deep reading and critical thinking.
This is less true for a subscription model, say Dong and Fang. Although the news content you subscribe to could also be relevant, it is not subject to a filtering process dominated by AI.
News fed to you through AI offers what’s called presentation fluency (the third factor): an effortless viewing experience that glues readers to their news apps, says Dong. Exactly what the Googles and Facebooks of the world want.
“The AI model features a lack of content control,” says Dong about the final factor in how you navigate and process news. “Combine that with the wealth of user data and these AI platforms are better at hijacking your instincts than you are at controlling them. Shifting the control from the readers to the news app probably is the most defining differentiator between the two models. Selecting ‘For You’ is putting the steering wheel in the hands of thousands of engineers, allowing them to guide you through the world of news.”
“What would be the downside of that?” asks Dong. “Research says that when humans delegate tasks to machines, they are likely to settle for a level of effort that is just good enough.”
“AI actually makes us lazy,” says Dong. “It’s eating up our ability in deep reading and critical thinking. While being fed everything we love that’s in our comfort zone, we may gradually lose the capacity to discern different views.”
Why It Matters
"Think about all the kids who are glued to TikTok these days," says Dong. "TikTok is purely based on AI algorithms. We should be worried about the future of this next generation in terms of their ability to think independently and critically."