Researchers have found that people are significantly more likely to buy products after reading AI-generated summaries of online reviews, even though the technology frequently produces incorrect information.
The study was presented by researchers from the University of California, San Diego in December 2025 at the International Joint Conference on Natural Language Processing and the Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics.
The team examined how summaries created by large language models influence consumer behaviour. The experiments used six AI models, 1,000 electronics product reviews, 1,000 media interviews, and a news database of 8,500 items.
In the study, 70 participants were divided into two groups. One group read original product reviews written by humans, while the other read summaries generated by AI systems.
Participants who read the original reviews said they would buy the product 52% of the time. By contrast, participants who read the AI summaries said they would purchase the product 84% of the time.
Lead author Abeer Alessa, a research assistant and lecturer in machine learning and human-computer interaction, said the results reveal how the framing used by AI systems can influence people’s decisions.
“Models tend to be wrong on whether the news description happened or not,” Alessa said. “It may incorrectly state that an event never occurred, even if it did occur after the model’s training was completed.”
During testing, researchers found that the AI systems changed the sentiment of user reviews in 26.5% of cases and produced hallucinated information 60% of the time when asked questions about the products.
The researchers said the results show how subtle shifts in wording created by AI can distort judgment and influence consumer behaviour.
Steven Bartlett Stops Using AI for LinkedIn Posts
Meanwhile, British entrepreneur and podcaster Steven Bartlett, host of the podcast The Diary of a CEO, has stopped using artificial intelligence to write posts on LinkedIn.
Bartlett had previously encouraged the use of AI across his media company FlightStory, but his team later decided to avoid AI-generated content on the professional networking platform.
Christiana Brenton, chief revenue officer and cofounder of FlightStory, said the decision was made after noticing a large amount of automated content on the platform.
“You can really see the AI slop,” Brenton said in an interview. “What Steven detected very early on, and as we did for all of our creators, is that when the world swings left, the opportunity is right.”
She added that posts written by Bartlett and his team perform better than those created with AI.
“So he now personally and the team write every single piece of social copy and content that goes out into the world,” Brenton said.
According to Brenton, human-written posts sometimes include minor spelling mistakes that are left intentionally because they make the content appear more authentic.
“When you’re inundated with AI content, it starts to feel less human,” she said.
Although Bartlett continues to use AI across other parts of his business operations, his team says the final editing stage of content, including projects such as the upcoming animated series “Steven’s World”, is still handled by human staff to preserve emotional tone and storytelling quality.