Before you spend your hard-earned cash on a restaurant dinner, you want to make sure it’ll be worth it, and online reviews have become a big part of deciding where to eat.
A quick check of an online-review platform can show you how thousands of other people rated the food and service of restaurants, which thrive on the feedback.
There has been a lot of research on online reviews, but an Arizona State University professor’s paper is breaking new ground by looking at the actual words that people use in their reviews. He and his colleagues found that when the online-review platform Yelp started allowing users to post their reviews simultaneously on Facebook, the nature of those reviews changed. (Hong’s colleagues on the paper were Nina (Ni) Huang of the Fox School of Business, Temple University, and Gordon Burtch of the Carlson School of Management, University of Minnesota. The paper, “Social Network Integration and User Content Generation: Evidence from Natural Experiments,” was published in MIS Quarterly.)
The result was a double-edged sword for the sites — more reviews, which Yelp wants, but more “emotional” language, which previous research has found users consider less helpful. In other words, more quantity but less quality.
“Online reviews help consumers make decisions about which products to purchase, and firms want to leverage that to advertise their products and have good word of mouth,” said Yili Hong, an assistant professor of information systems in the W. P. Carey School of Business.
“There is a long stream of research in our discipline looking at user-generated content, but one thing the research really hasn’t delved into is the textual aspects.”
That’s because it’s much more difficult to quantify the words themselves than it is to count the number of stars or the length of a review.
Hong and his colleagues wanted to see how integrating with Facebook — where consumers’ reviews could be seen by their friends — would change their words.
“How will this affect people’s behavior in writing reviews? They don’t want to disagree with their friends,” Hong said.
The team had a natural control situation: Yelp integrated with Facebook in July 2009, while TripAdvisor did not follow until 15 months later, giving the researchers a window to examine. They then randomly selected nearly 4,000 restaurants in New York City, Los Angeles, Chicago, Philadelphia and Phoenix that were reviewed on both platforms from 2008 to 2012.
They used automated text-mining software to calculate the presence of words in three linguistic categories — emotional, cognitive and negation, or disagreeing, language — in reviews of the restaurants on both Yelp and TripAdvisor. Then they compared the wording. (An example of a review with “emotional” words is “yummy shakes and malts … my favorite place.” Cognitive wording: “Worn-out place, trying to make it charming without really succeeding.” Negation: “Not a lot of parking … food is nothing special.”)
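To give a sense of what that kind of dictionary-based text mining looks like in practice, here is a minimal Python sketch that scores a review by the share of its words falling into each category. The word lists, the category_shares function and the example review are illustrative assumptions; the article does not specify which software or dictionaries the researchers actually used.

```python
import re
from collections import Counter

# Hypothetical word lists for each linguistic category (illustration only;
# the researchers' actual dictionaries are not described in this article).
CATEGORIES = {
    "emotional": {"yummy", "favorite", "love", "awful", "amazing"},
    "cognitive": {"because", "think", "trying", "without", "reason"},
    "negation": {"not", "no", "never", "nothing"},
}

def category_shares(review_text):
    """Return the fraction of a review's words that fall into each category."""
    words = re.findall(r"[a-z']+", review_text.lower())
    if not words:
        return {name: 0.0 for name in CATEGORIES}
    counts = Counter(words)
    return {
        name: sum(counts[w] for w in vocab) / len(words)
        for name, vocab in CATEGORIES.items()
    }

# One of the example reviews quoted above: 2 of its 9 words are negation words.
print(category_shares("Not a lot of parking ... food is nothing special."))
```

Comparing shares like these across reviews of the same restaurants on Yelp and TripAdvisor, before and after the Facebook integration, is the basic idea behind the study’s measurement, though the actual analysis was considerably more involved.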
They found that when reviewers knew that their Facebook friends would see their reviews, they used more emotional language — and more positive emotions — and less cognitive language. There was also a big decrease in “negation” words.
That’s not necessarily a good thing for the review sites, Hong said.
“If you’re very emotional, people will think you’re just coming to dump your emotions in the reviews as opposed to being logical,” he said.
One takeaway for review sites: Consider ways to encourage users to be more logical and less emotional when writing reviews.
Online reviews are a huge business, both for the platforms — which make money by selling ads based on the number of views on the site — and for the businesses that are reviewed.

Hong has two other papers on user-generated content that were recently published in the journal Management Science. In one, he and his team measured how people could be persuaded to write online reviews. The best way? A combination of financial incentives and peer pressure — telling them how many of their peers had contributed reviews. In another study, he found that using push alerts to tell reviewers how many “likes” they had compared with other reviewers only prompted them to produce more if they were winning. Low-performing reviewers would slack off when they discovered they weren’t competitive in “likes.”
Analyzing the linguistic features of reviews is the next frontier of research as more sophisticated evaluation techniques are developed, Hong said.
“Nowadays people are thinking about tweets and other social-media things, and they want to look into the text to measure things,” he said.
“Maybe they will even be able to measure sarcasm one day.”