FTC Economist Analyzes Review Quality On Amazon, Google, HomeAdvisor, Facebook, Yelp


by Laurie Sullivan, Staff Writer @lauriesullivan, October 30, 2020


Reviews online at places like Amazon, Google, and Yelp will play a more important role this holiday season as consumers turn to their PCs, tablets and smartphones to search for products and make purchases rather than shopping in physical stores.

A report from Federal Trade Commission Economist Devesh Raval provides an interesting analysis of ratings, comparing Yelp’s star ratings with the Better Business Bureau, Google, Facebook and HomeAdvisor.

The analysis compares consumer-generated reviews across platforms and asks how far fake reviews may account for the differences between them.

He examines the Better Business Bureau (BBB), Yelp, Google, Facebook, and HomeAdvisor by matching a sample of more than one hundred thousand businesses to review listings on each platform. All five measure ratings on a 5-point scale, but the distribution of review ratings differs sharply across them.

BBB average ratings are bimodal — with most ratings either very low or very high — while Yelp ratings are much more uniform across the rating distribution.

Facebook, Google, and HomeAdvisor have heavily skewed distributions, and most businesses have very high ratings.

In addition, businesses with signals of poor quality, such as a large number of complaints or an “F” grade from the BBB, have lower average ratings on the BBB and Yelp compared to ratings on Google, Facebook, and HomeAdvisor.

One of the most important factors he highlights is reviews that are likely fake, as identified by multiple review-filtering algorithms; these reviews inflate the ratings of low-quality businesses.

Through a linear decomposition, he shows that fake reviews can account for about half of the higher average ratings for low-quality businesses on Google compared to Yelp.
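To make the logic of such a decomposition concrete, here is a minimal sketch with hypothetical numbers; the function name and all figures are illustrative, not taken from Raval's paper:

```python
# Illustrative only: split a cross-platform rating gap into the part
# attributable to likely-fake reviews and a residual.

def decompose_gap(avg_a, avg_b, avg_a_without_flagged):
    """Return (total gap, component from flagged reviews, residual),
    where avg_a_without_flagged is platform A's average after
    removing reviews a filtering algorithm would flag as fake."""
    total_gap = avg_a - avg_b
    fake_component = avg_a - avg_a_without_flagged
    residual = total_gap - fake_component
    return round(total_gap, 2), round(fake_component, 2), round(residual, 2)

# Hypothetical low-quality business: average 4.2 on platform A,
# 3.0 on platform B; dropping flagged reviews lowers A to 3.6.
total, fake, rest = decompose_gap(4.2, 3.0, 3.6)
# total = 1.2; fake = 0.6 (half of the gap); rest = 0.6
```

Under these made-up numbers, removing flagged reviews closes half of the gap between the two platforms, which is the shape of the result Raval reports for Google versus Yelp.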

The longer and more in-depth a review is, the more likely it is to be genuine. For example, Yelp’s average review contains approximately 593 characters, whereas 45% of Google reviews contain fewer than 100 characters.
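A crude heuristic based on that observation can be sketched as follows; the 100-character threshold mirrors the statistic above, and the function is hypothetical, not an actual platform filter:

```python
# Illustrative length-based heuristic: short reviews carry less
# signal and are more often suspect. Not a real platform filter.

def likely_low_effort(review_text, min_chars=100):
    """Flag reviews shorter than min_chars as low-effort/suspect."""
    return len(review_text.strip()) < min_chars

reviews = [
    "Great!",  # very short -> flagged
    "The technician arrived on time, explained the repair in detail, "
    "and the final invoice matched the estimate exactly. Recommended.",
]
flags = [likely_low_effort(r) for r in reviews]  # [True, False]
```

Length alone is obviously a weak signal; real filtering algorithms combine many features, but it shows why character counts appear in the analysis at all.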

Raval examines how review ratings vary with quality across these online platforms, estimating quality tiers for businesses through a finite mixture model using signals from consumer protection authorities and review ratings.

Ratings conditional on quality are higher on Google and Facebook than on the BBB and Yelp, and Raval points out that the differences are greatest for low-quality businesses.

The research is intended to provide guidance to consumers, platforms, and regulators.

For consumers, the research shows that reliance on a business’s star rating level may not provide a reliable guide to business quality. A 4.0 rating could imply a very different level of quality on one platform compared with another.

The relative ranking of businesses does, however, appear to be consistent across platforms, he wrote. This means consumers may need to search more to learn the distribution of ratings for a particular type of business on each platform.
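One way to read a star rating against its platform's own distribution, rather than at face value, is to convert it to a within-platform percentile. This is a sketch with hypothetical distributions, not a method from the report:

```python
# Illustrative: the same 4.0 rating can sit at very different points
# of two platforms' rating distributions. Data below is made up.
from bisect import bisect_left

def percentile_within_platform(rating, platform_ratings):
    """Share of businesses on the platform rated below `rating`."""
    ordered = sorted(platform_ratings)
    return bisect_left(ordered, rating) / len(ordered)

skewed = [4.4, 4.5, 4.6, 4.7, 4.8, 4.8, 4.9, 5.0]   # most ratings very high
uniform = [1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0]  # ratings spread out

p_skewed = percentile_within_platform(4.0, skewed)    # 0.0  (bottom of the pack)
p_uniform = percentile_within_platform(4.0, uniform)  # 0.625 (above most peers)
```

On the skewed platform a 4.0 trails every business in the sample, while on the uniform one it beats most of them, which is exactly why the star level alone is an unreliable guide.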

Raval examined how review ratings vary across platforms, and there is evidence that fake reviews explain some of the differences. The number of reviews is significantly higher on Google, Facebook, and HomeAdvisor than on the BBB and Yelp.

In addition, for low-quality businesses, ratings from reviews that are more likely to be fake — reviews hidden by Yelp, or assigned high scores by the BBB’s internal filtering algorithm — are higher than those from published reviews.

Interestingly, Raval estimates that about half of the difference between Google and Yelp ratings of low-quality businesses is likely due to fake reviews, and that at least a quarter of Google reviews for low-quality businesses are likely fake.

MediaPost.com: Search & Performance Marketing Daily
