The internet has led to a massive increase in the amount of information available.
Often, this is a good thing. For example, shopping around to find the cheapest price for something has become far easier.
But it can have its downsides. A report last week from the consumer magazine Which? highlighted one such disadvantage. An investigation claimed that the review system on parts of Amazon was being undermined by fake five-star reviews.
The magazine analysed the listings of hundreds of popular tech products in 14 online categories, such as headphones and smartwatches.
Researchers sorted the headphone listings, for example, by average review score. The first page of results – those with the highest scores – consisted almost entirely of little-known brands, and nearly 90 per cent of their reviews came from unverified buyers.
In other words, there was no evidence that the reviewer had ever bought the item in the first place.
Companies like Amazon are well aware of these potential problems and take steps to guard against them. A flurry of glowing posts for a little-known brand is one of the classic footprints that allow fake reviews to be identified.
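To make the idea of a footprint concrete, here is a minimal, purely illustrative sketch in Python. It is not Amazon's or Which?'s actual method: the thresholds, the field names and the flag_suspicious_listing function are assumptions chosen for the example. It simply flags a listing when most of its recent reviews are five-star posts from unverified buyers.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Review:
    stars: int          # 1-5 star rating
    verified: bool      # True if the platform confirmed a purchase
    posted: date        # date the review appeared

def flag_suspicious_listing(reviews, window_days=30,
                            min_reviews=20, share_threshold=0.8):
    """Illustrative heuristic: flag a listing if, within a recent window,
    most reviews are five-star posts from unverified buyers.
    All thresholds are arbitrary assumptions for the sketch."""
    cutoff = date.today() - timedelta(days=window_days)
    recent = [r for r in reviews if r.posted >= cutoff]
    if len(recent) < min_reviews:
        return False  # too few recent reviews to call it a "flurry"
    suspect = [r for r in recent if r.stars == 5 and not r.verified]
    return len(suspect) / len(recent) >= share_threshold

# Example: 25 unverified five-star reviews posted in the past week
flurry = [Review(5, False, date.today() - timedelta(days=i % 7))
          for i in range(25)]
print(flag_suspicious_listing(flurry))  # True
```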
But Which? suggested that the volume and variety of fake reviews are so great that the defences are currently being overwhelmed.
A similar problem arose almost from the very start of email, when spam first appeared. Ever since then, a complicated evolutionary game has been played between the spammers and the spam filters.
It is a game because spam wins if it gets through, and the filters win if it does not. It is evolutionary because both sides are constantly adjusting their strategies. The filters seem gradually to be getting the better of it, though I am currently being plagued by emails from China offering to sell me plastic moulds.
The fake review – and more generally the fake news – problem has not been an issue for quite as long, but concern over it is growing.
The instinct of many people is to reach for the law, and in particular to regulate. Set up a body, staff it with bureaucrats who of course have the public interest at heart, and the problem will be solved, goes the logic. The European Commission is a strong proponent of this approach.
But there are already some good illustrations of the private sector reducing what economists describe as “reputation system failures”.
For example, a 2017 paper by Andrey Fradkin and colleagues at the MIT Sloan School of Management analysed experiments by Airbnb.
A particularly successful experiment appears to be the simultaneous review: the buyer and the seller each post a review, and only once both have done so can either see what was written about them.
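As a rough illustration of how a simultaneous-review rule can be enforced, the sketch below holds each side's review back until both have been submitted. The class, roles and method names are invented for the example and do not describe Airbnb's implementation.

```python
class SimultaneousReview:
    """Illustrative sketch: neither party can read the other's review
    until both have submitted, removing the incentive to retaliate."""

    def __init__(self):
        self._reviews = {"guest": None, "host": None}

    def submit(self, role, text):
        if role not in self._reviews:
            raise ValueError("role must be 'guest' or 'host'")
        if self._reviews[role] is not None:
            raise ValueError(f"{role} has already reviewed")
        self._reviews[role] = text

    def read(self, role):
        # Reviews stay hidden until both sides have posted.
        if any(r is None for r in self._reviews.values()):
            return None
        other = "host" if role == "guest" else "guest"
        return self._reviews[other]

pair = SimultaneousReview()
pair.submit("guest", "Lovely flat, would stay again.")
print(pair.read("host"))   # None - the host has not yet reviewed
pair.submit("host", "Great guest, left the place spotless.")
print(pair.read("host"))   # the guest's review is now visible
```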
Not all consumers give feedback. Many who have a bad experience do not bother to rate the seller or product – they just stop buying from the platform. Platform providers therefore have a strong incentive to verify posts and encourage real reviews, perhaps using monetary payments to reduce selection bias.
Just as we did not need to regulate against spam, so markets will, given time, find solutions to what is currently a pressing problem.
Paul Ormerod