More and more, your users are listening to each other rather than to you. You probably do the same every time you buy a book or anything else on amazon.com. Studies have shown that users trust those words more than yours. However, ratings and reviews have an underbelly: spam, fake reviews and blatant astroturfing. These call the whole mechanism into question, but they also open up opportunities, both for vendors and for you as a brand.
When the issue of trusting reviews came up over at Wired, they prepared a very useful flowchart. It is directed at Yelp reviews, but the reality is that most online review sites can be treated the same way. While I tend to trust reviews a bit more than they do, the flowchart does give you a good quick laugh.
It is telling what steps we have all taken to consume these reviews. Some take the best and worst reviews and throw them out. That is a great system if you are scoring figure skating or gymnastics. Subjective scoring (which is what an online review is) is open to abuse, though, as the figure skating community found out in 2002. If your system for reading reviews is to throw out the top and the bottom, take a page from figure skating's new system and throw out a few more at random.
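The figure-skating approach described above (drop the extremes, then drop a few more at random before averaging) is simple enough to sketch in code. This is purely an illustration of the idea, not any review site's actual scoring logic; the function name and parameters are my own.

```python
import random

def trimmed_score(scores, trim=1, random_drop=2, seed=None):
    """Average a set of ratings figure-skating style: drop the highest
    and lowest `trim` scores, then drop `random_drop` more at random,
    and average whatever is left."""
    rng = random.Random(seed)
    if len(scores) <= 2 * trim + random_drop:
        # Too few ratings to trim; fall back to a plain average.
        return sum(scores) / len(scores)
    kept = sorted(scores)[trim:-trim]   # drop the top and bottom extremes
    rng.shuffle(kept)
    kept = kept[random_drop:]           # drop a few more at random
    return sum(kept) / len(kept)
```

The random extra drops are the anti-abuse twist: a single planted rave or hatchet job can no longer count on surviving the trim.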
Astroturf – It’s Not Just for Football Anymore
If you think that astroturf refers to the fake grass on your neighbor's patio, think again. (And really, fake grass on a patio?) Astroturfing is a play on the term grassroots. When you pay for the appearance of grassroots support, you are laying artificial grass. Hence, astroturf.
Astroturfing in online reviews is rampant. The New York Times, in 2011, found that you could buy a fake review for as little as $5. And even when a review wasn't directly paid for, was it compensated in some other way? In the first FTC case on this front, Reverb was found to have directed its employees to post positive reviews for its clients. In both cases, someone is running afoul of the FTC's published guidance on endorsements.
It Will Only Get Worse
Today most of us view this problem as a mere nuisance. We look for reviews that are not too polished, sound authentic and are not over the top in their praise. At least that is my barometer. The problem is only going to get worse: Gartner predicts that by 2014 as much as 15% of online reviews will be fake and paid for by companies in one form or another.
Do you find it troubling that, despite clear guidance from US regulators, companies still think this is acceptable?
Fixing the Problem, or How Should You Do It?
Folks in the financial services industry provide one way to weed fakes out: most do not allow unauthenticated users to post anything. While amazon.com does the same thing, financial services firms get the benefit of their own paranoia plus the oversight of government. Their users aren't fake because every login is tied to a financial account, and the firms track behavior on their websites to ensure you aren't doing something you shouldn't with your money. So it is very hard for a fake reviewer to get into the system.
If you can find some way to prove the identity of the poster, that is a great way to weed out fakes. I do not envy the folks at amazon.com trying to address the problem discussed in a December 2012 New York Times article. If you are submitting a review of your mom's book, you are materially connected to the product and you need to disclose that relationship. This is just basic FTC endorsement and disclosure guidance, folks. I find it troubling that the writer of the article doesn't acknowledge that. Even author J.A. Konrath thinks that fake reviews on books are OK because there is no harm. He is from England, so I will give him a pass on knowing what the FTC has said, but the FTC has clearly stated this is unacceptable.
So proving identity is one way to go. How you go about it is the troubling part, since most online services, such as Yelp, aren't tied to anything material like a bank account.
After the review is posted, or perhaps just after it is submitted, reviewing the review is another way to ferret out the fakes. Researchers at the World Wide Web 2012 conference presented a software algorithm tuned to spot fakes. The software is aimed at spammers more than at paid-for reviews: a human-written, paid-for review will not be caught, because it was written by a human and not by a spam bot.
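To make the "review the review" idea concrete, here is a toy heuristic scorer. To be clear, this is a made-up illustration, not the researchers' actual algorithm: the `spamminess` function, the superlative word list and the weights are all my own assumptions, and the real WWW 2012 work relied on much richer behavioral and linguistic features.

```python
import re

# Hypothetical signal list for illustration only.
SUPERLATIVES = {"amazing", "best", "incredible", "perfect", "awesome"}

def spamminess(review_text, all_reviews):
    """Toy suspicion score in [0, 1]; higher means more suspicious.
    Combines over-the-top superlative density with a crude check for
    the same text being posted elsewhere in the corpus."""
    words = re.findall(r"[a-z']+", review_text.lower())
    if not words:
        return 1.0  # treat an empty review as suspicious by definition
    superlative_density = sum(w in SUPERLATIVES for w in words) / len(words)
    # Count identical (normalized) copies of this review in the corpus.
    normalized = " ".join(words)
    copies = sum(
        " ".join(re.findall(r"[a-z']+", other.lower())) == normalized
        for other in all_reviews
    )
    duplicate_signal = 1.0 if copies > 1 else 0.0
    return min(1.0, 0.7 * duplicate_signal + 3.0 * superlative_density)
```

Even this crude sketch shows the limits the researchers acknowledged: a careful human writing one bespoke paid review trips neither signal, while copy-paste spam lights up both.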
The End of the Day
At the end of the day, ratings and reviews are something you want as a brand. While there are huge problems with the systems out there, this is one of those instances where you need to mitigate the risk as much as possible. Do what you can before a post is made, and what you can after. Have staff monitor the flow of postings periodically, either with software or by hand.
One other thing: train your people on the FTC's guidance in this space. Stress that while it may be easier to pay for reviews, you want to be authentic out there, and fake reviews about your stuff will not help. Especially when you get caught.