Manipulating social media, part 1

 

By YouAreBeingManipulated.

On the Internet, who can you trust?

There’s an old joke that on the Internet, nobody knows you’re a dog. That may be true, but it does raise the question: who can you trust online? Or, more prosaically, what kind of manipulation are you being subjected to online?


At the dawn of the Internet (roughly 12 years ago), the rise of review sites like Amazon or eBay was supposed to solve the problem of who or what to trust online. The idea was that there was wisdom in the crowd: the more people rated an object or a service, the more trustworthy the rating. It was the perfect solution, and trust and transparency were going to reign over the new Internet.

Of course, this didn’t happen. Review sites have indeed become very popular – Yelp, TripAdvisor, and many, many more have largely replaced the magazines and critics that once served as the guides to what was good and what was not.


However, corporations figured out pretty quickly that review sites, far from being immune to manipulation, were actually fairly easy to manipulate. Early on, some businesses figured that anonymous review sites were a good way to trash competitors, and so began to leave negative reviews on their competitors’ products rather than try to boost their own ratings. To some extent, this makes sense when you don’t have a lot of time: a few negative reviews will make potential customers hesitate far more than a pile of positive reviews will reassure them. So business owners started to trash their competitors online. This was particularly nasty at Amazon, for example, where authors would regularly trash other authors’ books, out of a belief that publishing is a zero-sum game.


It wasn’t just authors that got into the act, of course. Even large corporations did it. Samsung staffers, for example, began to trash other manufacturers’ phones as they were released.

But as sites like Amazon grew in popularity, leaving negative fake reviews became more difficult, since they could get ‘buried’ by real positive ones. So corporations started to shift their attention towards boosting their own products, and plenty of people sprang up to help them, in a number of different ways. Belkin, for example, simply posted ‘help wanted’ ads for folks who would write positive reviews about its products, and paid $0.65 per good review on sites like Amazon.

Companies calling themselves “online image consultants” started to sell services to boost ratings for clients, mostly by filing fake reviews written either by real people (which costs a bit more) or by imaginary accounts. One full-time employee, for the price of around $700 a month, can file 25,000 fake reviews in the space of 3 months.
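Some quick back-of-the-envelope arithmetic puts those numbers in perspective; this is just a sanity check of the figures quoted above, and the ~22 working days per month is an assumption:

```python
# Back-of-the-envelope economics of one full-time fake reviewer, using the
# figures quoted above. The ~22 working days per month is an assumption.
monthly_cost = 700        # dollars per month for one full-time reviewer
months = 3
reviews = 25_000

total_cost = monthly_cost * months
cost_per_review = total_cost / reviews
reviews_per_day = reviews / (months * 22)

print(f"Total cost:      ${total_cost}")            # $2100
print(f"Cost per review: ${cost_per_review:.3f}")   # ~$0.084
print(f"Reviews per day: {reviews_per_day:.0f}")    # ~379
```

In other words, each fake review costs the firm pennies to produce.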

Such firms claim (and seem to be able to deliver) a substantial spike in ratings – a restaurant that had 3 stars, for example, can, for the low low price of around $1,000, raise its ratings to 4 stars and even beyond. The effect is even more pronounced the lower the actual rating – a one-star restaurant, for example, can usually become an ‘average’ three-star restaurant more easily than a good restaurant can become a great restaurant.


How bad is the problem? A group of researchers recently discovered that around 20% of Yelp’s reviews are faked.

That’s… a lot. For comparison, in 2006 only around 5% of Yelp’s reviews were fake. A 20% fake review rate is significant because it doesn’t take many reviews to skew a restaurant’s true rating. Consider a restaurant with 66 ratings and an average of 2.2 stars (i.e. not a place that is great). With around 16 fake five-star ratings, it’s relatively easy to raise the restaurant to 3-star (i.e. average) status. And with each rating costing between 1 and 10 dollars (written mostly by authors in Bangladesh and India), those 16 ratings would cost less than $200. Not a bad investment to shift a restaurant from the ‘poor’ category to the average bucket…
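Here is a minimal sketch of that arithmetic, assuming every fake review is five stars and that the displayed rating is rounded to the nearest half star, as Yelp does; under those assumptions it takes roughly 16–17 fakes to cross the line:

```python
# Sketch: how many five-star fakes push a 2.2-star restaurant (66 real
# reviews) up to a displayed 3-star rating? Assumptions: every fake is
# 5 stars, and the displayed rating is rounded to the nearest half star.
real_count, real_average = 66, 2.2
real_total = real_count * real_average

def displayed_rating(fakes: int) -> float:
    average = (real_total + 5 * fakes) / (real_count + fakes)
    return round(average * 2) / 2   # nearest half star

fakes = 0
while displayed_rating(fakes) < 3.0:
    fakes += 1

print(f"Five-star fakes needed: {fakes}")                    # ~17
print(f"Cost at $1-$10 per review: ${fakes}-${fakes * 10}")  # $17-$170
```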

[Image: a dirty kitchen. Caption: “Now a 5 star restaurant!!”]

 

Review sites have argued that ‘fake’ reviews are relatively easy to spot: poorly written, stilted, and often without any details or depth. That’s true, to some extent – you get what you pay for, and a $1 review written by a low-paid worker in Bangladesh is not going to be particularly compelling. But that misses the point – a review is a review, and many sites’ features don’t differentiate on quality: sort the restaurants on Yelp by rating, for example, and the ones with the most stars come out on top, regardless of the quality of the ratings themselves.
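A toy illustration of the point, with invented restaurants and ratings: a naive ‘sort by rating’ looks only at the average star value, so a handful of purchased five-star one-liners counts exactly as much as detailed genuine reviews.

```python
# Hypothetical data: a naive "sort by rating" feature. Every star counts the
# same, whether it came from a detailed genuine review or a purchased one.
restaurants = [
    {"name": "Genuine Bistro", "ratings": [4, 5, 4, 3, 5]},           # real reviews
    {"name": "Boosted Diner",  "ratings": [2, 2, 5, 5, 5, 5, 5, 5]},  # padded with fakes
]

def average(ratings):
    return sum(ratings) / len(ratings)

# The sort key sees only the average -- nothing about who wrote the reviews
# or how substantive they are, so the boosted listing comes out on top.
for r in sorted(restaurants, key=lambda r: average(r["ratings"]), reverse=True):
    print(f'{r["name"]}: {average(r["ratings"]):.2f} stars')
```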

Rating sites have also started to deploy bots – automatic software that tries to detect fake reviews and stop them before they are published – but this is both hard and, to some extent, at odds with the sites’ own metrics: after all, the more reviews a site gets, the more ‘trustworthy’ it seems. The more reviews, the more analytics the site can offer (recommendations, links to other sites like Facebook, etc.); in that context, the efforts that review sites deploy against fake reviews are always somewhat half-hearted, since most of their value-added services depend on having a lot of ratings.
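To give a flavour of what such a bot might do, here is a toy heuristic filter; the signals and thresholds are invented purely for illustration, and real systems lean on far richer data (reviewer history, IP addresses, posting-time bursts, language models):

```python
# Toy heuristic fake-review filter. The signals and thresholds are invented
# for illustration; production systems use much richer data.
def looks_fake(review: dict) -> bool:
    text = review["text"]
    suspicious = 0
    if len(text.split()) < 15:                 # very short, no detail
        suspicious += 1
    if review["rating"] in (1, 5):             # extreme rating
        suspicious += 1
    if review["reviewer_review_count"] <= 1:   # brand-new account
        suspicious += 1
    if text.count("!") >= 3:                   # breathless superlatives
        suspicious += 1
    return suspicious >= 3

sample = {"text": "Best place ever!!! Amazing!!!", "rating": 5,
          "reviewer_review_count": 1}
print(looks_fake(sample))   # True
```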

 

And even when a site (or its users) does catch fake reviews, it usually catches only the obvious ones – reviews written by overseas workers, with very little depth or detail. But it turns out that volume is not the only way to manipulate online ratings…

This last September, Popular Science shut down its comments section – it was no longer possible to comment on Popular Science stories. Why did the editors decide to do this? They posted a long explanation, but the bottom line is that they found that a few negative, racist, or otherwise ignorant comments could skew an entire discussion – and weaken the readers’ understanding of the science and of the article. Basically, the editors found that comments, left unchecked, could derail an entire discussion. Rather than have their articles essentially trolled, Popular Science decided to shutter its online comments section altogether.

Others have found the same thing. Researchers discovered, for example, that a single comment – or even just a vote – can have a completely disproportionate impact: a single ‘like’ on Facebook, or an upvote on Reddit, creates a cascade of positive comments – on average, a single manipulated upvote increased the positive ratings on a conversation by 25%!


Equally importantly, the researchers found that the impact didn’t work both ways: positive upvotes created a cascade of positive comments; negative ones did not. It seems that we are social animals: give us a positive nudge, and it encourages us to become more positive. Nudge us down, on the other hand, and it actually pushes us to ‘correct’ the negative perception. Basically, we tend to go along with others’ positive opinions, but are skeptical of their negative ones and like to ‘correct’ the injustice.
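A small simulation sketch of that asymmetric herding effect; all the probabilities below are made up purely for illustration (only the 25% figure above comes from the actual study):

```python
# Toy simulation of asymmetric herding: a seeded upvote nudges later voters
# upward, while a seeded downvote tends to get "corrected" back toward zero.
# All probabilities are invented for illustration only.
import random

def simulate(seed_vote, voters=1000, trials=200):
    total = 0
    for _ in range(trials):
        score = seed_vote
        for _ in range(voters):
            if score > 0:
                p_up = 0.55   # herding: a positive score makes upvotes more likely
            elif score < 0:
                p_up = 0.52   # correction: a negative score gets pushed back up
            else:
                p_up = 0.50
            score += 1 if random.random() < p_up else -1
        total += score
    return total / trials

print("Average final score, upvote seed:  ", simulate(+1))
print("Average final score, no seed:      ", simulate(0))
print("Average final score, downvote seed:", simulate(-1))
```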

The power of that single review, that single comment, is surprisingly strong. Whether it’s a true review or not doesn’t seem to matter much, since even a content-free upvote can influence the conversation a great deal. This explains the popularity of services to ‘enhance’ a firm’s reputation online – a few dollars can buy you a couple of good reviews, and that seems to be enough to shift the conversation a fair bit to the positive. Where else can a few dollars, or a few hundred dollars, buy you this much impact?

Of course, this type of manipulation can be too much for some corporations, and they choose a different path: lawsuits.

When a woman gave a bad review to a construction company on Yelp, for example, the company’s owner sued her for $750,000 for “lost work and opportunities”. Those lawsuits sometimes backfire – see the Streisand effect – but sometimes they do work, generating payouts for the companies and forcing the takedown of negative reviews.


So, back to the original question. If reviews can be manipulated for a few dollars, if negative reviews can be stifled by lawsuits, how can you trust anyone online anymore?


Ironically, we may see a return to the trusted advisers of yore, the magazines and the critics that the review sites were going to replace. If you want a good dishwasher, for example, will you look at the reviews on Amazon, knowing how easily they can be manipulated, or will you check out Consumer Reports? And what about next year, when marketing firms have figured out half a dozen new ways to manipulate the rating sites?

http://youarebeingmanipulated.com/manipulating-social-media-part-1/
