For five months, every comment submitted by a user randomly received an "up" vote (positive), a "down" vote (negative), or, as a control, no vote at all. The team then observed how users rated those comments. The users generated more than 100,000 comments that were viewed more than 10 million times and rated more than 300,000 times by other users.
At least when it comes to comments on news sites, the crowd is more herdlike than wise. Comments that received fake positive votes from the researchers were 32% more likely than control comments to receive further positive votes, the team reported online last week in Science. And those comments were no more likely than the controls to be down-voted by the next viewer to see them. By the end of the study, positively manipulated comments got an overall ratings boost of about 25%. The same did not hold true for negative manipulation, however: the ratings of comments that got a fake down vote were usually negated by an up vote from the next user to see them.
"Our experiment does not reveal the psychology behind people's decisions," Aral says, "but an intuitive explanation is that people are more skeptical of negative social influence. They're more willing to go along with positive opinions from other people."
Duncan Watts, a network scientist at Microsoft Research in New York, agrees with that conclusion. "[But] one question is whether the positive [herding] bias is specific to this site" or true in general, Watts says. He points out that the category of the news items in the experiment had a strong effect on how much people could be manipulated. "I would have thought that 'business' is pretty similar to 'economics,' yet they find a much stronger effect (almost 50% stronger) for the former than the latter. What explains this difference? If we're going to apply these findings in the real world, we'll need to know the answers."
Will companies be able to boost their products by manipulating online ratings on a massive scale? "That is easier said than done," Watts says. If people detect — or learn — that comments on a website are being manipulated, the herd may spook and leave entirely.
This is adapted from ScienceNOW, the online daily news service of the journal Science. http://news.sciencemag.org