Newswise — If you consult Angie’s List before hiring a plumber or a landscaper, Yelp before making a reservation at a new restaurant or Consumer Reports before upgrading your electronics, you’re not alone.
The growing number of crowd-sourcing sites shows how heavily consumers rely on popular opinion. New research from the University of Notre Dame has found a way to make those crowd judgments more accurate.
“Harnessing the Wisdom of Crowds” is forthcoming in the journal Management Science. Written by Zhi Da, professor of finance in Notre Dame’s Mendoza College of Business, and Xing Huang of Washington University in St. Louis, the study examines the effect of “herding” on the accuracy of quarterly earnings estimates on the crowd-sourcing platform Estimize.com. Estimize provides quarterly earnings-per-share estimates for publicly traded companies, crowd-sourced from some 86,000 professional analysts, amateurs and students.
“We analyzed individuals’ estimates of quarterly corporate earnings and found that the average of those estimates becomes more accurate when the individual estimates are made blindly or concurrently,” Da says. “When people have access to others’ estimates as they make their decisions, they tend to ‘herd’ with the group, and the average group estimate can actually become less accurate. In essence, we become ‘individually smarter but collectively dumber.’”
The researchers worked closely with Estimize to track and randomize the information sets of users, allowing them to cleanly isolate the impact of herding. The data came from 2,516 Estimize users who made estimates ahead of 2,147 earnings releases from 730 firms.
They discovered that when Estimize users could not see others’ estimates while forming their own, the resulting consensus estimate beat the Wall Street consensus 64 percent of the time. When users did have access to others’ estimates, the consensus was more accurate only 57 percent of the time.
The results were so noteworthy that Estimize switched to a “blind” platform in 2015. Since then, its consensus earnings forecasts have become markedly more accurate.
“We were pleased that our study had an immediate effect on the improvement of market earnings expectations,” Da says. “This owes in part to Leigh Drogen, the founder and CEO of Estimize.com, who facilitated our research from the very beginning.”
Da and Huang also found that herding had its greatest impact when users they identified as “influential” made their forecasts early.
“Users tend to put a lot of weight on the prior estimates of these influential users when forming their own forecasts, thereby conforming their own estimates,” Da says. “In an extreme case, everyone copies an influential user exactly. The forecast made in group consensus becomes a clone of that made by the influential user, and the so-called wisdom of crowds in effect disappears.”
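The dynamic Da describes can be illustrated with a small Monte Carlo sketch (hypothetical parameters, not the authors’ actual model or the Estimize data): when each user reports only a private signal, the errors cancel in the average; when each user anchors on the running consensus of earlier posts, each individual estimate gets better while the consensus gets worse.

```python
import random

random.seed(0)

TRUE_EPS = 1.00   # hypothetical "true" earnings per share (illustrative)
N_USERS = 200     # estimators per simulated earnings release
NOISE = 0.20      # std. dev. of each user's private signal

def blind_estimates():
    """Every user reports only their own private signal."""
    return [TRUE_EPS + random.gauss(0, NOISE) for _ in range(N_USERS)]

def herded_estimates(weight=0.7):
    """Each later user anchors on the running consensus of earlier
    posts, keeping only (1 - weight) of their own private signal."""
    ests, total = [], 0.0
    for _ in range(N_USERS):
        private = TRUE_EPS + random.gauss(0, NOISE)
        if ests:
            est = weight * (total / len(ests)) + (1 - weight) * private
        else:
            est = private
        ests.append(est)
        total += est
    return ests

def mean_errors(trials=500):
    """Average individual and consensus errors over many simulated releases."""
    ind_b = ind_h = con_b = con_h = 0.0
    for _ in range(trials):
        b, h = blind_estimates(), herded_estimates()
        ind_b += sum(abs(e - TRUE_EPS) for e in b) / N_USERS
        ind_h += sum(abs(e - TRUE_EPS) for e in h) / N_USERS
        con_b += abs(sum(b) / N_USERS - TRUE_EPS)
        con_h += abs(sum(h) / N_USERS - TRUE_EPS)
    return ind_b / trials, ind_h / trials, con_b / trials, con_h / trials

ind_blind, ind_herd, con_blind, con_herd = mean_errors()
print(f"individual error: blind {ind_blind:.3f}, herded {ind_herd:.3f}")
print(f"consensus error:  blind {con_blind:.3f}, herded {con_herd:.3f}")
```

In this toy setup, the herded individual error comes out lower than the blind one (the anchor is less noisy than any single private signal), but the herded consensus error is several times larger than the blind consensus error, because the first few posters’ noise never averages out: “individually smarter but collectively dumber.”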
Many important decisions are made in group settings. Consider jury verdicts, the setting of interest rates by the Federal Open Market Committee or the appointment of a CEO by a board of directors. The research findings are relevant for any situations where the decision or outcome is based on a group consensus.
“Take for example a presidential election,” Da says. “Ideally, we want each voter to vote with an independent mind, no matter how they have been informed up to the election itself. This explains why many countries ban exit polls. Another example can be found in corporate governance. Most firms prefer a board of directors with diverse backgrounds and therefore require independent directors. A board dominated by one or two directors can often result in corporate scandals like Enron. In truth, instances where there is a need to ward off a herding mentality are countless.”