Experts or the Crowd? Contributor Tim Rotolo of TryMyUI examines the nature of UX feedback and the wisdom of listening to a group versus listening to an expert. Whom would you trust, and why?
Experts or the masses? It’s a debate that divides numerous and diverse areas of thought, from sociology and psychology to government (efficient authoritarianism or raucous democracy?), economics (controlled central planning or unregulated free markets?), information dissemination (venerable Encyclopedia Britannica or broad-based Wikipedia?) and more.
So when it comes to UX, who can tell you more – the experts, or the crowd? The answer may not be as clear-cut as you think.
The wisdom of crowds
In 2004, James Surowiecki wrote a book that gave a name to the truth and accuracy of the aggregated many: “the wisdom of crowds.” The idea, basically, is that the collected knowledge of a large number of people tends to be astoundingly correct.
The philosophical roots of this particular strain of thought have a rather unexpected source: a 19th-century British scientist named Francis Galton, a stuffy elitist convinced that proper breeding and the concentration of power in the hands of a suitable few were the keys to a successful society. Observing a contest to guess the weight of a fat ox at a country fair, Galton was inspired to run statistical tests on the participants’ responses, and discovered, to his surprise, that the average of all 787 responses deviated from the ox’s true weight by just a single pound.
The wisdom of crowds lies in the great diversity of independent opinion: as overestimation and underestimation, opposition and endorsement, half-truths and whole truths are averaged together, the voice of the crowd converges on correctness.
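The averaging effect Galton observed is easy to see in a quick simulation. The numbers below are a hypothetical re-creation, not Galton's actual data: the true weight and guesser count follow the commonly reported figures for his contest, but the noise model (each guesser off by roughly 100 pounds either way) is an illustrative assumption.

```python
import random
import statistics

TRUE_WEIGHT = 1198   # pounds; the figure commonly reported for Galton's ox
NUM_GUESSERS = 787   # number of entries in the contest

random.seed(42)  # fixed seed so the sketch is reproducible

# Each individual guess is quite noisy: off by ~100 lb on average.
guesses = [random.gauss(TRUE_WEIGHT, 100) for _ in range(NUM_GUESSERS)]

# Yet the average of the crowd lands very close to the truth.
crowd_estimate = statistics.mean(guesses)
print(f"Crowd estimate: {crowd_estimate:.1f} lb (true weight: {TRUE_WEIGHT} lb)")
```

No single guesser is reliable, but the overestimates and underestimates cancel as they are averaged, which is exactly the convergence described above.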
Take Wikipedia, for example. The free encyclopedia, available to read and edit for anyone online, has compiled an extraordinary assembly of articles contributed in bits and pieces by millions of different users. While the site has its share of detractors, comparative studies by Nature, the Journal of Clinical Oncology, and others have found the resource to have a level of reliability on par with Encyclopedia Britannica.
In other words, vast, anonymous crowds have built a thorough and reliable encyclopedia just about as well as a premier group of certified experts. And when it comes to breadth of topics covered, the free encyclopedia far outstrips its less agile rivals.
The application of the wisdom of crowds to UX
Clearly the seemingly unremarkable crowd has plenty to offer you. But how do you access it? Think of remote usability testing as your own guess-the-weight-of-the-ox contest. Sure, the participants aren't actually competing, but each of them, with their varying knowledge, experience, and skill levels, contributes a new point of view that will lead you closer to an accurate and precise evaluation of the subject at hand.
But are they better than experts? At some things, they certainly can be (after all, none of the livestock experts guessed within one pound of the prize ox’s real weight). Aldo Matteucci has this to say:
“Why are experts not that smart? Because experts tend to be and think alike, and thus do not reflect maximum diversity of opinions.”
That’s not to say that experts have nothing to offer; on the contrary, the deeper understanding, analytical thinking, and problem-solving abilities that a UX expert brings to the table are fantastic tools.
But they, too, are human, subject to personal biases and the biases of their field, trapped in the bubble of their own minds. No individual, no matter their expertise, can compete with the crowd for completeness of understanding. There are too many angles for one person’s opinion to be 100% accurate; aggregation will always paint a more complete picture.
The next step
How can we maximize what we learn from the crowd? The most accurate feedback derives from aggregating ideas from lots of different people. But right now, remote usability tests are performed completely independently – there is no aggregation mechanism beyond the mental compiling done by the person watching the tests (which in itself is still quite powerful, though time-consuming).
In fact, the independence of the tests is a very good thing, because it allows testers to form their own opinions and conclusions without the distorting social influences of a group of people. Independent thinking is the very foundation of the wisdom of crowds.
But what if a platform to collect the best ideas and suggestions of testers were implemented across every test for the same site? Here’s one way it could work: at the end of each test, the tester is asked for three things they liked and disliked about the website, as well as any other suggestions they would offer. Then they are shown a compiled list of responses from other takers of the same test, and can vote each response up or down based on whether they agree or disagree.
That way, the best ideas make it to the top, and the end result for the test owner is a complete and prioritized list of how best to improve their website’s UX. Rankings and vote count would be kept hidden from testers themselves, to prevent undermining their judgment or independent thinking with outside information that could cause groupthink or bandwagoning.
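The mechanism described above can be sketched in a few lines. This is a minimal illustration only: the class and method names are hypothetical, not an existing TryMyUI feature, but it captures the two key properties of the proposal, that is, a prioritized list for the test owner and score-free suggestions for testers.

```python
from collections import defaultdict

class SuggestionBoard:
    """Aggregates tester suggestions for one website's tests (hypothetical sketch)."""

    def __init__(self):
        self.scores = defaultdict(int)  # suggestion text -> net vote score

    def submit(self, suggestion):
        # A new suggestion enters the pool with a neutral score.
        self.scores.setdefault(suggestion, 0)

    def vote(self, suggestion, agree):
        # Later testers vote a suggestion up (+1) or down (-1).
        self.scores[suggestion] += 1 if agree else -1

    def ranked_for_owner(self):
        # The test owner sees a complete, prioritized list: best ideas first.
        return sorted(self.scores, key=self.scores.get, reverse=True)

    def shown_to_tester(self):
        # Testers see the suggestions but never the vote counts, preserving
        # the independent judgment the wisdom of crowds depends on.
        return list(self.scores)

# Example: two suggestions collected across several tests of the same site.
board = SuggestionBoard()
board.submit("Make the checkout button more visible")
board.submit("Shorten the signup form")
for _ in range(3):
    board.vote("Shorten the signup form", agree=True)
board.vote("Make the checkout button more visible", agree=False)

print(board.ranked_for_owner())
```

Keeping the ranking logic separate from what testers see is the design point here: the owner-facing view sorts by net score, while the tester-facing view exposes only the raw suggestion text.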
It’s an exciting thought, and could help us to take full advantage of the wisdom of the crowd and bring it to bear on getting UX feedback. Such a system could identify and prioritize usability stress points and even draw on the crowd as a reservoir of innovative usability solutions.
That’s what listening to the crowd can achieve, and we’re looking forward to exploring the possibilities and putting them into action here at TryMyUI. So next time you decide to weigh your ‘ox’ and trim the fat, remember where wisdom lies.
Opinions expressed in this article are those of the author solely and may not reflect the opinions of UsefulUsability or its staff.