Sound design decisions are often hindered by brain quirks or cognitive biases. These quirks include an aversion to ambiguity, overconfidence, and the confirmation bias. Follow the tips in this article to identify and mitigate these brain quirks as you strive to make sound design decisions based on research and customer needs.
Humans hate ambiguity. Uncertainty makes us emotionally and physiologically uncomfortable. It’s not our fault; it’s how we’re wired. In the past, the reptilian brain (the oldest part of the human brain, where many of our instincts reside) literally kept us alive by allowing instantaneous decisions such as: is that creature moving in the bush something I can eat, or something that will eat me?
While most of us no longer need to make life-or-death decisions on a daily basis, we still dislike uncertainty because this old, reptilian part of our brain influences our emotions and reactions. In business, this aversion to uncertainty often compromises our judgment.
The decline of former film titan Kodak illustrates the point. Kodak dominated the film market to the point where “Kodak Moment” became part of the American vernacular. In January 2012, Kodak filed for bankruptcy. (The company emerged from bankruptcy in 2013 in much reduced form as Kodak Alaris.) What happened?
It’s now common knowledge that Kodak was slain by the digital dragon. The strange part of this story is that Kodak saw the digital revolution 25 years before it came to pass. Indeed, company leaders were able to imagine customers’ desire to crop their own photos, easily remove red-eye, and print photos at home.
Where Kodak went wrong was its inability to see that digital technology would fundamentally alter photography. Instead, thought Kodak execs, digital would simply be another way to produce prints. In fact, Kodak went so far as to manufacture the Advantix, a camera with a digital preview that still required film to get an actual print. Decisive authors Chip and Dan Heath liken this approach to selling a smartphone that must be plugged into the wall to make a call.
By staying true to film and print, Kodak leaders avoided ambiguity and went bankrupt. Their survival instincts led to the company’s death.
While we likely enjoyed show and tell in kindergarten, in business we know that showing usually carries the day. Adobe employees made the point when they realized that customers were struggling to buy multiple copies of software licenses. Rather than complain to IT or plead to upper management for support, they staged a demonstration. They politely asked executives to log on and buy several copies of a license for one of the Adobe product suites, something that the system did not allow at that time. When executives experienced the same frustration as Adobe customers, they quickly ordered their staff to fix the problem.
By taking a known problem to the real decision makers, Adobe employees improved their customers’ user experience in short order. Furthermore, these employees not only saved money by avoiding a long, drawn-out process but also gave customers what they wanted: the ability to buy more from Adobe.
A second way to manage our aversion to uncertainty is to become an active problem seeker. This idea may sound counterintuitive. Why would we look for problems? Don’t we already have enough to deal with? In truth, however, organizations that seek problems often demonstrate impressive performance. Examples include air traffic control centers in the U.S. and the Navy’s aircraft carriers, both of which have incredibly low error and failure rates. Experts refer to them as high-reliability organizations (HROs).
As decision expert Michael Roberto explains, leaders of HROs do not wall themselves off from the possibility that failure might occur. On the contrary, they preoccupy themselves with failure, actively looking for small errors before they grow into large ones.
In the design world, problem seeking is best accomplished through frequent design reviews and usability tests.
The point is that problems are not the enemy; hidden problems are, because hidden problems become serious threats down the road. Review, assess, and test frequently to identify problems before the software goes live.
Unchecked, our aversion to ambiguity drives us straight into the welcoming arms of the confirmation bias, the tendency to favor information that reinforces our existing beliefs. This bias often emerges when we gather facts and data selectively, meaning that we tend to pursue self-serving information.
While the usability tests and heuristic evaluations mentioned above are useful, there is another way. If you find yourself favorably disposed to the latest iteration of your company’s site or mobile app, find the counterargument by actively seeking a different opinion.
Find the person in your organization who often disagrees with your point of view and ask them to counter your assessment. Seeking a different point of view is worth the effort because a bad design imposes direct costs such as lost online sales, customer frustration, increased calls to the customer care center, and re-coding and re-design.
Cognitive bias is not limited to evaluating the present; it also influences our ability to assess the future. The optimism bias refers to our tendency to rely on the best-case scenario. As optimism bias expert Tali Sharot explains: “We are more optimistic than realistic, and we are oblivious to the fact.”
When we act on this tendency, we overlook potential problems. Let us imagine, for example, that last year our company rolled out a web app that proved wildly successful. Customers loved it and advertisers flocked to embed ads in the app.
When brainstorming ideas for a new app, it might be tempting to rely on the same business rules and design concepts that drove the first, successful app. Why argue with success?
Yet one year in Internet time means more than one year did in the old, brick-and-mortar world. If we assume that the same ideas we used last year will drive the company to new revenue heights, we might be suffering from the optimism bias.
Why does this happen? Why do we assume the best-case scenario when planning projects, considering new designs, or making revenue projections? The reason is our tendency to see the future as a variant of the present, meaning that we can’t quite bring ourselves to imagine truly radical threats or dramatic changes such as a significant shift in the market, the loss of our visionary CEO to a competitor, or an economic downturn. We tend to assume that things will go on more or less as they have.
As author David DiSalvo explains in his book What Makes Your Brain Happy and Why You Should Do the Opposite, “We tend to simulate the future by re-constructing the past, and the re-construction is rarely accurate.”
The optimism bias is related to the overconfidence bias; both contribute to dangerous assumptions and unrealistic expectations about revenue, profits, and the success of important projects and initiatives.
Psychologist Gary Klein outlines a technique for reducing the impact of the optimism bias. Klein’s “pre-mortem” is simple, yet powerful (we’ve modified it here to apply specifically to design):
When the team has almost come to an important design decision but hasn’t yet committed, gather a group of people knowledgeable about the decision to listen to a brief speech: “Imagine that we are a year into the future. We rolled out the mobile app as it now exists. The outcome has been a disaster. We missed our quarterly ad revenue targets by $5 million. Please take five to ten minutes to write a brief history of that disaster.”
This pre-mortem technique legitimizes doubt, prompting team members to voice concerns and surface potential problems while there is still time to address them.
By remaining aware of our tendency to fall prey to the optimism bias, we are more likely to identify its effects at work and address its consequences, quite possibly averting disaster and lost revenue.
A high level of confidence is good for your health, right? Wrong! A study of patients who died in a hospital ICU compared doctors’ diagnoses to the actual autopsy results. The doctors who were completely confident in their diagnoses were wrong 40% of the time.*
We tend to trust highly confident experts such as doctors, scientists, executives, and financial analysts because we believe that their extensive training and experience serve as a solid basis for making accurate predictions.
Unfortunately, this confidence is misplaced, and not only in hospital ICUs. We need only recall the absurd tech stock prices of the late 1990s or the disastrous bet on ever-rising real estate prices before the 2008 housing crash.
The tech and housing crashes illustrate a key aspect of the contemporary business climate. Confidence is valued over uncertainty, and, as psychologist Daniel Kahneman explains, there is a prevailing censure against disclosing uncertainty. Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors who are better able to gain their clients’ trust. The irony is thick. Experts who honestly acknowledge the limitations of their ability to predict outcomes are shunned in favor of overconfident experts whose attitude can literally prove fatal.
We can mitigate the effects of the overconfidence bias by seeking data and reference points outside our immediate context rather than relying solely on our own expertise and judgment.
Our aversion to uncertainty hinders our design decisions by causing us to jump to conclusions rather than carefully weigh various design alternatives. This aversion is exacerbated by the confirmation bias, our tendency to seek information that reinforces what we already believe to be true, such as our belief that our design preferences are equivalent to sound design.
A second pair of cognitive biases further impedes sound design decisions. Our tendency to be too optimistic about the quality and likely success of our design coupled with overconfidence can cause us to overlook critical problems in our web and mobile designs.
We can counter these brain quirks by actively seeking problems, using visuals to literally demonstrate design flaws, seeking a counterargument, using Gary Klein’s pre-mortem technique, and conducting frequent usability tests to assess design changes before rolling out a site or app for customer consumption.
* “diagnosis antemortem”: Eta S. Berner and Mark L. Graber, “Overconfidence as a Cause of Diagnostic Error in Medicine,” American Journal of Medicine 121 (2008): S2–S23. Cited in Daniel Kahneman, Thinking, Fast and Slow.