Transcending the Right-Wrong Dichotomy

Orin Davis
Nov 19, 2017


Every semester, my business students struggle. They’re so used to having to provide the “right answer” in school that they hold back on tough questions for fear of getting them wrong, and they are totally flummoxed by the open-ended business case questions I give them because there is no “right” answer. In fact, most of the cases are written so that entirely contradictory answers can both be correct.

Though that seems impossible, the trick is to realize that frame, context, and how pieces of information are weighted relative to one another make all the difference in the world. One analyst looks at a business trend and discounts it in favor of a different one, while another looks at the same trend and emphasizes it because of the effects of a third one (not considered by the first analyst). One revenue graph tracks a 20-year trend, while another covers just five. One eminently competent CEO fits the culture of the company, while another, equally competent CEO doesn’t. A slew of factors affects whether an answer is “right” or “wrong,” and, in business, how would you know whether it was right or wrong anyway? After all, if the business fails, it turned out wrong even if everything was done right, and if it succeeds, it turned out right even if everything was done wrong.

It makes a lot more sense to break out of the right-wrong dichotomy and move to a system that provides a bit more clarity and a bit less judgment. When it comes to facts, Sam Arbesman doesn’t regret to inform you that they’re a lot more debatable than you think. Even numbers, which are often treated as the be-all and end-all of facts, are only as good as the tools that collect them. For instance, if you say that something is one meter long while measuring with a yardstick, you’re going to be a bit off (by about 0.0856 m, or 3.37 inches). Depending on how much leeway you need, however, it may not matter that your measurement is technically incorrect, and that gives us a clue about the dimensions along which we need to assess contentions.
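To make the leeway point concrete, here is a minimal Python sketch; the tolerance values are illustrative assumptions, not figures from the article.

```python
# Illustrative only: a "technically incorrect" measurement can still be useful.
METER = 1.0          # the claimed length, in meters
YARDSTICK = 0.9144   # what a yardstick actually measures, in meters

error = abs(METER - YARDSTICK)   # ~0.0856 m, about 3.37 inches

def is_useful(error_m: float, tolerance_m: float) -> bool:
    """A measurement is 'useful' if its error fits within the leeway we need."""
    return error_m <= tolerance_m

print(is_useful(error, tolerance_m=0.10))   # True: rough carpentry, close enough
print(is_useful(error, tolerance_m=0.001))  # False: precision machining, the error matters
```

The measurement is the same in both calls; only the context, and therefore the verdict, changes.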

Whether technically correct or not, what often matters more is whether the notion is useful. If the measurement is off by a bit but still useful, then we have what we need. A paraphrase of work by the statistician George Box posits that all models are wrong, but some models are useful. That is, they give us enough information to guide our thinking toward building or doing something effective. Of course, none of that helps when someone is trying to tell us that the moon is made of cheese, because that argument is not defensible. Frequently, we hear people make claims and statements that, upon investigation, turn out to be contentions they cannot support and that crumble with the slightest perturbation. Then again, we also run into people making technically solid arguments that are a colossal waste of our time, and thus not useful, even if they are defensible. This brings us to two axes of assessment: Useful and Defensible.

Most of the solid arguments that we encounter are both defensible and useful. They get us thinking in directions that lead to efficacious action. It’s not so much that they are fact-based as that they can be supported by one or more lines of thought with unequivocal logic (at least once all premises are accepted), and/or that stand up to rigorous assessment and challenge.

Some notions are pretty useful, like Freud’s Oedipus Complex, but they tend to fall short when examined in detail and challenged. Even though these contentions are a bit short on accuracy, they get us cogitating in effective ways and lead us to lines of thought that later yield solid arguments. In the spirit of Box, these models, though essentially wrong, are helpful analogies that build bridges between the realm of the unknown and the world of understanding and action.

Every so often, we encounter people who love to point out one technicality after another and bust out one fact (as it were) after another, in a seemingly endless array of details. At best, this is a fun exercise for trivia buffs; at worst, it is self-aggrandizement for a twit with a fragile ego. Either way, this sort of pedantry is not time well spent for most people.

Finally, we come across situations where people are giving us fuzzy information that sounds like the weird theories you see in the tabloids. It doesn’t help our thinking, and it actually tends to send us off on wild tangents. This sort of misleading nonsense ranges from a mind-numbing waste of time to a dangerous threat. When you find yourself facing misleading nonsense, unless it is provided by an adorable kid (in which case it is useful, if only for the “aw!” factor), don’t walk away; run.
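Pulling the four cases above together, here is a tiny Python sketch of the two axes; the labels are the ones used in this article, and the function itself is just an illustration, not anything from the original.

```python
# A small summary of the Useful x Defensible axes, using the article's own labels.
def classify(useful: bool, defensible: bool) -> str:
    if useful and defensible:
        return "solid argument"      # points us toward efficacious action
    if useful and not defensible:
        return "helpful analogy"     # e.g., the Oedipus Complex: wrong but generative
    if defensible and not useful:
        return "pedantry"            # technically correct, a waste of most people's time
    return "misleading nonsense"     # don't walk away; run

print(classify(useful=True, defensible=False))  # -> helpful analogy
```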

As noted above, however, even when you have a solid argument, the results can be mixed. Some solid arguments lead to effective results, while others can still lead to a crash-and-burn. There are always factors that we cannot control and details and caveats that we cannot take into account. Even though we could theoretically add an axis of efficacy to this system, the reality is that it would overcomplicate matters by demanding an impractical (if not impossible) level of precision and control. It is enough to say that solid arguments are the ones most likely to lead to effective results, and that helpful analogies are often a path to them (typically via solid arguments). In business, as well as in other aspects of life, having clear, defensible, useful reasons for what we do is often the best we can hope for, and it usually works.

Sure beats having to be right all the time.
