The critical performance of decisions

There is no doubt that the decisions we make form the cornerstone of any organisation’s performance.  Every action we take is prefaced by a decision on whether to take it.  Millions of decisions, big and small, which, when added up, form the shape and direction of our businesses.

How often, though, do we think about the process we used for those decisions and their impacts?  Whether the process was effective, unbiased (1), or even legal (2)?  How often have you considered the direct and indirect costs of those decisions, financial or otherwise?

I can say with a fair amount of confidence that you probably haven’t.  Why?  Because most people barely understand their own decision processes, let alone know whether they just made a good one or a bad one.

Meanwhile, the cost of those decisions goes unnoticed.  We give little thought to whether a decision should have taken 2 days or 2 minutes.  We see a decision made the next day as normal, even good, because that’s what we’ve become accustomed to.

Make a thousand such decisions across your organisation and the difference could be 2,000 days instead of just 2: 2,000 days spent waiting for decisions to occur.  Does that still seem normal? Or good?  And we haven’t yet asked how many of those decisions were even good ones.
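The arithmetic behind that comparison is worth making explicit. A minimal sketch, using the hypothetical figures from the text (a thousand decisions, each waiting either two days or two minutes):

```python
# Hypothetical figures from the text: 1,000 decisions, each waiting
# either 2 days or 2 minutes before being made.
MINUTES_PER_DAY = 24 * 60

decisions = 1000
slow_days = decisions * 2                    # 2 days each -> 2,000 days of waiting
fast_days = decisions * 2 / MINUTES_PER_DAY  # 2 minutes each -> under 2 days in total

print(f"Slow path: {slow_days} days of waiting")
print(f"Fast path: {fast_days:.1f} days of waiting")
```

The same thousand decisions cost three orders of magnitude more elapsed time on the slow path.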

Measuring Decision Performance

Making a decision is just a process.  A process that is applied to knowledge and information and that enables us to make a choice.  We don’t always understand the process or realise we’re following one, but it is a process nonetheless.

Because decisions are a knowledge process, the performance of those decisions depends on the quality and efficacy of that knowledge: knowledge needed for the decision, and knowledge of the right process to make it.  Poor decisions are made because we have gaps in that knowledge.

What we are looking for is a set of metrics, heuristics, or factors that help us to understand those knowledge gaps.  Like any process, we can do that by looking at the outputs, the inputs, and the process being applied.  In the case of decisions, we have identified these four factors to consider:

  • Did the outcomes and impacts of the decision match those that were anticipated?

Maybe your decision led to a positive outcome.  Or perhaps it didn’t and now your world is on fire.  Just because things didn’t pan out as you had hoped doesn’t mean the decision itself was bad.  The question is: did you know the outcome was possible when you made the decision?

If you didn’t, then, for good or bad, that’s a problem.  It means that somewhere, you had a gap in knowledge, information, or understanding of the problem at hand.

  • Were the factors that led to this outcome understood and identified at the time of the decision?

So you achieved the outcome you were expecting from your decision.  Was that because you understood how things were going to play out? Or did you just get lucky?  Even getting lucky doesn’t inherently mean a bad decision, if you knew what sort of odds you were playing.

Again, what we’re looking for is that gap in knowledge or information.  If we don’t fully understand the circumstances, then good or bad, we could have had another outcome entirely.

  • Was the decision process efficient and effective in establishing those factors and outcomes?

Whether you understood the outcomes of the decision or not, how effective were you in getting that information?  Did you have to wait for the information, or were there challenges in understanding it?  Did you spend time and effort obtaining information that, at the end of the day, had no bearing?

  • Was the method we used to reach our decision appropriate?

It’s one thing to have all the information you need, it is another to apply that information appropriately towards your decision.  Collectively as humans, we like to think we’re logical, rational beasts.  The truth is we’re biased and lazy.  Our choice in methods often reflects these natural human tendencies.

Ultimately, what we are trying to understand is, for a given decision, did we really know what we were doing? If not, then what can we learn from the experience?
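The four questions above lend themselves to a simple review checklist. A minimal sketch, assuming a yes/no answer per factor; the class and field names are illustrative, not part of any standard framework:

```python
from dataclasses import dataclass, fields

@dataclass
class DecisionReview:
    """One yes/no answer per review factor; a False flags a knowledge gap."""
    outcomes_matched_expectations: bool    # did outcomes match those anticipated?
    factors_understood_at_the_time: bool   # were the driving factors identified up front?
    process_efficient_and_effective: bool  # was establishing the information efficient?
    method_appropriate: bool               # was the decision method itself appropriate?

    def knowledge_gaps(self):
        """Names of the factors where the review found a gap."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

review = DecisionReview(True, False, True, False)
print(review.knowledge_gaps())
# ['factors_understood_at_the_time', 'method_appropriate']
```

Run across many decisions, the tally of gaps per factor points at where the organisation's knowledge is weakest.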

There is a corollary to all this:  Given the information we have, what kind of decision can we make?

As they say, “perfect is the enemy of good”: most decisions only need to be good enough for their purpose, not perfect.  Exploring these factors proactively can help us get a handle on what “good enough” means.  It also means we can design our decisions to be more effective, more efficient, or just better tailored to our circumstances.

Autonomous Systems and Decisions by Design

If there is a need for a specific framework for decision performance, it is in our pursuit of autonomous systems.  Why? Because (perhaps obviously) there is a reduced role for humans in the loop.  We lose the tacit knowledge a human provides of what is going on within that decision and, more importantly, the ability to intercede if it has gone off the rails.

The system itself must be capable of assessing its performance and acting accordingly.  Alternatively, there must be sufficient information and control in the system for a human to do it.  The more critical the system, the more likely the answer is both (3).

Human or otherwise, what we end up with is a control loop that sits on top of our decision process.  A process in and of itself, it takes events from our decision process and other external sources and translates them into controlling actions for that decision.  In some circles, this capability is called Closed Loop Automation Management or Closed Loop Performance Management.
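That control loop can be sketched in a few lines. A hedged illustration only: it assumes the loop watches each decision's elapsed time against a budget and escalates to a human when the budget is blown — the event shape, threshold, and action names are invented for the example:

```python
def closed_loop(events, time_budget_days=2):
    """Minimal closed-loop management sketch: take events from the decision
    process and translate them into controlling actions."""
    actions = []
    for event in events:
        if event["elapsed_days"] > time_budget_days:
            # The decision has gone off the rails: intercede.
            actions.append(("escalate", event["decision"]))
        else:
            actions.append(("proceed", event["decision"]))
    return actions

events = [
    {"decision": "vendor selection", "elapsed_days": 14},
    {"decision": "reorder stock", "elapsed_days": 0.1},
]
print(closed_loop(events))
# [('escalate', 'vendor selection'), ('proceed', 'reorder stock')]
```

A production loop would watch more than elapsed time — outcome quality, information gaps, and cost — but the shape is the same: events in, controlling actions out.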

Our ability to drive better decisions is limited by this management process (4).  If we want better decisions, we need the capability to design and review our decisions like we design and review any process.  Not just to streamline the decision process, but to ensure we have captured and applied the knowledge needed to make better ones.

The difference in performance between a well-designed decision process and a bad one can be significant.  NASA was able to reduce mission design times from 3-9 months to a matter of days by ruthlessly cutting decision waste from their design processes (5).  Our own experience suggests targeting decision waste can double the productivity of most organisations without breaking much of a sweat.

In Short

We make thousands of decisions a day without giving them much thought.  There is a long tail of little decisions that seem simple enough but, when added up, amount to days or months of potential lost productivity.  Then there are the bigger decisions, which could lead your organisation to fame and fortune, or to total disaster.

Each one of those decisions deserves more respect than it is probably given.  It’s unlikely that you can review every decision your organisation makes.  You can, however, start to look at the kinds of decisions you make and the approaches you use to make them.

Automated decisions are likely to go through some sort of design process regardless.  Is there a set of principles laid out which will influence that design? Or are you leaving such choices open to interpretation and the whim of a programmer?

This is why we have become such advocates of a decision management plan: a document that sets out the best practices and principles for the kinds of decisions your organisation makes.  While not yet common in business, government agencies are starting to publish such guides and directives, particularly when it comes to automated decisions.  The Australian Government has made theirs publicly available (6).

What we do know is that one of the single most powerful metrics for assessing organisational performance is the ability to make a decision.  The question becomes: what kind of performance are you getting?  If you don’t know, then that’s a problem in and of itself.


1. Wilson, Benjamin, Judy Hoffman, and Jamie Morgenstern. “Predictive Inequity in Object Detection.” arXiv preprint, 2019.

2. Henriques-Gomes, Luke. “Robodebt: government to refund 470,000 unlawful Centrelink debts worth $721m.” The Guardian, 29 May 2020. [Online; cited 7 August 2020.]

3. Jackson, Scott. “Principles for Resilient Design – A Guide for Understanding and Implementation.” In IRGC, Resource Guide on Resilience. Lausanne: EPFL International Risk Governance Center, 2016.

4. Bernus, P. and L. Nemes. “Organisational design: dynamically creating and sustaining integrated virtual enterprises.” IFAC Proceedings, Vol. 32, No. 2, 1999, pp. 189–194.

5. Chachere, John, Raymond Levitt, and John Kunz. “Can You Accelerate Your Project Using Extreme Collaboration? A Model Based Analysis.” Stanford University / NASA, 2003.

6. Australian Commonwealth Ombudsman. “Automated decision-making better practice guide.” [Online; cited 7 August 2020.]

“Head in Hands” courtesy Alex E Proimos
