the problem when the evidence looks right

====

“It is good to get in the habit of considering if the measured improvements are truly an indication of an improved system or merely the result of distorting the system or the data.”

(Deming, I think)

====

You know what sucks? You have data, it looks like evidence of something, and it really isn’t, but people, desperate for proof, use it to make a decision. It sucks because (a) data and evidence feel like they should be easier to assess and (b) there will always be someone rushing to find proof of what they already believe rather than evidence of something we should know. As Tony Fish suggests: “Data can tell us from where we came, our models based on said data can suggest where we will go, but as yet, we cannot determine if we should.”

All that said. This business/data gauntlet runs face first into something called ‘asymmetrical updating’: the tendency to favor evidence that confirms our beliefs and to ignore or misread new evidence that does not. I will come back to the end of that thought later.

Oh. It also sucks because the reality, the harsh truth as it were, is there is never enough time to wait for more data once an opportunity, or even a problem, arises. Problems gain velocity on their own and opportunities diminish on their own. The longer you wait, the more likely something dies while you decide.

Yeah. It sucks.

Here is what I know.

If you look at how metrics develop over time, several things might happen:
Some will stay the same.
Some will go up and down <sometimes in extremes>.
Some go steadily up or down.

It’s the last category where our desire for evidence leads us astray. We lend higher meaning to two things moving in tandem, even more so to three, but all that really reflects is correlation, not causation. The unfortunate truth with our use of evidence is that we tend to infer causation from correlation when, in reality, the relationship is often simply coincidental.
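To make that concrete, here is a minimal sketch (plain Python with numpy; the metric names and numbers are invented for illustration, not anyone’s real dashboard). Two completely independent metrics that both happen to trend upward will correlate almost perfectly, and the “evidence” evaporates the moment you look at the changes instead of the levels:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two independent, made-up metrics that both steadily trend upward,
# e.g. "monthly revenue" and "monthly support tickets" (hypothetical names).
months = np.arange(60)
revenue = 100 + 2.0 * months + rng.normal(0, 5, 60)  # growth + noise
tickets = 40 + 1.5 * months + rng.normal(0, 5, 60)   # unrelated growth + noise

# Pearson correlation of the raw series looks like compelling evidence.
r_raw = np.corrcoef(revenue, tickets)[0, 1]

# Correlate the month-to-month changes instead and the relationship
# vanishes, because all the two series shared was a trend, not a mechanism.
r_diff = np.corrcoef(np.diff(revenue), np.diff(tickets))[0, 1]

print(f"correlation of raw series: {r_raw:.2f}")   # very high
print(f"correlation of changes:    {r_diff:.2f}")  # near zero
```

Nothing connects the two series except time, yet the raw correlation is close to 1. That is the shape of “evidence” that steadily-trending metrics hand you for free.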
That said. I always get a nagging feeling when any hypothesis conveniently fits the data. It feels unreliable despite looking quite reliable. I sometimes refer to this as “the even face of evidence.” It’s like a seamless flat surface with nothing to get a grip on. Evidence shouldn’t be neat and tidy; it should be, well, “the uneven face.” Why? Information, even data, is usually unequally distributed (there is an asymmetry of information), and that unevenness creates what Brian Christian in The Most Human Human calls “holds.”

** note: “as akin to an indoor rock-climbing gym’s “holds” – those bright jewel-tone rubber blobs that speckle the fake rock walls. Each is both an aid to the climber and an invitation onto a certain path or route along the ascent.” (The Most Human Human, p. 181)

Therefore, patterns become what is important, because it is patterns that identify the variations that matter as well as the non-variations that matter – to climb or choose a path. In addition, pattern recognition offers the evidence to get better “results” by improving the system <not a specific tactic>. What I mean by that is our biggest misuse of evidence is using it to chase a better result, i.e., to improve what is measured <a causal attempt>. This is doubly damaging in that (a) in systems, improving one part does not improve the whole and (b) doing this over time will begin to narrow-cast data in order to improve specific data, limiting the true scope of what ‘better’ can be.

What we do more often than not is distort data and evidence to meet the needs of the measurement rather than use the data and evidence to improve the system as a whole <which may beget different things to measure>. Evidence demands that we understand the limitations of the specific measures we use and the data we select. We should be constantly trying to improve the system but, well, that’s not the way business works. 99% of the time measurements are set up to assess performance and outcomes, and people get paid to perform and generate outcomes. The measurements are tied to incentives. Evidence is then viewed through the bias of whatever I am incentivized by.

Anyway.

Using data effectively requires understanding how measures capture the actual state of the system, and understanding the limitations of those specific measures as reflections of the system as a whole. But that also means understanding the data itself. Let’s be clear. No business lacks data. If anything, they have more than they know what to do with. So, what any sane business does is compress their data into usable forms.

In technology there are two types of compression: “lossless” and “lossy.” Lossless means nothing is lost or compromised in the compression; the original can be reconstructed in its entirety, with no detail missing. Lossy means you lose some data, or some level of detail, as the cost of compression. Data evidence, particularly in a typical business environment, is exactly the same. I would be remiss if I didn’t point out that any compression relies on some bias: computers, like people, seek the patterns they expect, or at least patterns resembling what they have seen before. So, compression will arc away from the unexpected.

Compression doesn’t encourage what I call “evidence drift.” I made up that term mostly because business hates drift, particularly if it feels time constrained. Under the guise of “highest probability given the information we have at hand,” drift doesn’t meet the constraints of the context nor the desire of the decision makers. That said. Evidence drift is actually where the answers tend to reside. I say that because data compression zeroes in on “what to do in the present” while evidence drift gives signals to help with “what do we need to do for the future.” I am certainly torturing good data science practice and distorting how things get done, yet I am summarizing what most businesses dependent upon a lot of data to make decisions actually do. Time compression plus data compression is a bad equation for decision-making.
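For the literal-minded, a minimal sketch of the two kinds of compression (standard-library Python only; the “readings” are invented numbers standing in for some daily metric):

```python
import zlib

# Hypothetical daily metric readings, as raw text. One value (97.14)
# doesn't fit the pattern -- that's the "hold" worth gripping.
readings = "103.91,104.02,103.88,97.14,104.10,103.95\n" * 200

# Lossless: zlib round-trips the data exactly; every anomaly survives.
packed = zlib.compress(readings.encode())
assert zlib.decompress(packed).decode() == readings  # nothing lost

# Lossy: summarize each reading by rounding to the nearest ten.
# The broad picture is preserved, but the unexpected value is gone.
values = readings.strip().replace("\n", ",").split(",")
lossy = [round(float(x), -1) for x in values]

print(len(readings.encode()), "->", len(packed), "bytes, lossless")
print(sorted(set(lossy)))  # [100.0] -- the outlier no longer exists
```

The lossy summary is smaller and tidier, and the one reading that didn’t fit the pattern is exactly what it threw away. That is “arcing away from the unexpected” in two lines of code.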

But data compression has some structural repercussions.

The business uses the compressed evidence to make the system more efficient – over and over again. Evidence drift, which could actually permit the business to drift into a more effective space, gets ignored – over and over again. The problem is that when the evidence looks right, evidence is acting responsibly, and irresponsibly, at exactly the same time.

Which leads me to information.

Information is a stabilizing buffer. In the good old days, most buffers were physical and unwieldy; in the here and now, information is a leverage buffer. It protects against crisis and permits advancing to meet emergent opportunities. It actually buffers against uncertainty as well as unhealthy asymmetry (when disruption affects the system). I am not sure it needs to be said, but evidence is information. But to be effective, that evidence information must shed delays entering the feedback/decision-making loops (both positive and negative). I say this because no one person can ever hold all the information and, currently, most knowledge/information banks are a bit unwieldy for true business agility. I would be remiss here if I didn’t point out that our unhealthy love of agility only encourages us to misuse evidence <placing too much confidence in the wrong things too quickly>.
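Why delays matter so much is easy to show with a toy model (pure Python, made-up numbers, not a claim about any real business): a simple corrective feedback loop where the only thing that changes between runs is how stale the evidence is when it enters the loop.

```python
# Toy model: steer a metric toward a target with negative feedback,
# but base each correction on evidence that is `delay` steps old.
def run(delay: int, steps: int = 40, gain: float = 0.6) -> float:
    target, value = 100.0, 60.0
    history = [value]
    for _ in range(steps):
        stale = history[max(0, len(history) - 1 - delay)]  # old evidence
        value += gain * (target - stale)  # correct off the stale reading
        history.append(value)
    # How far from target are we still swinging near the end?
    return max(abs(v - target) for v in history[-10:])

for delay in (0, 2, 4):
    print(f"evidence delay {delay}: late-run error ~ {run(delay):.1f}")
```

With no delay the metric settles smoothly onto the target. With modestly stale evidence the same rule overshoots and keeps oscillating; make the evidence staler still and the loop diverges outright. The corrective logic never changed, only the age of the evidence feeding it.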

Evidence needs to be a readily accessible inventory from which people can select patterns of information and test out new ones. I should note that the source of power in this system is actually not the evidence, but human imagination and creative thinking. Anyway. The objective of any business should be to deliver evidence where it did not exist before (in the system and the system of people), causing people to act, and act differently, so that the system itself acts differently. Missing information, missing evidence, or the misuse of either, is more likely than anything else to be the cause of system failure (or of complications within a normal complex business).

Which leads me to the fallacy of ‘do the best thing.’

This is actually about complexity, consequences and repercussions. The need to consider the consequences of your actions also means you need to consider future mistakes. This can sound mind-numbing. Almost like a bad game of “what ifs.” It’s not. The truth is a decision today actually depends more on what you do next than on what you do now. All the initial decision does is start the game <and the game, if it plays itself out, will never take into consideration the consequences and repercussions you desire>. Circling back to ‘doing the best thing.’ The best thing may be available, but not available now. But good enough is available now. The best thing to do may actually not be to use the best thing. Ponder that. The reality is all decisions are flawed, nothing is guaranteed, and only truly shitty decisions are unfixable. And here is where ‘evidence that looks right’ rears its ugly head. Not all evidence is created equal, but we are more likely to use evidence as proof of the best thing to do now and for later rather than as evidence for ‘getting the game started.’ I admit I don’t know whether I am arguing that most business people do not understand complexity or that they simply do not know how to use evidence. My guess is it’s a little of both.

In the end.

Evidence breeds complacency. That may sound odd given how much data we have, how much time we invest in looking at data, and how much we seem to value data to craft evidence, but the reality is we are becoming complacent about its compression.

We become complacent with too much “good evidence.”

We become complacent with tidy answers.

All that complacency makes us less sharp and less in tune with “lossiness.”

All I can truly say is, evidence is your friend, until it’s not.

Written by Bruce