=================
‘Optimization matters only when it matters. When it matters, it matters a lot, but until you know that it matters, don’t waste a lot of time doing it. Even if you know it matters, you need to know where it matters. Without performance data, you won’t know what to optimize, and you’ll probably optimize the wrong thing.’
Joseph M. Newcomer
==============
An optimal decision is a decision that leads to at least as good a known or expected outcome as all other available decision options. It is an important concept in decision theory. In order to compare the different decision outcomes, one commonly assigns a utility value to each of them.
==============
“Premature optimization is the root of all evil.”
Donald Knuth
==============
Google ‘navigating uncertainty’ and you get: 18,400,000 results in 0.43 seconds with the top result being “8 steps to navigating uncertainty.”
Google ‘navigating certainty’ and you get: 4,770,000 results in 0.42 seconds with the top result being “Navigating Chaos: How to Find Certainty in Uncertain Situations.” I will note that only the first five results actually have to do with certainty.
While there is an endless supply of information, and disinformation, on living in an uncertain world (179,000,000 results in 0.57 seconds), managing uncertainty (153,000,000 results in 0.60 seconds) or ‘the future of work will be about navigating uncertainty’ (46,000,000 results in 0.44 seconds), there seems to be less discussion on, well, the certainty side of the ledger. We ignore the possibility that uncertainty is ‘the normal’ in our pursuit of a much more comfortable belief — that things are certain. That seems absurd, if not counterproductive.
Maybe we have it all wrong. Maybe we should be talking about navigating certainty. Maybe everything is uncertain and focusing on situational certainty is the key to ongoing success. Maybe business, and business decision making, is grounded in lily pads of certainty from which one seeks to identify probabilities. In other words, minimizing uncertainty and maximizing certainty. That, in a nutshell, is optimal certainty.
That said.
If there is one thing a pandemic has taught people, it is that there is more uncertainty than certainty in the world, in life and in business. This has been a shock to the grander system of thought because, well, 99% of people ground their life on some sense of certainty and 99% of businesses ground their business in the concept of certainties. Rituals, process, systems, 9-5, best practices, on & on & on. Business creates an illusion of as much certainty as possible. Now. They do so mostly with good intentions – planning, predictability, efficiency. Productivity. Unfortunately, these good intentions not only inevitably create fragility and less-than-optimal productivity but also find themselves constantly running into that constant companion of business – uncertainty. Business loves certainty and hates uncertainty to such an extent (uncertainty often takes on mythical proportions in an imagined future) that business disproportionately creates an illusion of certainty. Suffice it to say, in today’s business world normalcy (or certainty) is an ongoing quest for every business and its people.
The allure of certainty is easy to understand, but there is an underlying, less-recognized appeal: “compounding privileges.” There is an inherent belief that those who actually have more certainty, in systems & beliefs, are rewarded with ‘compounded privileges’, i.e., structurally they will benefit more. They see certainty as an enabling constraint, not a reductive constraint, let alone one associated with increased risk.
** note: I appropriated the term “compounding privileges”
So.
I say all that to suggest maybe we are getting all of this wrong. Maybe we, in business, should assume everything is uncertain and purposefully seek lily pads of certainty?
Makes you uncomfortable, no?
So let’s talk about the uncomfortable:
what if we just assumed that everything is uncertain and lived life and conducted business based on this?
what if, by assuming everything is grounded in uncertainty, we then sought moments of certainty, what I often call ‘lily pads’, on which to find some stability within all that uncertainty?
The ‘comfort’ relationship between certainty and uncertainty is fairly consistent. In a normal environment, where we believe uncertainty is situational, we simply seek certainty. However, in a highly uncertain environment we seek extreme certainty. We do this because uncertainty often takes on geometric proportions of immense imaginative size. The only way, mentally, we seem to deal with this monstrosity is to embrace our hero – certainty. Of course, this all has to do with risk. Okay. It has to do with ‘risk feelings.’ Because the truth is that everything is a risk; we just choose not to see it as such when we convince ourselves we have attained some ‘certainty.’
Let me begin with where any discussion of optimization and uncertainty/certainty begins: we live in a binary-loving world in which we believe compromise isn’t good because anything less than the binary solution is not only ‘lesser than’, but wrong.
This is all, well, unfortunate because the world is complex, binaries are actually in a significant minority and compromise is more often reflective of optimal than it is less-than-optimal (because it actually contributes to the resilience of the choice).
This is all, well, unfortunate because business is complex and, yet, far too often sees strategy in a linear way when in reality it is more often like figuring out ways to find solid ground amongst some mud, quicksand, water & dense forest. At times, not all the time mind you, you look to maintain a minimal viable business or “there is no optimal way to do business, but there are viable enough ways to keep going.”
Which leads me to the relationship between certainty and statistics, data and analysis.
I would argue our capacity, potential, is defined by maximizing optimal certainty.
Too much uncertainty and clarity is elusive (and clarity is the energy for progress). Too much certainty and clarity actually blinds you (it ceases to be true clarity and becomes mere simplistic focus).
*Note: Donald Ervin Knuth, The Art of Computer Programming, Volume 1: Fundamental Algorithms and “Paradoxes of probability and other statistical strangeness” offer some interesting thoughts on statistics, data, analysis and optimization.
Objectivity is often dependent upon a variety of things: the ability to avoid simplistic-looking solutions, attitude toward risk, curiosity (and the right type of directed curiosity), attitude toward what is possible versus impossible and, of course, the ability to force your way through personal bias and to view things from multiple perspectives (conceptually & abstractly).
That said. I would suggest true objectivity has to be a strategy in order to address possibly the greatest enemy of objectivity: overfitting. While overfitting is a data analysis term, it’s also a human bias. I would also suggest overfitting is the greatest enemy of conceptual thinking (which would then make it the greatest enemy of a Conceptual Age organization).
In computer science, the question of how to create programs that learn from past experience is at the heart of machine learning. Machine learning can arc toward increased performance or poor performance as it treads the delicate tightrope between underfitting and overfitting.
Simplistically, this is exactly the situation the decisionmakers of the Conceptual Age organization are faced with. In fact, I would argue that, because of the importance of conceptual thinking, which is inherently abstract, the battle between underfitting, appropriate fitting and overfitting will define who wins the future war in the future of work.
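To make that underfitting/overfitting tightrope concrete, here is a minimal sketch with invented data (nobody’s production method, just an illustration): fit the same noisy pattern with a straight line, a modest curve, and a wildly flexible polynomial, then see how each does on data it has never seen.

```python
# Minimal illustration of underfitting vs. overfitting on invented data.
# A straight line is too rigid (underfit), a degree-11 polynomial chases
# the noise (overfit), and a modest curve tends to generalize best.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(42)

def make_data(n=12):
    x = np.linspace(0.0, 1.0, n)
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.25, n)  # true pattern + noise
    return x, y

x_train, y_train = make_data()
x_test, y_test = make_data()  # fresh noise draw: the "future" the fit never saw

for degree in (1, 3, 11):
    p = Polynomial.fit(x_train, y_train, degree)  # fit on the past
    train_err = np.mean((p(x_train) - y_train) ** 2)
    test_err = np.mean((p(x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: train error {train_err:.3f}, test error {test_err:.3f}")
# The degree-11 fit typically posts the lowest train error and a worse
# test error than the modest fit: certainty about the past, fragility
# about the future.
```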
That said. Overfitting is one of the banes of business – it is a reflection of the ongoing battle between certainty & uncertainty. It is actually the response of people who do not understand probabilities and fear explaining uncertainty, randomness and the occasional quirkiness of anything to do with people.
The world is complex. Humans are complex. Life is complex. Work is complex. Business is complex. We deal with inherent complexity every single day. It’s naïve to think we can solve complexity, nor is it smart to think we should. Complexity is expansive and actually a force for good. Far too often we use overfitting not just to explain complexity, but to simplify it.
Overfitting, simplistically, is the human desire to find parallelism rather than analogies.
Note: obviously, if an organization is going to be dependent upon a foundational network of algorithms and technology, the algorithms & technology need to be discussed. This means ethics and accuracy. We need to begin somewhere and, as I stated in 2010 when discussing a global education initiative I designed, the UN Universal Declaration of Human Rights is as good a place to begin as any. For some great thinking on technology & ethics I highly recommend both The Ethics Centre’s “Ethical by Design: Principles for Good Technology”, a 45-page manifesto, as well as “Why we need more Human Algorithms”, a podcast with Flynn Coleman and Mike Walsh.
I would suggest the pathway out of the underfit/overfit trap is the pursuit of ‘optimal certainty.’
the value of optimal certainty:
We deal with inherent complexity every single day. Once again. It’s naïve to think we can solve complexity, nor is it smart to think we should.
Additionally, reductive simplicity, eliminating detail, is not the solution to complexity. Removing detail not only strips away the richness complexity offers, but also robs people of understanding.
As Jackson & Jackson suggested in How to Speak Human, “the challenge is confusion.” Confusion complicates (and while complexity is good, complications are bad). Confusion creates cognitive load and cognitive load is not good.
** Note: It is interesting that IBM actually has something called “IBM Decision Optimization”, which is a family of prescriptive analytics products that combines mathematical and AI techniques to help address business decision-making.
That said. The truth is that a reasonably good decisionmaker will not make a grossly inefficient decision. Sometimes optimizing certainty is easy, sometimes hard. Sometimes optimizing flies in the face of certainty, and sometimes it requires you to find some certainty within what can often look like a big blob of chaos (uncertainty). I will note that, in my experience, without exception, no one has ever been able to predict or analyze without data. I say that because it would become quite easy to begin thinking optimal certainty is all about instinct; it is not. You have to incorporate some sense of logical decision-making. Logic in that we seek to exclude <or marginalize> emotions and try to use rational methods <perhaps even mathematical/statistical tools> with the intent to isolate what is typically called the decision utility.
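For what ‘isolating the decision utility’ can look like in its simplest rational form, here is a minimal sketch (the options, probabilities and utility numbers are all invented for illustration): assign each outcome of each option a probability and a utility value, then compare the probability-weighted totals.

```python
# A minimal sketch of isolating a "decision utility": assign assumed
# probabilities and utility values to each outcome of each option,
# then pick the option with the highest expected utility.
# The options, probabilities, and utilities below are hypothetical.

options = {
    "launch_now": [(0.6, 100), (0.4, -40)],   # (probability, utility) pairs
    "wait_a_quarter": [(0.8, 60), (0.2, -10)],
}

def expected_utility(outcomes):
    """Sum probability-weighted utilities for one decision option."""
    return sum(p * u for p, u in outcomes)

if __name__ == "__main__":
    scored = {name: expected_utility(outs) for name, outs in options.items()}
    for name, score in sorted(scored.items(), key=lambda kv: -kv[1]):
        print(f"{name}: expected utility = {score:.1f}")
    # Here "wait_a_quarter" (46.0) edges out "launch_now" (44.0): the
    # 'optimal' choice under these assumed numbers, not a certainty.
```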
—-
Worth holding onto: ‘marginal units’ and the economic concept of marginalism. Prices don’t correspond to the total value of all goods in existence but rather to the marginal unit (the value of the next unit). A tiny numeric sketch follows this aside.
—-
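Here is that sketch, a minimal illustration of marginalism with invented numbers (the geometric decay is purely an assumption for demonstration): the value of the next unit falls as you hold more, and it is that next-unit value, not the accumulated total, that tracks the price.

```python
# Hypothetical illustration of marginalism: value is set by the next
# (marginal) unit, not by the total value of everything already held.
# The diminishing-value numbers below are invented for illustration.

def marginal_value(units_already_owned, base_value=100.0, decay=0.5):
    """Value of the NEXT unit, falling geometrically as you hold more."""
    return base_value * (decay ** units_already_owned)

if __name__ == "__main__":
    total = 0.0
    for n in range(5):
        mv = marginal_value(n)
        total += mv
        print(f"unit {n + 1}: marginal value = {mv:6.2f}, total value so far = {total:6.2f}")
    # The 'price' a buyer will pay tracks the marginal value of the next
    # unit (here, unit 6 at ~3.13), not the much larger total value.
```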
That is not to say instinct does not matter, but “good instinct” (intuition) is actually a function of experience, not just ‘gut.’
** Note: Thoughts on instincts and intuition.
Optimizing certainty:
If you accept the fact that the environment is uncertain and one seeks certainty within it, you then also have to accept that the environment drives the optimization, not the solution (you cannot place a square-peg solution into a nonexistent hole no matter how hard you try). I will note optimization is challenging because the brain defaults to what is perceived as the simpler option if there appears to be only a small difference in perceived outcomes.
Optimization actually demands some choice architecture (the impact that our environment can have on our decision making).
This leads me to a book called “The Power of Pull: How Small Moves, Smartly Made, Can Set Big Things In Motion”. Basically <so you don’t have to read the book> they suggest that business success can often be defined by excelling in “managing serendipity.”
“By mingling with many strangers you can find that you often bump into people who can give you valuable information. The authors’ core argument is surely right: today’s technology, especially the internet, is undermining the old top-down approach to business, which the authors call “push”, by giving individuals more power to shape their working lives. There are huge opportunities available to people who can figure out how to use the “power of pull”, a term the authors define as “the ability to draw out people and resources as needed to address opportunities and challenges.”
They propose a three-pronged pulling strategy. First, approach the right people (they call this “access”). Second, get the right people to approach you (attraction). Finally, use these relationships to do things better and faster (achievement).
The authors argue that change is both faster and less predictable than before, making traditional top-down planning trickier. And because things change so fast, knowledge is increasingly dispersed. Instead of drawing on a few, stable, reliable sources of information, managers today must tap into multiple, fast-moving, informal knowledge flows.
Which brings us to managing that serendipity funnel.
When knowledge is dispersed, you are less likely to find what you want via a formal search. You may not even know what you are looking for. But you are more likely than you were in the past to discover something useful through a chance encounter.”
The Economist
Now. ‘Serendipity’ sounds a lot like embracing chaos. I would suggest complexity, as part of natural order, does not seek chaos, but actually seeks optimal stability (lily pads of certainty/stability so the gyre does not fly off the wheel). This happens because systems are made of people, and people need some constants, some stability, some certainty, to hold onto. In business, optimal stability seems centered on something I call the ‘construct-to-agility’* ratio.
*agility is the ability to shift resources to meet needs (reading: The Cactus and the Weasel).
** construct-to-agility ratio: optimal certainty in which you find the perfect ratio of stability to freedom/chaos.
*** Note that this is not just a business definition, but also an individual behavior definition (time, energy, knowledge, emotion, need, etc.).
Regardless. Part of optimal anything is the ability to optimize speed & choice making, i.e., spend enough time on a decision that it is farsighted enough, but not so much time that there is paralysis, too much time has passed and you are still evaluating an array of choices – let’s say this is a version of ‘fast & frugal decision-making.’
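Assuming ‘fast & frugal’ here is meant in the Gigerenzer-style heuristics sense the term usually carries (an assumption on my part), a minimal sketch of what that style of decision-making looks like: check a few cues in order of assumed importance, stop at the first one that gives a clear answer, and ignore the rest. The cue names and thresholds are hypothetical.

```python
# A minimal sketch of a 'fast & frugal' decision rule: cues are checked
# in order of assumed importance; the first cue that discriminates
# decides, and the remaining cues are never consulted.
from typing import Callable, Optional

# Each cue returns "go", "no_go", or None (no clear answer, ask the next cue).
Cue = Callable[[dict], Optional[str]]

def cue_cash_runway(opp: dict) -> Optional[str]:
    return "no_go" if opp["runway_months"] < 3 else None

def cue_known_customer_demand(opp: dict) -> Optional[str]:
    return "go" if opp["signed_letters_of_intent"] >= 2 else None

def cue_team_capacity(opp: dict) -> Optional[str]:
    return "go" if opp["spare_capacity"] else "no_go"

def fast_and_frugal(opp: dict, cues: list[Cue]) -> str:
    """Stop at the first cue that gives an answer; default to 'no_go'."""
    for cue in cues:
        answer = cue(opp)
        if answer is not None:
            return answer
    return "no_go"

if __name__ == "__main__":
    opportunity = {"runway_months": 9, "signed_letters_of_intent": 1, "spare_capacity": True}
    print(fast_and_frugal(opportunity, [cue_cash_runway, cue_known_customer_demand, cue_team_capacity]))
    # -> "go": decided by the third cue, without weighing every factor at once.
```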
Optimal certainty decision-making demands assessing not just binary conflicting objectives, but multidimensional conflicting objectives AND conflicting challenges (vulnerabilities can become strengths in certain contexts).
“Constrained optimization is the art of compromise between conflicting objectives.”
William A. Dembski
I will note the secret ally of optimal certainty is optimal curiosity.
Optimizing the certainty/uncertainty equation demands relentless curiosity. Decision-making stimulated by understanding and the absence of understanding. Too much ‘understanding’ and you lose curiosity to explore beyond your own certainty and too little ‘understanding’ and curiosity seems like a vast empty space of uncertainty in which finding any stepping stones of certainty seems an impossible quest. But it takes a certain type of curiosity because some curious people seek answers (certainty) when they should be seeking ‘probabilities’ (or probable scenarios/consequences/outcomes).
“Certitude is not the test of certainty.”
Justice Oliver Wendell Holmes Jr.
Now. The obvious enemies of optimal:
- signal versus noise
The signal is the meaningful information that you’re actually trying to detect. The noise is the random, unwanted variation or fluctuation that interferes with the signal.
In exact terms, it’s a measure of how much useful “signal” there is in a thing over time, versus how much useless (and costly) “noise” must be removed or ignored. Signals have solutions; noise does not. All that said, discerning between signal and noise is part science, part art and 100% optimal certainty (a tiny numeric sketch follows below). We must be aware of the fact that the rational part of our brain desires certainty to such a degree that we will often hold on to whatever noise seems to have a solution attached to it, thereby ignoring the important signal because its solution is less obvious or less certain in its outcome.
Ponder.
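As promised, a minimal numeric sketch of the signal/noise idea, using the conventional signal-to-noise ratio (signal power over noise power); the sine-wave ‘signal’ and random ‘noise’ are invented purely for illustration.

```python
# A tiny numeric sketch of signal vs. noise, using the usual
# signal-to-noise ratio (signal power over noise power). The sine-wave
# "signal" and random "noise" here are invented for illustration.
import numpy as np

rng = np.random.default_rng(7)

t = np.linspace(0.0, 1.0, 1_000)
signal = np.sin(2 * np.pi * 5 * t)          # the meaningful pattern we care about
noise = rng.normal(0.0, 0.5, size=t.shape)  # random fluctuation laid on top
observed = signal + noise                   # what we actually get to see

signal_power = np.mean(signal ** 2)
noise_power = np.mean(noise ** 2)
snr_db = 10 * np.log10(signal_power / noise_power)

print(f"signal power ~ {signal_power:.2f}, noise power ~ {noise_power:.2f}")
print(f"signal-to-noise ratio ~ {snr_db:.1f} dB")
# A higher ratio means more of what we observe is the pattern, not the
# fluctuation; the hard (human) part is deciding which is which.
```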
- the proactive/reactive binary
I will not discuss scenario planning here, but rather sensemaking and choice making. Within complexity, seeking, and finding, optimal certainty demands a little pushing and a little pulling, a little linear and a little non-linear decision-making. It demands A LOT of ‘I don’t know’ and A LOT of probabilistic sensemaking. It demands simultaneously accepting that chaotic systems are inherently deterministic.
* Note: chaos theory: there will always be some uncertainty in the initial conditions, and it makes sense to characterize the behavior of a system in terms of its response to this uncertainty. Chaos theory holds that fully deterministic systems can still behave chaotically.
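A minimal sketch of that note, using the classic logistic map (a standard textbook example, not anything specific to this argument): the rule is fully deterministic, yet a tiny difference in the starting value eventually produces completely different trajectories.

```python
# A completely deterministic rule can still be exquisitely sensitive to
# tiny uncertainty in the initial conditions. The classic logistic map
# x_{n+1} = r * x_n * (1 - x_n) with r = 4.0 is used as the example.

def logistic_trajectory(x0: float, steps: int, r: float = 4.0) -> list[float]:
    """Iterate the deterministic logistic map from a starting value x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

if __name__ == "__main__":
    a = logistic_trajectory(0.2000000, steps=40)
    b = logistic_trajectory(0.2000001, steps=40)  # almost the same start
    for n in (0, 10, 20, 30, 40):
        print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  (gap {abs(a[n] - b[n]):.6f})")
    # The two runs agree early on, then drift completely apart: the rule
    # is certain, but the long-range outcome is not.
```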
Yes. This means that sometimes, sometimes, you are seeking optimal ‘chaos.’ What I mean by that is most business, well done, lurks at the edge of chaos. That is business.
Yeah. I just shared chaos and optimal.
Which leads me to the fact that chaos, and uncertainty, naturally makes people want to flex their ‘control’ muscles. This encourages proactive behavior. Unfortunately, we also, mentally, attach ‘predictability’ to our attempts at proactiveness, i.e., “because I was proactive, I was more likely to generate the desired outcome.” Well. It may or it may not. Context actually drives outcomes and proactiveness does not guarantee context <just movement, or progress, within a context>. That said. Being solely emergent-based, or reactive, can quite easily, and quickly, lead you astray chasing things (and it uses up a lot of energy also). It is delusional to be only proactive or only reactive. Optimal certainty can only be explored, and discovered, through both proactiveness & reactiveness. Choosing between the two is a false binary.
Ponder.
“Faced with a maze of causal influences, unable to trace all their interactions, the most we can do is focus on those that seem most revealing for our purpose and recognize the distortion implicit in that choice.”
Toffler
All that said.
I imagine the very last question is “can you always optimize certainty?” Yes and no. Optimization is not always a sacrifice of one thing for another, but more often than not we make it so. This means optimizing certainty is not about clever optimizations that have no meaning or actually LOSE meaning in compromise. What do I mean by that? Meaning can be lost in 2 main faux optimization techniques:
- Complication: in the pursuit of ‘balance’ a decision/solution is arrived at that is hard to develop, difficult to debug, and absolutely impossible to maintain.
- Simplification: in the pursuit of everything you arrive at nothing. Now, to be clear, that ‘nothing’ can look good, look smart and even look right, but more often than not in the pursuit of ‘optimal’ you squeeze out any of the value that could have been achieved. That’s the underbelly of simplification.
In either case, optimization is meaningless.
In conclusion.
William Ogburn said “we should admit into our thinking the idea of approximations, that is, there are varying degrees of accuracy and inaccuracy of estimate.” This flies in the face of what most business people seek – tighter certainty in the future. The entire concept that having an approximate idea of what lies ahead is better than having none is an anathema to most business people. They see an approximation as of equal value to knowing nothing. “That doesn’t do me any good” would be the conference room utterance while viewing such a thought. Which leads me to Alvin Toffler from his 1965 Horizon article “The Future as a Way of Life”.
“Perhaps we need special courses in ‘how to predict.’”
“How many of us, even among the educated public, understand the meaning of a random sample, or of probability, or of prediction by correlation? This does not mean all of us need to be statisticians or logicians or mathematicians. But the principles of scientific prediction can and should be grasped by every youngster who graduates from high school, for these are not merely the tools of scientific research, they are powerful instruments for dealing rationally with the problems of everyday existence.”
* Note: on a prescient note, Toffler also warned in the 1965 piece: “it would, of course, be foolish to oversell the ability of science, hard or soft, to foretell the future. But the danger today is not that we will overestimate this ability, but that we will underutilize it.”
Sound thinking is not in the purview of certainty, but rather of probabilities or optimal certainty. This is not to suggest that the future can actually be predicted as a single static future, but rather to offer an overarching territory which business quests can explore. Preciseness does not matter. In fact, preciseness is the enemy of adaptiveness (or makes one fragile, in Taleb’s terms).
‘People’s capacity for adaptation may have limits, but they have yet to be defined’
Toffler, 1965
Lastly.
Optimal certainty is always about balancing the grander narrative and the smaller stories within – the meta AND the mesa. In fact. I have always suggested we view, at least in business, the meta/mesa (or clouds/ground) discussion incorrectly. I believe the correct way to view it is that a successful business should have its feet in the clouds and its head on the ground.
Anyway. Years ago I wrote about ‘optimal newness’, which is enough ‘old’ (understandable/relatable) to provide comfort and enough new to make it interesting. Optimal certainty is the decision-making version of optimal newness – enough certainty for stability and enough uncertainty for adaptability.
In the end.
Maybe the future of work is not about navigating uncertainty, but navigating certainty. Becoming better at isolating the tenuous spots of stability and replicability, and optimizing them to the benefit of an uncertain world.
Oh. Maybe that puts an entirely different spin on the concept of ‘scaling’, but that is a different post for a different day.
===========================
author’s note:
this piece has been sitting in my draft folder since February 2020. it is actually a prelude to a thoughtpiece on the business model of the future – a 100% emergent organization grounded in what I call “The Conceptual Age” of business. this piece is incomplete, yet, completely full of thoughts. i apologize for its roughness but if you were to take one thing away i would suggest uncertainty is, and has always been, the normal context and we have been fighting that belief for decades.