
===
“To paraphrase someone smarter than me, who still knows nothing, the philosophical task of our age is for each of us to decide what it means to be a successful human being.
I don’t know the answer to that, but I would like to find out.”
Ottmer <the futurist>
===
“Imagine a cheap little device that isn’t just smarter than humans — it can compute as much data as all human brains taken together.”
===
Discussions about technology seem to careen between oblivious no-fear (a lack of belief that something like social media or an algorithm could “affect how I think”) and conspiratorial fear (government control, globalists, ‘the great reset’). And that’s before we even discuss something like a brain chip, an invasive introduction into mental enhancement. But there is a future lesson found in that fear binary. That lesson is that something like a brain chip will make the world binary and, objectively speaking, even more unequal.
Let me explain.
The people with the least understanding of technology but the greatest fears will reject the most progressive innovations, i.e., something like the brain chip.
The people with the most understanding of technology and the healthiest skepticism of ‘technology for good’ will accept the most progressive innovations, i.e., something like the brain chip.
Simplistically, just to make a point, the quasi-ignorant social media users will become dumber and dumber while the quasi-careful innovation adopters will become smarter and smarter. The latter will accept the risks, viewing the potential benefits as outweighing them, while the former will see only risks with little benefit. I would be remiss if I didn’t point out that this means one group of people will progress while another group will inevitably regress, which begets a societal issue.
I begin with those thoughts to frame what I see as the benefits of a brain chip (which I do believe is inevitable).
*** note: Michio Kaku believes it is inevitable and Elon Musk even has a company, Neuralink, attempting to design one.
Which leads me to say we shouldn’t dismiss fears, but should still embrace the better hope.
I believe it was Peter Thiel who suggested something like “there’s all these scenarios where the stuff can spiral out of control. I’m more scared of the one where nothing happens.” This may sound odd, but I believe we have been lulled into the belief that technology is progressing too fast. I say that because most technology innovation hasn’t really been that innovative. Most of it has been superficial, but highly accessible, so that there is a general appearance of fast-paced, high innovation. But most of the technological innovations haven’t truly moved the needle on truly meaningful things (though they have made a lot of money through consumerism). What this means is that our general view of ‘better’ has been skewed toward a fairly low bar. That becomes important when something like a brain chip appears because it is so far above the common bar it takes on all the appearances of science fiction, fantasy and, well, scary stuff – mostly dystopian. But here is where ‘better’ steps in: the higher the brain integration – verbal, visual, spatial and executive – the higher the intelligence. If a brain chip stimulates just that, well, that’s better. Better thinking by any one individual encourages better thinking within all people. And if enough people embrace the technology, it creates structural societal thinking lift. It is this version of better that will encourage a shitload of people to seek to augment themselves in terms of intelligence and skills. And those people will inevitably separate themselves from those who elect not to pursue this opportunity.
That said. Let me describe ‘better’ in my mind. It is clear that the human brain is incapable of coping with the onrush of new technology in the mosh-pit formation available on a day-to-day basis. As trite as it may sound, while we now have more information available to us than ever, overstimulation makes us dumber on a daily basis, if not unhealthier (depression, anxiety, etc.). Many of us clearly understand our brains just can’t keep up with technology-driven cognitive load. Research has clearly documented the influence of information overload on attention, perception, memory, decision making, and even regulating emotions. We have even seen that engagement with technology interferes with the pursuit of other behaviors critical for maintaining a healthy social contract. So, if an implanted chip is able to address many of our cognitive needs AND make us more effective thinkers, why wouldn’t we consider it? Why wouldn’t we consider augmenting our brain to better optimize it (not change it)? Maybe we should think of the brain chip as a kind of thinking companion for the brain. Try this thinking. Because this chip would be collecting real-time data on everything imaginable with regard to your brain physiology and sense-of-environment, it also optimizes your physical presence. You gain richer and richer datasets from which the chip can guide you so that you could function at your highest thinking and behavioral level. I imagine it actually could augment you to a new level. I would be remiss if I didn’t note I am discussing a closed-loop machine learning system: it is secure, designed to augment only you, and personalized to your data, as opposed to a one-size-fits-all system. However, this means the chip is on all the time (as is your brain). You have to accept the fact your brain chip is listening all the time – to everything (including your memory). What this means is that many things – memories, knowledge, faces, etc. – stored away on some dark, dusty shelf in your mind (meaning it has an imprint somewhere in your brain) can be activated by the chip. It takes away that nagging feeling you are forgetting something and brings it to the forefront at the right time. The chip activates a portion of your brain that says “hey, remember this/remember what happened/remember that person” and it activates images from the past, in relevant context, thereby heightening your level of attention in the present. The interesting thing about this particular idea is that the majority of us remember the things we like to remember and forget the things we like to forget. What that does is inherently bias your views and attitudes. The brain chip doesn’t permit this shortcut. It cuts in line in front of bias, even with the things you wanted to forget. To be clear. The chip I am discussing means you remember even the things you really do not want to remember – yeah, even the horrible stuff and the stuff you hate. That said. What this means is you use, better than in the past, what you already know and increase the depth of your decision-making and insight into what you are thinking.
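For illustration only, here is a minimal sketch, in plain Python, of the spirit of that closed-loop, personal-only idea: signals and memories stay local, the loop learns only from your own data, and a stored memory is resurfaced when the present context overlaps with the context in which it was recorded. Every class, tag, and data point below is invented for the sake of the sketch; there is no real brain-interface API behind it.

```python
# Hypothetical sketch of a closed-loop, on-device "memory resurfacing" loop.
# All names and data are illustrative; nothing here reflects a real device.
from __future__ import annotations
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class PersonalLoop:
    memories: list[tuple[set[str], str]] = field(default_factory=list)  # (context tags, memory)
    seen: Counter = field(default_factory=Counter)                      # personal-only statistics

    def record(self, context_tags: set[str], memory: str) -> None:
        """Store a memory together with the context it occurred in (stays local)."""
        self.memories.append((context_tags, memory))
        self.seen.update(context_tags)

    def surface(self, current_tags: set[str]) -> str | None:
        """Bring back the stored memory whose context best overlaps the present."""
        best = max(self.memories, key=lambda m: len(m[0] & current_tags), default=None)
        if best and best[0] & current_tags:
            return best[1]
        return None


loop = PersonalLoop()
loop.record({"office", "maria", "deadline"}, "You promised Maria the draft by Friday.")
loop.record({"gym", "tuesday"}, "Left your headphones in the locker.")
print(loop.surface({"maria", "hallway"}))  # -> the promise, resurfaced in context
```

The point of the toy is simply that the data never leaves the loop and the value comes from contextual recall rather than from any shared, one-size-fits-all model.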
I imagine the next level would be if the brain chip could actually connect with the world wide web and supplement, or complement, what you already know to round out your knowledge. And it does so contextually (which, as we already know from education, deepens learning). That’s next-level thinking.
Which leads me to security and privacy.
Some people will never get over their fear of information being stolen, and the fears will only increase with a brain chip because it becomes even more intrusively personal. That said. The adopters recognize that within an increasingly complex world, to keep identity safe and secure – from a personal identity standpoint as well as the identity interface to the things we own and have – the way to save identity is to actually lean in on technology. Insert a ‘yikes’ here. Yeah. Hear me out. While I have a couple of ideas on how to do this, I tend to believe an implanted chip is the best way forward <for a variety of reasons>. Every person could simply have a tiny chip implanted that permits a computer, or scanning system, to read a personalized code broadcast by the chip. And while that may sound vulnerable to hacking or copying, there are a variety of means and authenticating systems which actually protect us. For example, both Google Authenticator and Blizzard’s official authenticator use the open-standard TOTP for authentication codes (although differently). Google uses 6-digit codes while Blizzard uses 8-digit codes, but the real idea I offer is that your personal identity algorithm, because it is implanted, can be tied to your biology which, well, cannot be stolen.
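To ground that, here is a minimal sketch of how a TOTP code (the open standard mentioned above, RFC 6238) is actually derived from a shared secret and the current time. The secret below is purely illustrative, not a real credential; real authenticators differ mainly in policy such as code length (Google’s 6 digits versus Blizzard’s 8) and time step.

```python
# Minimal TOTP (RFC 6238) sketch: HMAC-SHA1 over a time-step counter,
# then dynamic truncation (RFC 4226) down to a short numeric code.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period               # time steps since the Unix epoch
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


# The same secret yields a 6-digit or 8-digit code depending only on policy.
SECRET = "JBSWY3DPEHPK3PXP"  # illustrative base32 secret, not a real credential
print(totp(SECRET, digits=6))
print(totp(SECRET, digits=8))
```

The security comes from the shared secret never being transmitted; only the short-lived derived code is, which is the same property an implanted identity chip would have to rely on.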
Which leads me to creating a good brain chip.
Algorithms really aren’t capable of thinking. They are simply yes, no; A, B; limiting, reductive, not expansive. Well. That is, before the brain chip. A brain chip augments our complex brain so we see in patterns, not binaries. That said. We should simultaneously mistrust brain chip technology and still recognize that it can augment the superiority of the human brain. It doesn’t circumvent our choice-making; it actually increases our choice base, our consideration set, so that our responses are more well informed and more well formed. It could permit all of us to sift through stacks of information which until now we were simply cognitively unable to do – or do effectively. A good brain chip makes us more effective human beings. It makes us optimize the present and increases our sensemaking skills (which cascades into better choicemaking). I could even argue it just makes us better people in that we become more well informed and a bit less biased. And if that’s the benefit, maybe it is worth the risk. To be clear. I would gladly take a brain chip. Heck. I would take a health chip implant that could possibly optimize my health. The benefits of ‘optimizing’ my brain and body clearly outweigh the possible risks – at least in my mind. And I end with that thought because that is the decision everyone is going to have to inevitably make, and I worry that the decisions will only amplify societal divides. But. That is a piece for another day. Today we should all be pondering whether we would put a technology chip in our brains.





‘things’ behind and ‘starting anew’ as if you completely throw out the old and start with a clean slate <which sounds good but is not really possible>.




Cause and effect is an easy thing to grasp, and I wonder why managers forget it. Maybe it is because we seem to often get caught up in the “blame game” versus the “teaching game” (probably because of the alliteration). Or maybe we get caught up in the complexity narrative and begin thinking there is no cause for any of the effects happening. Either of those two beliefs is less than useful if you want to foster an effective business.
===
Should people have to escape, and is “awhile,” if you actually do need to escape, even a worthy objective? I imagine the answer to the latter is obvious. It isn’t. It’s not an escape; it’s simply a break. But this isn’t about what is a worthy break and what isn’t; this is about why we believe we have to escape at all.

First. Let me say I am
try and try and then claim it has been measured. I would suggest we do this as part of some devious command and control ideology. What I mean by that is we
Most businesses fear unmeasured learning not because of wasted time, but because of wasted efficiency. What I mean by that is business fears anything that could create a complicated and time-consuming process that less-than-efficiently stitches together all the necessary knowledge/data to decide or do something. The fear is that reality is vague if there are no numbers to create an outline to see (and business fears vagueness). The fear is that any actionable learning is too late to make the optimal impact on financial performance. Look. Learning shouldn’t be judged on efficiency, only effectiveness. Learning has no need for logic other than that learning is good, and therefore learning has no need for measurements other than “am I consistently providing an environment which encourages people to pursue learning?” I know that sounds like heresy in a business world religiously attached to measurement. I think of “intelligence” as having less to do with “knowing a bunch of stuff” and more to do with figuring stuff out in new and uncertain situations, but that skill is only developed by actually being in uncertain situations full of unknowns. So maybe measurement should be reflective of ‘effective navigation’ (financial performance is an outcome of this done well consistently).

Which leads me to talk about learning by lurking again because of this article in Neuroscience News, 
I am certainly not suggesting a global mind, nor is the intent to create a “global mind” or even a “global society.” I am discussing the benefit of a global collaborative education; therefore, I am discussing collaboration as an extension of having a global perspective in solution-seeking.
In my eyes the value of an educational web world is that it permits a child to regularly place themselves in unfamiliar situations, or with unfamiliar people, and provides the opportunity to be exposed to ideas and views that they’ve not been exposed to before. And all of this provides an opportunity for real-life evidence/knowledge to challenge existing certainties – and open the way for curiosity.
then higher creative achievement and productivity are accomplished. Healthy cross-functional teams working in concert for the greater good eventually translate into efficient operations, regardless of whether in an academic, work, social or home environment. In the end, society benefits from groups performing productively with one another. Of course, teaching the basics of all of this at the preschool age means a greater likelihood of kids continuing positive collaboration abilities as they progress in life. And I do believe that the structure of web-based schools/schooling, in which children as young as five or six can express their opinions, share thinking & ideas and ultimately propose their own solutions, creates a solid foundation for a ‘community individualism drive/intent’ generation of citizens. It is this kind of attitudinal construct which offers unlimited opportunities for leadership and engagement. And it is this type of education structure which ensures that, by the time those five- and six-year-olds reach an appropriate graduation age, they will have a profound and enduring understanding of what it means to be in a collaborative society and the ability to contribute within their own community as well as at a global level if given the opportunity.
It was Alvin Toffler who said “the illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn.” In other words, literacy is an ability to absorb learning, and understanding, and adapt. So maybe you don’t do it at ‘speed,’ but you are an evolving understander. And this is where speed comes into play. I do not believe the world is actually moving any faster than it ever has, even in business, but I will say that the faster you understand something, the faster you can do something. So the only reason the world may appear to be moving faster is that, with the ubiquity of most information, the ones who ‘speed understand’ just move faster. Circling back to my opening question, the reality is that their intelligence is contingent on the environment.
===
Assumptions, as they change, are often like tectonic shifts (without the earthquakes). Unseen and unfelt, a paradigmatic shift creates a fundamental change in the way that something is understood or approached. It is not simply an incremental change, but rather a change in the underlying assumptions or theories that form the basis of how we see, and what we believe, about things. These shifts have far-reaching implications for society and how we thrive, or struggle, within that society. Circling back to a prior point, if duration expectations are affected, the general sense for an individual is a lack of control and chaos. That said. The majority of assumptions are found below the water, not above, and 99% of the time what is above water gives very little indication of what is truly below the water. The majority of people will scan what is floating around and assess that way. The more thoughtful want to know at least something about the parts they cannot obviously see. And the most thoughtful are interested in everything they cannot see, even if it takes a lot of time and it is less than simple. I could argue that in life or in business what we actually do is spend a shitload of time focused solely on the assumptions we can see, so we are often late to see the assumptions below changing.
I don’t care if you read this as thinking shifts, belief shifts, attitude shifts or even mindset shifts; there is always a cost involved in reclassifying assumptions-to-experiences. Some people will lead the way and some will lag along the way in this reclassification design. New systems will be created for the ‘new’ even while the old systems remain in place for the ‘laggards.’ And while we talk a lot about the limits, or unlimits, of people’s ability to re-educate themselves, maybe we should talk a bit more about what limits systems have. I say that because if laggards lag too far and builders build too far, the system gets split in an ugly tug of war in which no one wins and the system begins to fail in its most basic duties. Emotionally, and experientially, we begin to feel the repercussions of the fact that fundamental assumptions have shifted – and we haven’t.