===

“To paraphrase someone smarter than me, who still knows nothing, the philosophical task of our age is for each of us to decide what it means to be a successful human being.

I don’t know the answer to that, but I would like to find out.”

Ottmer <the futurist>

===

“Imagine a cheap little device that isn’t just smarter than humans — it can compute as much data as all human brains taken together.”

===

Discussions about technology seem to careen between oblivious no-fear (a lack of belief that something like social media or an algorithm could “affect how I think”) and conspiratorial fear (government control, globalists, ‘the great reset’). And that’s before we even discuss something like a brain chip, an invasive step into mental enhancement. But there is a lesson about the future found in that fear binary. That lesson is that something like a brain chip will make the world binary and, objectively speaking, even more unequal.

Let me explain.

The people with the least understanding of technology but the greatest fears will reject the most progressive innovations, i.e., something like the brain chip.

The people with the most understanding of technology and the healthiest skepticism of ‘technology for good’ will accept the most progressive innovations, i.e., something like the brain chip.

Simplistically, just to make a point: the quasi-ignorant social media users will become dumber and dumber while the quasi-careful innovation adopters will become smarter and smarter. The latter will accept the risks, viewing the potential benefits as outweighing them, while the former will see only risks with little benefit. I would be remiss if I didn’t point out that this means one group of people will progress while another group inevitably regresses, which begets a societal issue.

I begin with those thoughts to frame what I see as the benefits of a brain chip (which I do believe is inevitable).

*** note: Michio Kaku believes it is inevitable and Elon Musk even has a company, Neuralink, attempting to design one.

Which leads me to say we shouldn’t dismiss fears, but should still embrace the better hope.

I believe it was Peter Thiel who suggested something like “there’s all these scenarios where the stuff can spiral out of control. I’m more scared of the one where nothing happens.” This may sound odd, but I believe we have been lulled into the belief that technology is progressing too fast. I say that because most technology innovation hasn’t really been that innovative. Most of it has been superficial, but highly accessible, so that there is a general appearance of fast-paced, high innovation. But most technological innovations haven’t truly moved the needle on meaningful things (though they have made a lot of money through consumerism). What this means is that our general view of ‘better’ has been skewed toward a fairly low bar. That becomes important when something like a brain chip appears, because it is so far above the common bar it takes on all the appearances of science fiction, fantasy and, well, scary stuff – mostly dystopian. But here is where ‘better’ steps in: the higher the brain integration – verbal, visual, spatial and executive – the higher the intelligence. If a brain chip stimulates just that, well, that’s better. Better thinking by any one individual encourages better thinking within all people. And if enough people embrace the technology, it creates a structural societal thinking lift. It is this version of better that will encourage a shitload of people to seek to augment themselves in terms of intelligence and skills. And those people will inevitably separate themselves from those who elect not to pursue this opportunity.

That said. Let me describe what ‘better’ looks like in my mind. It is clear that the human brain is incapable of coping with the onrush of new technology in the mosh-pit formation available on a day-to-day basis. As trite as it may sound, while we now have more information available to us than ever, overstimulation makes us dumber on a daily basis, if not unhealthier (depression, anxiety, etc.). Many of us clearly understand our brains just can’t keep up with technology-driven cognitive load. Research has clearly documented the influence of information overload on attention, perception, memory, decision making, and even emotional regulation. We have even seen that engagement with technology interferes with the pursuit of other behaviors critical for maintaining a healthy social contract.

So, if an implanted chip is able to address many of our cognitive needs AND make us more effective thinkers, why wouldn’t we consider it? Why wouldn’t we consider augmenting our brain to better optimize it (not change it)? Maybe we should think of the brain chip as a kind of thinking companion. Try this thinking. Because this chip would be collecting real-time data on everything imaginable with regard to your brain physiology and sense of environment, it also optimizes your physical presence. You gain richer and richer datasets from which the chip can guide you, so you could function at your highest thinking and behavioral level. I imagine it actually could augment you to a new level. I would be remiss if I didn’t note I am discussing a closed-loop machine learning system. It is therefore secure, designed to augment only you, and personalized to your data, as opposed to a one-size-fits-all system. However, this means the chip is on all the time (as is your brain). You have to accept the fact your brain chip is listening all the time – to everything (including your memory).

What this means is that many things – memories, knowledge, faces, etc. – stored away on some dark, dusty shelf in your mind (meaning they have an imprint somewhere in your brain) can be activated by the chip. It takes away that nagging feeling you are forgetting something and brings it to the forefront at the right time. The chip activates a portion of your brain that says “hey, remember this/remember what happened/remember that person” and it activates images from the past, in relevant context, thereby heightening your level of attention in the present. The interesting thing about this particular idea is that the majority of us remember the things we like to remember and forget the things we would like to forget. That inherently biases your views and attitudes. The brain chip doesn’t permit this shortcut. It cuts in line in front of bias, surfacing even the things you wanted to forget. To be clear. The chip I am discussing means you remember even the things you really do not want to remember – yeah, even the horrible stuff and the stuff you hate. That said. What this means is you use what you already know better than in the past and increase the depth of your decisionmaking and insight into what you are thinking.
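
To make that closed-loop idea a bit more concrete, here is a minimal, purely illustrative sketch in Python. Every name in it (Memory, RecallAssistant, the sample entries) is hypothetical – this is not any real device or API – the point is simply that everything stays in one local store and recall is triggered by the current context rather than by how pleasant a memory is.

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    content: str
    tags: set[str]    # people, places, topics associated with the memory
    valence: float    # -1.0 (painful) .. +1.0 (pleasant)

@dataclass
class RecallAssistant:
    # A purely local store: nothing leaves the "chip" (the closed-loop assumption).
    store: list[Memory] = field(default_factory=list)

    def record(self, content: str, tags: set[str], valence: float) -> None:
        """Log an observation locally -- the 'always listening' part."""
        self.store.append(Memory(content, tags, valence))

    def surface(self, context: set[str]) -> list[Memory]:
        """Return every stored memory that overlaps the current context,
        ranked by relevance only. Unpleasant memories are NOT filtered out,
        which is the anti-bias behavior described above."""
        hits = [m for m in self.store if m.tags & context]
        return sorted(hits, key=lambda m: len(m.tags & context), reverse=True)

# Usage: meeting someone again surfaces both the good and the bad history.
chip = RecallAssistant()
chip.record("Great project kickoff with Dana", {"dana", "work"}, valence=0.8)
chip.record("Argument with Dana over a missed deadline", {"dana", "deadline"}, valence=-0.7)
for memory in chip.surface({"dana", "meeting"}):
    print(memory.content)
```

The design choice worth noticing is in surface(): relevance to the present moment, not emotional comfort, decides what gets brought back to the forefront.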

I imagine the next level would be a brain chip that could actually connect with the world wide web and supplement, or complement, what you already know to round out your knowledge. And it would do so contextually (which we already know from education deepens learning). That’s next-level thinking.

Which leads me to security and privacy.

Some people will never get over their fear of information being stolen, and those fears will only increase with a brain chip because it becomes even more intrusively personal. That said. The adopters recognize that, within an increasingly complex world, keeping identity safe and secure – from a personal identity standpoint as well as the identity interface to the things we own and have – means actually leaning in on technology. Insert a ‘yikes’ here. Yeah. Hear me out. While I have a couple of ideas on how to do this, I tend to believe an implanted chip is the best way forward <for a variety of reasons>. Every person could simply have a tiny chip implanted that permits a computer, or scanning system, to read a personalized code broadcast by the chip. And while that may sound vulnerable to hacking or copying, there are a variety of authentication systems that actually protect us. For example, both Google Authenticator and Blizzard’s official authenticator use the open-standard “TOTP” algorithm for authentication codes (although configured differently): Google uses 6-digit codes while Blizzard uses 8-digit codes. But the real idea I offer is that your personal identity algorithm, because it is implanted, can be tied to your biology, which, well, cannot be stolen.
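
For anyone curious what that looks like in practice, here is a small Python sketch of the TOTP scheme (RFC 6238) those authenticators use, built only on the standard library. The chip_secret value is a stand-in for illustration; in the scenario above it would be a device-bound secret held by the implanted chip, which is my assumption, not anything Google or Blizzard ship.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Generate a time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period                   # 30-second time step
    msg = struct.pack(">Q", counter)                       # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()     # HMAC-SHA1 per the standard
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Hypothetical: an implanted chip would hold its own device-bound secret;
# here we just hard-code a sample base32 secret for illustration.
chip_secret = "JBSWY3DPEHPK3PXP"
print(totp(chip_secret, digits=6))   # Google Authenticator-style 6-digit code
print(totp(chip_secret, digits=8))   # Blizzard-style 8-digit code
```

Same secret, same algorithm – the only difference between the 6-digit and 8-digit variants is the final truncation step.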

Which leads me to creating a good brain chip.

Algorithms really aren’t capable of thinking. They are simply yes, no; A, B; limiting, reductive, not expansive. Well. That is, before the brain chip. A brain chip augments our complex brain so we see in patterns, not binaries. That said. We should simultaneously mistrust brain chip technology and still recognize that it can augment the superiority of the human brain. It doesn’t circumvent our choice making; it actually increases our choice base, our consideration set, so that our responses are better informed and better formed. It could permit all of us to sift through stacks of information which, until now, we were simply cognitively unable to do – or do effectively. A good brain chip makes us more effective human beings. It makes us optimize the present and increases our sensemaking skills (which cascades into better choicemaking). I could even argue it just makes us better people in that we become more well informed and a bit less biased. And if that’s the benefit, maybe it is worth the risk. To be clear. I would gladly take a brain chip. Heck. I would take a health chip implant that could possibly optimize my health. The benefits of ‘optimizing’ my brain and body clearly outweigh the possible risks – at least in my mind. And I end with that thought because that is the decision everyone is going to inevitably have to make, and I worry that those decisions will only amplify societal divides. But. That is a piece for another day. Today we should be pondering whether you would put a technology chip in your brain.

Written by Bruce