Hugh McLeod

 

“… striking a balance between humans and machines … on one side biased and arbitrary human organizations and on the other, soulless technocracy based on unfeeling machines. There is a path.”

Mike Walsh

——————-

Technology, in and of itself, is an incredible tool for progress. Viewed in an ideal, or idealistic, way it is a democratized disseminator of knowledge and information and a connector of people. Technology in its ideal form, let’s call it a ‘static state,’ is an energy for positive progress. So, if it is not within the ideal state, we have to wonder why. The answer is not actually that difficult: humans, people. Humans ‘re-design’ technology to manipulate. I could argue that technology has simply amplified manipulation (someone is always attempting to manipulate you and your thinking, and very often in a deceptive way), but the amplification has become so sophisticated that it is difficult for people to unravel the manipulation (it takes an incredibly high sense of awareness). This isn’t to let the humans being manipulated completely off the hook, just to say that society, in general, has better things to do than question everything within the 30,000 decisions each of us must make, on average, every day. The truth is that humans have wired algorithms to reward our worst instincts and behaviors in, well, every domain.

** note: We would be ecstatic if people explored multiple sources online before forming their own opinions, but the reality is that the power of information distribution has fallen into fewer hands with more power to control the actual distribution of information. I would suggest this issue is only amplified by the increased use of handheld technology (smartphones/pads) to do research. I say this because smartphone research is increasingly likely to be “question/answer research” rather than “exploration research.” The former has its uses but is also less likely to actually offer the greater truths.

For example.

Why shouldn’t there be an algorithm that, instead of sneaking in the infamous “you may also like this” (which tends to drive people farther down the misguided-thinking rabbit hole), offers up a “to better understand this topic you should read this”? In other words, algorithms that fight the natural gravity of shitty thinking.
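To make the idea concrete, here is a minimal, purely hypothetical sketch of what such a re-ranking could look like: instead of scoring recommendations only by similarity to what someone already consumed, blend in a signal for how much an item broadens understanding of the topic. Every field name, weight, and article here is an illustrative assumption, not a real system.

```python
# Hypothetical sketch: rank recommendations by a blend of similarity
# ("you may also like") and a "broadens understanding" signal
# ("to better understand this topic"). All names/weights are assumptions.

def rank_for_understanding(candidates, diversity_weight=0.6):
    """Sort candidates best-first by a weighted blend of similarity
    to past behavior and how much each item widens perspective."""
    def score(item):
        similarity = item["similarity"]        # closeness to past consumption
        breadth = item["perspective_breadth"]  # how much it widens the view
        return (1 - diversity_weight) * similarity + diversity_weight * breadth
    return sorted(candidates, key=score, reverse=True)

articles = [
    {"title": "More of the same", "similarity": 0.9, "perspective_breadth": 0.1},
    {"title": "A counterpoint",   "similarity": 0.4, "perspective_breadth": 0.9},
    {"title": "A broad primer",   "similarity": 0.5, "perspective_breadth": 0.8},
]

ranked = rank_for_understanding(articles)
print([a["title"] for a in ranked])
```

With the diversity weight above 0.5, the counterpoint and the primer outrank the echo-chamber item even though it is the most "similar" one; the entire design question reduces to what the weight rewards.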

To be clear. This, at its core, is not about technology. It is about humans creating things for humans in order to create a better society (and, yes, I could argue that I can be a raging capitalist and still endorse this idea, albeit it would make me work a bit harder to fulfill my profit goals as a raging capitalist).

 

Look. I’m not scared of technology; I’m scared of humans. I love technology. It’s given me access to more knowledge and more knowledgeable people than I could have ever imagined.

But scared of Facebook? No. I’m scared of the people writing the algorithms & software, and I am scared of the people who are either not self-aware enough, or lack the critical thinking skills, to fight back against the humans writing the algorithms (I foresee this as an ongoing conflict).

 

“Knowledge, in principle at least, is infinitely expandable.”

Toffler 1991

The internet has enabled the quirky to unite globally. What may have been one or two people in a vicinity, loners in their particular focus in Life, their thinking, and their likes/dislikes, have become clubs and groups and a swarming school of fish swimming in their own direction, enjoying the fruits of the sea of information and things available online. The difficulty is that there are other humans crafting currents within that sea so that swarms congregate in certain areas and do certain things (and think certain things). Humans manipulate the sea so that the fish can be captured (by business) or driven to certain places (for business). The truth is that many of us either drown in the violent sea of information or get lost following currents of information.

It would be an understatement to say that technology has affected society. That said, I don’t believe it is all bad. I do not support the dystopian view of the effect of technology on people & society and am actually a proponent of Grady Booch’s point of view: “New tech spawns new anxieties, but we don’t need to be afraid of an all-powerful, unfeeling AI.”

I, similar to Grady, tend to focus on how artificial intelligence will enhance human life. This is not to suggest that technology has not contributed to economic inequality, social issues, and some aspects of division. But I do not believe we are at the end game with technology; rather, we are in the middle game.

 

The relationship between technology and society is not simple, nor complicated (it’s about amplification and access), but it is complex. It’s complex not because technology is difficult to understand, but rather because people are involved. Software is designed with theory in mind, and those theories go beyond simple economics or even politics; they directly impact the type of society, social change, and civilization we seek to have. These theories, embedded in things that influence/manipulate us, impact how we view, and do, work as well as how we live our lives.

Now. Having stated this is about people, I will suggest that morality, ethics, and empathy need technology. The destiny of all those things is dependent upon technology. If that is so, then I imagine I could say society’s fate resides in the hands of technology. That may seem backwards in that all of these things reside in humans, not technology. Well. They do, but technology will either flatten, amplify, or even extend the reach of all of them. I am certainly not suggesting that technology will guarantee morality or ethical behavior, but I would suggest improving technology increases the likelihood that morality and ethical behavior become more pervasive (expansive).

Morality and virtue are developed over time <via repeated decisions to choose what is right and to forego what is wrong>, which means technology, currently crafted to manipulate us, can also be crafted to engage our ethical senses. Theoretically, this means lots of small decisions being directed in the present toward a longer-term greater good for society.

** note: I believe it was Daniel Schmachtenberger who suggested ethical behavior was better in smaller tribes because it was more difficult to be unethical (transparency). This would suggest that for this idea, technology engaging ethical senses, to be successful, there would need to be some transparency so that people couldn’t ‘game the ethical decision’ to their own benefit rather than the collective good.

 

All that said.

Technology, in and of itself, is nothing. Without people, without people generating content, it is a passive tool regenerating itself to its own purposes. Yet once humans become involved, technology begins to amplify: amplify divides, fragments, groups, and tribes. It is within the fragmentation aspect that we begin to pause on the benefits of technology with regard to society. The fragmentation, the phrasing of ideas, ideologies, values, norms, and actual ideological commitments, begins to blur the greater truths associated with each. Fragments get emphasized to strengthen pieces of views, all the while blurring the larger issues.

In addition.

Technology has a nasty habit of increasing wealth inequality. Its impact is additive to lower-income gains and multiplicative to higher-income gains. As technology improves, “better” is not equally distributed. And just to return to morality and ethics for a moment: I would suggest that as technology improves (as improvement is judged in the present) there is a reverse distribution from that of wealth. What I mean by that is, as technology improves, low-income morality & ethics get better in a multiplicative way while high-income morality improves only additively. In other words, those with less wealth and less money own the higher moral ground while those with the money (and inevitably the power) reside on lower moral ground.

That juxtaposition of things creates stress to a societal structure.

I am not suggesting that technology should be the tool which corrects inequities, but it would be foolish of us to ignore the fact technology amplifies both good and bad and is a tool which can affect inequities.

It was McLuhan who said, “we shape our tools and thereafter our tools shape us.” Once again I point out that without people technology is basically nothing; it needs designing, data, content, input, behaviors, attitudes, actions and, yes, even decisions to curate.

I find it interesting that as we move into discussion on a world of technology, society and progress of civilization there are two loud voices in this discussion – one dead and one alive. Alvin Toffler and Daniel Schmachtenberger. I find them both equally interesting because they both explore the issues civilization is currently facing due to technology. Toffler discussed this from a speculative frame of reference while Schmachtenberger is discussing it from an experiential frame of reference.

** note: I wish they could have had a podcast together discussing this topic

It was Toffler, in 1981, who stated that technology will create a new civilization that “challenges all our old assumptions” by inspiring, or dictating, a painful process of social change. Toffler suggested civilization has its own atmospheres:

    1. Techno-sphere: an energy base, production system, and distribution system

    2. Socio-sphere: inter-related social institutions

    3. Info-sphere: channels of communication

    4. Power-sphere: relationships with the outside world (exploitative, symbiotic, militant, or pacific)

    5. Super-ideology: powerful cultural assumptions that structure its view of reality and justify its operation

 

As technology squeezes each of these layers, there is inevitably stress on the power, and distribution, of knowledge.

“the coming struggle for power will increasingly turn into a struggle over the distribution of and access to knowledge, not just consumption. Power seekers will use this struggle to further their power ambitions. While mass media could have been seen as sweeping, generalized manipulation of attitudes and beliefs, technology is using fragmentation to have those with power divide and conquer.”

 

** note: power, and power dynamics, and the power of conserving institutions, is a topic in and of itself.

Knowledge, therefore, is the fuel for change, whereas technology is its engine, creating the liminal space within which the social conflict occurs: people shocked by what they see as the destruction of everything they know (and actively attempting to consolidate the past fragments they see value in) versus people embracing positive change and empowerment through the fragmentation. I say this because it is not a conflict of technology, or enabled by technology, but rather of people, one within people’s minds and how they think. This means knowledge and technology are the two powerful ‘tools’ in facilitating changes in society.

This is where it gets tricky. Information should be intrinsically beneficial to society, and yet a fragmented flow of information and communication results in increased knowledge AND increased fragmentation.

Technology plays a role in where we go from here, but first, people need to think about the future they desire which, typically, rests on the most fundamental assumptions we make about reality. In the present competitive view of the world, we often think that the most capable are those who are the most competitive, and accordingly that competition creates and secures long-term viability. On a societal level, we have essentially adopted a zero-sum dynamic behind most of the things we do. This separates everything, and everyone, into a binary 0/1, yes/no, good/bad, this way/that way design world (software, algorithms, business individuals and even countries).

This current design belief has only made the divide between winners and losers constantly grow. Polarization leads to radicalization (of ideas, opinions, behavior) and, well, society does not benefit.

We do need technological advances, but our humanity and society must develop at the same speed as technology develops. As our technological ability to impact the world is radically scaling up, our human ethical choices as to how to implement that power must scale up accordingly. But the present reality is that our humanity is lagging our technology.

 

“The key dynamic is to move forward towards a more profound experience of ethics, and also aesthetics, in a way that increases our collective capacity to connect and solve the common problems on the existential level that we face.”

Tom Atlee

In the end.

As usual, the future will be defined by humans, not technology or any innovations.

Technology design needs to embrace less destructive manipulation and more ‘enlightenment.’

People need to design algorithms not based on brain exploitation to the detriment of people (and society).

People need to make content not based on brain chemistry exploitation to the detriment of our whole society.

People need to be educated to become more effective at critical thinking and embrace a more thoughtful worldview.

People. We can keep talking about technology as if it were a person, but it is absurd to do so. It is not a person and, in fact, the problem is not technology; it is people. The future of society, let alone everything I imagine, resides in the minds and actions of humans, and the quicker we accept that, the quicker we can begin taking steps to shape a future we like.

Written by Bruce