Inevitable – What is really inevitable about digital change?

“Inevitability in the digital realm is the result of the momentum of an ongoing technological shift.” Based on this thesis, Wired Magazine co-founder Kevin Kelly identifies unavoidable trends in his book “The Inevitable – Understanding the 12 Technological Forces That Will Shape Our Future”. Indeed, there seems to be little doubt about the inevitability of transformation. But this makes it all the more inevitable to discuss change and its possible expressions in an open and unbiased way – if we want to help shape the digital future.

Our world in the present progressive tense – where are we heading?
By naming 12 major trends, Kelly[1] describes how our world, driven primarily by digitization, is changing:

  1. Becoming: “We are constant newbies”,
  2. Cognifying: “ubiquitous artificial intelligence”,
  3. Flowing: “stocks to flows”,
  4. Screening: “all information will become liquid”,
  5. Accessing: “the availability of anything … immediately without owning”,
  6. Sharing: “everyone creates and it’s all shared”,
  7. Filtering: “an exponentially expanding universe requires filtering based on who we are”,
  8. Remixing: “whatever is new is a remix of what exists”,
  9. Interacting: “interact with our devices and with others”,
  10. Tracking: “We will track and be tracked everywhere and everywhen”,
  11. Questioning: “creating a new level of organization where questioning is the norm”,
  12. Beginning: “These forces will shape our future and we are only at the beginning”.

Without going into every detail of these trends (nor could we in this context), we could summarize them as transformation, dynamization, and potentiation on an unprecedented scale.

The potentiation of human possibilities

Taking a closer look at the trends Kelly describes, we can – aside from the “liquefaction” of products into processes – make out a potentiation of human possibilities. A good example is the trend of “Remixing” (8). Here, Kelly gets to the heart of the matter, stating:

“All new technologies derive from a combination of technologies … [which] creates an unlimited number of new technologies.”[2]

After all, the digital world itself is the result of remixing, drawing on knowledge, techniques, and technologies from all eras – numbers, letters, codes, cables, electricity, electronics, audio-visual media, and so on – thus enhancing the possibilities of humankind. Of course, issues such as intellectual property and patents arise in this context. However, according to Kelly, it would be shortsighted to merely call for regulation[3], since the traditional concept of property is based on the principles of agrarian societies, which no longer correspond to the realities of the digital world. The property of the future will consist not of material goods but of “intangible bits”, i.e. immaterial things. Service and product providers as well as manufacturers will derive value from qualities that are “better than free”: Immediacy, Embodiment, Integration, Accessibility, Authenticity, Discoverability, Personalization, Liquidity.[4] Though, as Kelly states, legislation is lagging far behind developments, it will eventually follow them. The trend of “Sharing” (6) is a good example of how little technological development can be delayed by legislative regulation. In 2001, the p2p sharing platform Napster was taken off the net after intervention by the music industry. Today, sharing has been commercialized and is legally offered by net giants like Apple and Amazon[5] – with consequences for the whole music industry.

Tracking – Problem or opportunity?

Describing the trend “Tracking” (10)[6], Kelly addresses a topic that critics of ubiquitous digitization consider especially problematic – think of buzzwords such as the “transparent” citizen, consumer, human being. Kelly, however, focuses on the opportunities of the surveillance of everyone by everyone (e.g. in the Internet of Things, IoT): better self-knowledge, optimized health care, and – most interestingly – an expansion of our sensory possibilities. “Coveillance” is considered much less a threat than a “natural state”.[7] Societies must be transparent, because anonymity would destroy any social system. In ancient times, as Kelly points out, clans were strangers to the idea of privacy as we understand it today.[8] But he goes even further, saying:

“The internet makes true anonymity more possible today than ever before. At the same time the internet makes true anonymity in physical life much harder.”[9]

To put it simply: anonymity will be increasing in the digital world while, at the same time, it will be decreasing in the physical world. As a result, tracking will provide enormous opportunities, but it will also come with risks. We are thus on a journey toward one of two “singularities”: either the enslaving dominance of Artificial Intelligence (AI), referred to as the “hard singularity”, or an AI which will, fortunately, not be smart enough to enslave us (the “soft singularity”)[10]. Although Kelly considers the soft variant more likely, there is still enough potential left for disturbing visions. But we have to face these developments if we want to be able to influence and steer them toward a “soft” singularity. Apart from the question of which “singularity” will prevail, Kelly uses the example of “Tracking” to shed light on the enormous potential for change that comes with digitization, deeply impacting both our lives and the societies we live in. Without doubt, his point of view is likely to provoke a great deal of objection, especially in regions and societies with a totalitarian heritage or present. Hence, it is all the more important to lead and maintain an open and unbiased discussion on the most critical topics and issues, always aiming to find viable solutions to present and future challenges. Only then will we be able to properly seize the opportunities of digitization offered, for example, by innovation areas like the IoT.

Mind the trap of self-fulfilling prophecy

Already the title “The Inevitable” gets Kelly’s point across: “We can or should attempt to prohibit some of the results or manifestations of the technological shift, but the technologies are not going away. Change is inevitable. We now appreciate that everything is mutable and undergoing change, even though much of this alteration is imperceptible.”[11]

Nonetheless, he recognizes that, although human beings are exposed to the universal forces of digital change, there is still scope for action with regard to the expressions of digitization: “But while culture can advance or retard expression, the underlying forces are universal.”[12] And elsewhere:

“We are morphing so fast that our ability to invent new things outpaces the rate we civilize them. These days, it takes us a decade after a technology appears to develop a social consensus on what it means and what etiquette we need to tame it.”[13]

This raises the question of how inevitable certain technological developments really are. Basically, few things in history have proven unavoidable in the long run. There is, and always has been, the possibility of refusal, withdrawal, or regulation. The latter may come late, but it will come all the same.[14] Even though many evangelists, and companies as well, challenge the option of regulation, we know that every movement provokes a counter-movement, resulting – in the majority of cases, at least – in mutual compromise.

The problem is, however, that we can only understand and recognize such results and processes in retrospect. Even a supposedly “fixed” status quo such as the nuclear phase-out can be changed at a later time if the use of nuclear energy becomes socially accepted again in a modified form. In this context, the term “inevitable” generates the uneasy feeling of being trapped and at the mercy of our desires and of peer pressure – the need to fit in. At this point, the term becomes a “self-fulfilling prophecy”.

Invitation to an ‘inevitable’ dialog
As apodictic as Kelly’s theses come across, and as much as we agree that (technological) developments are unavoidable for a period of time – at least as long as we want to participate in them – they should be understood less as a prediction of inevitability than as an invitation to a fruitful dialog on digital change. And this is Kelly’s special merit: he helps us understand present developments, identify them, and interpret them. By describing, for example, the many different forms of intelligence and pointing out that machine intelligence is very different from human intelligence, Kelly provides valuable insights into the huge potential of digitization. At the same time, it becomes clear that there are risks that could far exceed the possibilities and opportunities of “Artificial Intelligence” (AI).

If we want to continue to influence developments and keep control over them, we have to deal with them in a constructive manner. And this is what we want to do and will do. In retrospect, it has always been public discussion and the cooperation of business and society that enabled a “social consensus”, helping turn the benefits of technology into genuine progress.

Technological aberrations do not fundamentally disprove progress; they can be seized as an opportunity to make corrections. Ongoing digitization opens up whole new dynamics for us and, at the same time, demands entirely new ways of questioning things. Already today, we can see new dimensions of how people challenge and argue – a development Kelly outlines as a future trend (Questioning, 11)[15].

If digitization is inevitable, we must understand it as our inevitable task to ensure that new technologies are used as an extension of human possibilities – not as a means of limiting humanity. Technologies create opportunities and options. They will gain acceptance only if they support us in solving problems and help us meet the most pressing challenges we face today and tomorrow.

One of these big challenges is the question of how we deal with the ever-increasing amount of knowledge being generated every day, which makes us “permanent newbies”. Because we are becoming such permanent newbies, we will no longer be able to fully master things; we will, on the contrary, be subject to constant change. If we want to make use of this ever-increasing amount of knowledge, we will have to accept change and develop new skills to tackle the challenges ahead – which also includes the use of new technologies.

And this is where we experience a tremendous expansion of human possibilities. At the same time, this is the point where we need to think about resilience. In spite of all the euphoria, we should consider possible negative effects and how to prevent them. It is important not to bury our heads in the sand, but to follow the call to think ahead.

Take the “smart home”, for example: it is a good thing when smart home technologies help us protect our privacy and prevent damage from burglary. At the same time, the same technology can be used by burglars to gain easier access to our homes. In addition, the constant collection of smart-home data could have negative effects on privacy, even though such data are, by definition, collected to improve the quality of services and products and to develop new applications.

The example shows how important it is to think ahead and to make corrections where necessary. Only if we address present and future developments and trends openly can we prevent their concrete expressions from becoming inevitable.

Thinking ahead and leading an open and unbiased discourse secures our continued participation in change. This way, we remain acting and creative subjects rather than objects. On the other hand, both hostility towards technology and uncritical enthusiasm lead, at the end of the day, to inevitability. Risks are not collateral damage, but a consequence of refusing to think ahead, to treat customers with respect, and to understand problems in the context of change.

And this is where the challenge for innovators begins – and where the wheat is separated from the chaff, the winners from the losers.

[1] Kelly, The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, hardcover edition, June 7, 2016, p. 4.

[2] Kelly, pp. 193 ff.

[3] Regulation that does not balance the interests of business and society is often seen as an obstacle to innovation (cf. https://www.bearingpoint.com/de-de/unsere-expertise/insights/droht-der-digitalwirtschaft-eine-regulierungsbremse/).

[4] Kelly, pp. 207 ff. et passim.

[5] Cf. https://www.bearingpoint.com/de-de/unsere-expertise/insights/droht-der-digitalwirtschaft-eine-regulierungsbremse/

[6] Kelly, pp. 237 ff.

[7] For a more critical view on innovation cf. Hans de Zwart, “Ai Weiwei lebt in unserer Zukunft” (“Ai Weiwei is living in our future”), https://netzpolitik.org/2015/hans-de-zwart-ai-weiwei-lebt-in-unserer-zukunft/

[8] Kelly, pp. 261 ff.; on the development of modern privacy cf. P. Ariès, G. Duby, A History of Private Life, 5 volumes, 1993.

[9] Kelly, p. 262.

[10] Kelly, pp. 294 ff.

[11] Kelly, p. 5.

[12] Kelly, p. 4.

[13] Kelly, p. 5.

[14] Cf. https://www.bearingpoint.com/de-de/unsere-expertise/insights/droht-der-digitalwirtschaft-eine-regulierungsbremse/

[15] Kelly, pp. 269 ff.