The blind alley of digital technologies

To maintain the essential balance, I suggest occasionally taking a sceptical look at the tech rapture in which we are increasingly swept up. Then we may grasp that investing unheard-of sums in the growth of AI and other digital technologies is not our most pressing need right now.

For some time I’ve been bothered by the Solow paradox. This is the entirely counterintuitive lack of connection, observed since the 1980s, between the growth of digital technologies and productivity. The economic data for the last 40 years show that the most advanced economies have not achieved significant productivity growth despite the spread of computers, the internet, and various digital tools. Indeed, productivity indicators during this period have been distinctly lower than in the periods preceding the digital revolution.

I’m not an economist, and I won’t try to resolve this intriguing economic puzzle. However, I would like to draw attention to other data that may at least partly explain the Solow paradox. These figures were recently presented in an overview by John Burn-Murdoch in the Financial Times.

It turns out that various international studies have found a decline in human cognitive and intellectual competencies in recent years. Scores on the PISA tests, the OECD’s long-running assessments of young people’s skills, show a steady decline in mathematics and in reading comprehension. In turn, the Monitoring the Future study in the United States has found growing problems with concentration and with absorbing new knowledge. These trends coincide with the dynamic growth of digital technologies, particularly social media.

This suggests that in our approach to technology we have probably fallen into the trap of too narrow a perspective. We have uncritically assumed that we must invest in technologies and expand them because they lead to greater productivity, which in turn will translate into an overall increase in prosperity. Limiting our thinking to a single, narrowly conceived notion of productivity has led us up a blind alley.

Digital technologies can undeniably accelerate many processes and have huge potential for increasing efficiency. With these technologies, within the same unit of time and with the same resources, we are capable of producing more. Hence the assumption that they lead to higher productivity. But this claim assumes that the heralded gain in efficiency occurs in a vacuum, that is, that it is neutral with respect to the wider setting, and in particular to the other factors affecting the production process. These technologies could indeed drive an increase in productivity, but only if those other factors remained unchanged. It turns out, however, that digital technologies are not neutral towards them. A growing body of data, along with the everyday experience of many of us, points to at least two key changes in this respect.

First, digital technologies have generated an unprecedented increase in content and information, while also giving rise to a range of new synergies, spaces for interaction, and so on. In short, they have greatly increased the complexity of the system. The potential gain achieved by accelerating processes has quickly been offset by the growing amount of content that has to be processed in the course of production. This is particularly evident in production processes carried out by humans.

The amount of time at a person’s disposal is by definition finite. Theoretically, by replacing a typewriter with a computer, a white-collar worker should be able to perform many more processes in the course of the day than before (e.g. to issue more administrative decisions). But in practice this is not so obvious. It turns out that both the quantity and the complexity of the content that a person must deal with in carrying out a given process have increased. The production process itself could thus become more complex. For example, due to the threat of cybercrime, a process that was once relatively simple now requires several additional security procedures.
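To make the offsetting effect concrete, here is a back-of-the-envelope sketch. The numbers are purely hypothetical, chosen only to illustrate the mechanism: even if a digital tool halves the time needed to handle each document, a modest increase in the number of documents and procedural steps per decision can wipe out the gain.

```python
# Toy illustration with hypothetical numbers: a per-item speed-up offset by
# growth in the amount of content and procedure attached to each task.

HOURS_PER_DAY = 8

def decisions_per_day(minutes_per_item: float, items_per_decision: int) -> float:
    """How many decisions fit into a working day, given the time spent per
    content item and the number of items each decision requires."""
    minutes_per_decision = minutes_per_item * items_per_decision
    return HOURS_PER_DAY * 60 / minutes_per_decision

# Typewriter era (hypothetical): 6 minutes per document, 10 documents per decision.
before = decisions_per_day(minutes_per_item=6, items_per_decision=10)   # 8.0

# Computer era (hypothetical): each document handled twice as fast, but each
# decision now involves 22 documents (more input, extra security procedures).
after = decisions_per_day(minutes_per_item=3, items_per_decision=22)    # ~7.3

print(f"Decisions per day, before: {before:.1f}")
print(f"Decisions per day, after:  {after:.1f}")
```

On these assumptions the worker with the faster tool actually completes slightly fewer decisions per day. The point is not the specific figures, but that the net effect depends on how the other factors move.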

A second factor we must take into account is the impact of certain applications of digital technologies on the intellectual and cognitive resources needed to achieve appropriate levels of efficiency. It is hardly controversial to point out that the ability to focus, to reason and to draw conclusions encourages efficient action. Meanwhile, some digital technologies clearly erode these abilities, as shown in the data summarised by the Financial Times. With the growth of the video game industry, social media and streaming services, our attention has been redirected elsewhere. The profits generated by the providers of these technologies are often driven by their success at diverting our attention from difficult and demanding tasks towards comfort and instant gratification.

Thus the bottom line is far from clear. Once we take into account the other effects of digital technologies, such as the overproduction and growing complexity of the content they generate and the erosion of certain cognitive and intellectual abilities, making processes more efficient may not lead to the hoped-for increase in productivity at all. Over the longer term, the effect may even be the opposite.

In this context, it is worth asking whether the currently dominant model for developing digital technologies is truly the best one possible. Isn’t there a risk that the huge sums invested in these technologies won’t achieve the desired goals, because we forget that our goals, including increased productivity, depend on a range of other factors? Apart from developing digital technologies, shouldn’t we devote just as much attention to identifying and fostering the other factors key to achieving the goals we set ourselves?

I imagine that everyone has heard of the hundreds of billions of dollars invested in the growth of AI systems. But have we heard of comparable strategic measures in education, in fostering mindfulness among young people, or in combating addiction to digital media?

These are some of the reasons I believe it is worth defending the European approach to the development of digital technologies, including AI. Seen from this angle, we should not necessarily be racing for first place in developing these technologies. That race may bring results opposite to what is intended, and sooner than we realise. Over the longer term, those betting on integral and holistic growth may achieve a much better result, even at the cost of a somewhat slower pace of development of the technology as such.

The European call for human-centric AI should also be understood in this context. We will reap the positive fruits of digital technologies (including increased productivity) only when these technologies contribute to the integral growth and reinforcement of human skills and capabilities. We certainly won’t achieve this by investing billions in the growth of digital tools while helplessly witnessing our own progressive intellectual decline.

Krzysztof Wojdyło

 
