The Fourth Industrial Revolution and the Struggle for a Preferred Future

A nascent concept out of the United Kingdom captures the promise and pitfalls of a technology-laden future. As its anticipated arrival date of 2030 comes closer, feet grow colder. Is the story the same here?

Editor’s Note: We’ve been reading about the Fourth Industrial Revolution, a formulation introduced into tech-socioeconomic literature by the World Economic Forum in 2015. It captured the public imagination and became the subject of considerable debate in the UK in the intervening years, while failing to gain much traction here in North America. But the underlying concepts, conversations and controversy should not be lost in translation. As Economics Professor Jamie Morgan writes in the essay that follows from The Conversation, the Fourth Industrial Revolution sits at the center of a tension between the technological transformation of society and the wide-scale destruction of jobs. In the UK context, Morgan contends, government has come to rest on now-familiar policy positions: “competitive threats” (if we don’t do it, somebody else will) and “employability” (with its focus on personal responsibility), all while giving short shrift to the threat technology poses in late-stage capitalism — the undermining of wage earning, profit making and tax paying. Language use notwithstanding, how different is the experience here? – Paul W. Taylor, Editor




Cast your mind back a decade or so and consider how the future looked then. A public horizon of Obama-imbued “yes we can” and a high tide of hope and tolerance expressed in the London Olympics provides one narrative theme; underlying austerity-induced pressure another. Neither speaks directly to our current world of divisive partisan politics, toxic social media use, competing facts and readily believed fictions.

This should be instructive. The future is made, not discovered, and yet we are constantly confounded by the future as it becomes the present. What we believe, say, do, organise and vote for matter, but the world they matter to constantly eludes our grasp. We often stumble into futures we would rather avoid. Our ecological and climatological future represents one such horizon and whether and how we will work, another.

Organisations are also constantly trying to own the future by mapping out what it will mean for us. The “fourth industrial revolution” is the latest version of this. It is commonly defined as a combination of new technologies, including artificial intelligence (AI), machine learning, natural language processing, robotics, sensors, cloud computing, nanotechnology, 3D printing and the internet of things. According to proponents of the fourth industrial revolution, these technologies are set to transform the societies we live in and the economies we work in. And apparently, this is likely to be well underway by 2030.

It’s important to grasp, though, that the fourth industrial revolution is just a concept, an attempt to capture the meaning and significance of what seems to be occurring. The idea has incited both anxiety-inducing headlines about threats to employment and a general theme of positivity about the benefits of technology.

How many jobs will be affected? Kate.sade/Unsplash, FAL

A shiny future

The main proponents of the idea of a fourth industrial revolution are think tanks and consultancies working with modellers, economists and tech experts (and, of course, technology companies themselves). This work provides the themes, insights and much of the analysis of data that informs current government policy in the form of industrial strategy.

At the heart of this is the World Economic Forum’s work, spearheaded by its Executive Chairman Klaus Schwab, and that of the McKinsey Global Institute. The focus of both is weighted towards expressing the benefits of imminent transformations if we invest quickly and invest heavily.

For example, imagine a world where your toilet bowl tells your fridge that your cholesterol is high. Your fridge, in turn, both adjusts your order for dairy products that week (delivered by automated vehicle or drone from a grocery warehouse) and sends an alert to the healthcare AI whose database monitors your cardiovascular system. This AI, in turn, liaises with your home hub chatbot facility (which rebukes you and suggests you cut down on fats and make more use of your home gym subscription) and, if deemed necessary, sets up a home visit or virtual reality appointment with your local nurse or doctor.

According to the fourth industrial revolution literature, this, like many other possibilities, is science fiction on the cusp of being science fact. It is a commercialised future, a cradle to grave system. A system that, apparently, may help us survive our profligate past and present since the fourth industrial revolution also promises a sustainable future, where a connected set of technologies creates the possibility of controlled energy and resource use, minimal waste creation and maximal recycling.

But these think tanks and consultancies are hardly going to be held directly responsible for the future they help to produce. They are not sinister organisations, but nor are they neutral. The “fourth industrial revolution” is not simply an opportunity. It matters what kind of opportunity it is for whom, and under what terms. And this is discussed much more rarely.

A future for whom?

The emphasis on benefits and the focus on the need for investment subtly distracts from the core issue of who will own the basic infrastructure of our futures. Large corporations aim to control intellectual property for technologies that will influence every aspect of life.

At the same time, those writing about the fourth industrial revolution recognise that there might be what they call “technological unemployment”. Current claims about the likely rate of job displacement are mixed. Some research claims that between 30% and 50% of current forms of employment could disappear; other research suggests around 10% is more likely.

But the implicit message conveyed by corporations and consultancies, despite the fact that this will affect most sectors of society, is that “the future is coming and you’d better get used to it”. And government messaging and policy has tended to absorb this point of view. For government, the opportunities have been translated into a language of competitive threats: “If we do not do these things, others will.” This subtly focuses attention on inevitable economic consequences without providing scope to consider the broader social ramifications that might need to be managed.

In the UK, for example, there is, as yet, no broad government initiative for public education, consultation and deliberation regarding a subject that may involve profound changes to our societies. Only the House of Lords Select Committee on Artificial Intelligence has flagged this. The focus otherwise has been on “employability”. And the main emphasis has tended to be on individual responsibility. This assumes there will be jobs we can do if we retrain, enhance our human capital, compete with robot capital, and get used to collaborating with technologies.

And yet fourth industrial revolution technologies could put the basic functional relations of a capitalist economy at risk. Waged labour is what allows consumption, which in turn becomes profit for companies, which in turn maintains companies, wage labour, and the capacity to contribute taxes. If adoption of new technologies is rapid and pervasive, then the displacement of human workers may overwhelm the capacity of economies to provide alternative forms of work.

This is one extreme possibility, but it is one that current government policy is doing little to confront. At the moment, in the UK, only trade unions and some fringes of the Labour Party are thinking about the scope inherent in new technology for different kinds of societies that might liberate us from work. This must change.

This article is republished from The Conversation under a Creative Commons license. Read the original article.