
Everyone talks about AI, but is it not just more digitalisation?

Text: Björn Lindahl

What is the difference between artificial intelligence and normal digitalisation? What will be the consequences for the labour market? Will it lead to more jobs or fewer?

These are big questions, but they are highly relevant. During the week-long political gathering in Arendal, Norway, more than 100 events centred on artificial intelligence.

Yet there did not seem to be a common understanding of where Norway stands when it comes to AI. There was a broad range of opinions:

“Can Norway become a digital developing country?” was the title of one event.
“How can Norway take AI into the future?” was the title of another.

Many of the events had titles that reflected concerns about what AI will lead to. But there were also those who were (somewhat provocatively) impatient. 

“It is important to protect personal integrity, but how will Norway become a leading AI nation if all our energy goes into regulations and putting the brakes on developments?” ran the headline for one of the events.

The Nordic Labour Journal followed a few of the events online, and it did not take long before the question I asked at the beginning of this piece was raised:

“What is the difference between ordinary digitalisation and AI?” 

At the event “Are Norwegian businesses ready to use AI?”, the head of cloud engineering at the IT company Tieto, Kent Inge Fagerland Simonsen, gave the following answer:

“Artificial intelligence is an umbrella term for a number of computer programming models that appear human to people and require large amounts of data for training, rather than being specifically designed algorithms.”

In other words, AI is not about new technology – like when steam power replaced muscle power and sailboats during the first industrial revolution, or when digital tools replaced analogue ones such as typewriters and film cameras.

Artificial intelligence is about building computer programs in new ways – and about enormous amounts of computing power, faster ways of moving information and staggering amounts of money.

Microsoft has spent more than nine billion euro developing its Copilot AI program. It comes in different variations – for ordinary office work, coding and much more.

ChatGPT started the debate

What made everyone talk about AI, however, was ChatGPT. Launched on 30 November 2022, the virtual assistant could generate text from user prompts; image generation was added later.

Generative AI refers to programs like ChatGPT that learn patterns and structures from their training data and then generate new data with similar characteristics.

We are still a long way from computers that can think for themselves – what is known as Artificial General Intelligence (AGI), typically defined as the level of intelligence a machine would need to understand and learn all the intellectual tasks a human can perform.

After ChatGPT reached 100 million users in just two months (TikTok took nine months by comparison), warning voices were raised in March 2023: 1,000 researchers and well-known business people, including Elon Musk, called for a six-month moratorium on further development of AI until the risks could be better understood.

A new kind of debate

That was more than a year ago, and now the debate is more about how to find enough skilled people and whether there will be a shortage of the type of computer chips needed for AI. 

There are also questions about how much energy AI needs and about inherent weaknesses in the programs themselves. One issue is the tendency of generative AI models to “hallucinate”.

This means that AI models give incorrect answers, or different answers to the same question. One characteristic of the new digital assistants is that they are not particularly humble. Answers are given without reservation. 

Considering the millions of people using ChatGPT, Copilot and other AI programs, there are still not that many examples of AI being a direct danger to humanity or causing a dramatic impact on unemployment rates.

This is not to say there are no challenges. Yet people’s ability to adapt is considerable. What once seemed revolutionary quickly becomes everyday.

What is Copilot?

Copilot for Microsoft 365 is one of the programs that most ordinary workers will come across. 

These are some of the tasks Copilot can perform:

  • Summarise information from many different documents.
  • Transcribe meetings.
  • Help create graphs and presentations.

Microsoft’s own marketing points out:

“It is not like you can just kick back, press a button and let the co-pilot do your job for you. As the name suggests, this is not an autopilot, but an assistant who supports you from the sidelines. Copilot works alongside you.”

Examples of tasks that Copilot should be able to do, according to Microsoft: 

  • “What is the most important thing to have happened on my team during the holidays”. 
  • “Summarise the long email thread that I have just been copied into”. 
  • “Gather all the documents from the previous year where this particular project is mentioned”.
Photo: NTNU
Bodil Åberg Mokkelbost, Heine Skipenes, Hanne Jensen Moe and Silje Reiten Blichfeldt have studied how Copilot has worked at NTNU.

A group of researchers at the Norwegian University of Science and Technology (NTNU) have looked at what has changed since the university began using Copilot for Microsoft 365. The report has eight main findings, which can be summarised like this:

Copilot…

  1. …is excellent when you already know what you want it to help you with.
  2. …can influence the exercise of authority.
  3. …processes enormous amounts of personal information in new and uncontrolled ways.
  4. …is in constant development.
  5. …is still in its infancy…
  6. …but already influences the entire organisation.
  7. …can be used to surveil and measure efforts and behaviour.
  8. …sometimes works really well.

The researchers write:

“Copilot is good at extracting the essence from large files and presenting it in a new, more focused document. This is the kind of task that could easily take several days to complete, but Copilot does the job in just a few minutes. This allows you to get started on tasks more quickly because you get immediate assistance when you need it.

“These kinds of everyday time-savings are why AI is expected to significantly boost office productivity.”

Norwegian employers’ organisations have asked the analysts at Economics Norway to assess the economic benefits AI could bring. The analysts estimate that greater use of generative AI could increase value creation in Norway by 2,000 billion Norwegian kroner (€171bn) by 2040.

But AI can be used nearly everywhere – in surveying processes, autonomous vehicles, health diagnostics and more.

The report concludes that other advanced digital technologies and types of AI can increase value creation in Norway by 3,600 billion Norwegian kroner (€307bn) in that same period. That means the total increased value creation could reach 5,600 billion kroner (€479bn) in the next 15 years. 

There are also risks

The new technology also presents risks, albeit on a more manageable level. The NTNU researchers describe some of them:

Copilot can give you incorrect answers or false conclusions. This is particularly true when it does not have enough information to work with, but it can also happen when it does have access to good information. It is important to be aware of this and have mechanisms in place to identify and correct such errors.

Large organisations handling the personal information of many people often save documents temporarily or share them internally before they are properly archived. Copilot will then have access to such documents and can use them in different settings or for purposes other than those that were originally intended. 

What happens to personal integrity? And how is the relationship with the employer affected?

“The employer can assess employees' performance based on information from files, documents, Teams chats, email correspondence, Teams meetings, emotional expressions (emojis), transcriptions from meetings with automatic recording, etc. Employees have no way of knowing if this is happening.”

Since Copilot is under continuous development, it is a demanding tool: a company must actively decline any expansion or new tools being added. This is not something that concerns only the IT department. Copilot is also about organisational development – and all organisations should have an exit strategy:

“It is important to have an exit strategy so that the organisation retains control over its own technological future. There should be plans in place for how to migrate data, functions and services to another supplier if necessary.

“Such a strategy can help reduce the risk of downtime or data loss. It also strengthens the organisation’s bargaining position with the supplier since it is not locked into a single system.”
