Career

7 out of 10 have not tried ChatGPT: How to get started

Artificial intelligence is going to affect all knowledge workers, but many have never tried ChatGPT. Here’s an introduction to the chatbot, which can save you hours of work, but which also tends to "hallucinate" and make up facts.

Illustration: created with the program Midjourney, which generates images with artificial intelligence.

A large number of programs that use artificial intelligence have become publicly available in a short time. Several of them are free or cost the equivalent of a Netflix subscription.

The most talked about such program is the chatbot ChatGPT, which became freely available at the end of 2022, but a population survey conducted by Analyse Danmark for IDA shows that 6 out of 10 have not yet tried it.

In the survey, 36 percent of the respondents also answer that they do not expect chatbots to affect their work.

And that is a naïve perspective, according to Thomas Moeslund, professor of artificial intelligence at Aalborg University.

‘I believe that all office workplaces will be transformed by artificial intelligence, exactly as we saw when the internet became part of daily life and work’, he has told Ritzau.

He compares it to the way the internet has ended up changing the job market completely.

And in a new report, the consulting firm McKinsey concludes that 60-70 percent of all working hours could be automated, and that this will primarily affect "knowledge workers" such as IDA's members.

However, artificial intelligence does not have to herald mass unemployment, and many researchers point out that new needs and tasks will arise for those who know how to adapt to the technological changes.

ChatGPT is "autocomplete on steroids"

Since ChatGPT became publicly available, it has made its mark in several areas.

It has passed both medical and bar exams, it has written song lyrics and books and has become a regular helper for both software developers and high school students in a time crunch.

But basically, ChatGPT is a language model that can analyze and write texts. According to Erik David Johnson, who is Principal AI Specialist at the consulting company Delegate, it can be described as ‘autocomplete on steroids’ because it tries to predict the next word when it writes a sentence, much as the keyboard on your phone does.

However, ChatGPT is far more advanced as it is trained on huge amounts of text and has a more complex understanding of language.

‘It has vacuumed up almost all available text and analysed billions of statistical correlations in the languages it is trained in. Based on that, it builds sentences by calculating the most likely next word, one word at a time’.
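
As an illustration of that idea, here is a toy sketch in Python. The word probabilities are invented for the example; a real model like ChatGPT estimates billions of such correlations from its training text rather than reading them from a small hand-written table.

    import random

    # Invented probabilities, for illustration only. A real language model learns
    # billions of such statistical correlations from its training text.
    next_word_probs = {
        ("the", "cat"): {"sat": 0.6, "ran": 0.3, "meowed": 0.1},
        ("cat", "sat"): {"on": 0.8, "quietly": 0.2},
        ("sat", "on"): {"the": 0.9, "a": 0.1},
        ("on", "the"): {"mat": 0.7, "sofa": 0.3},
    }

    def generate(prompt_words, steps=4):
        words = list(prompt_words)
        for _ in range(steps):
            context = tuple(words[-2:])              # look at the two latest words
            options = next_word_probs.get(context)
            if not options:                          # no statistics for this context
                break
            # Pick the next word according to its probability, one word at a time.
            next_word = random.choices(list(options), weights=list(options.values()))[0]
            words.append(next_word)
        return " ".join(words)

    print(generate(["the", "cat"]))  # for example: "the cat sat on the mat"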

ChatGPT also has a memory, which means that it can take into account previous commands or questions and thus have a more fluid dialogue.
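
In practice, that memory is usually nothing more than the previous turns of the conversation being sent along with every new question. The sketch below shows this with OpenAI's official Python library; the model name and the API key read from the environment are assumptions made for the example.

    from openai import OpenAI  # OpenAI's official Python library

    client = OpenAI()  # assumes an API key in the OPENAI_API_KEY environment variable

    # The "memory" is simply the list of earlier turns, resent with every request.
    history = [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Suggest a name for our internal AI newsletter."},
    ]

    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    history.append({"role": "assistant", "content": reply.choices[0].message.content})

    # The follow-up can say "it" because the earlier turns are included in the request.
    history.append({"role": "user", "content": "Make it shorter and more playful."})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    print(reply.choices[0].message.content)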

Several have pointed out that ChatGPT seems almost frighteningly human in its responses, but according to Erik David Johnson, it will never be able to simulate human intelligence with its current method.

‘For humans, language is a reflection of an inner world we have. It is not for ChatGPT, because it actually just calculates what the next word should be based on previous input’, he explains.

According to Erik David Johnson, language models such as ChatGPT are based on ‘logical atomism’, a theory from the philosophy of language whose basic idea is that language can be divided into minimal units, the logical atoms, and that all complex sentences and meaning can be built up by combining them – in ChatGPT's case, by calculating a sentence word by word. But logical atomism has been criticized for overlooking that language is always used in a context and with a particular meaning that we humans decipher.

Put another way, there is a difference between being able to think and speak in a grammatical, rule-based language and mastering a fully human language.

That's why ChatGPT's answers can sometimes seem completely meaningless or out of context, even if they are written in well-formed and grammatically correct language, for example when it insists that the number 1000 is greater than 1602, or that ‘peanut butter and feathers taste good together, because they both have a nutty taste’.

Three tasks where ChatGPT is particularly useful

Rolf Ask Clausen is a chief consultant at IDA and an expert in new technologies. He points out that you can use ChatGPT to help you with many different things in your work, but that there are three task types in particular where it makes sense to begin experimenting with it: Text work, planning and programming.

Since ChatGPT is a language model, the first obvious use is to have it help you write texts.

‘You can use it to formulate emails or other messages, of which there can be many in everyday life. You can, for example, paste a message from your colleague into ChatGPT and ask it to come up with a suitable answer. Here you should also tell it which tone you want, how long the message should be, and what your most important points are’.

‘In addition, ChatGPT can write longer texts for reports, websites or the like. You may not be able to use ChatGPT's text directly, but for many it is a really good help to get started if they are hit by writer's block or don't know how to begin their text’, explains Rolf Ask Clausen.
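
A minimal sketch of how such an instruction can be put together as one prompt; the tone, length limit and key points below are placeholders to replace with your own.

    def email_reply_prompt(colleague_message, tone, max_words, key_points):
        """Build a prompt that asks ChatGPT to draft a reply with a given tone and length."""
        points = "\n".join(f"- {p}" for p in key_points)
        return (
            "Write a reply to the message below.\n"
            f"Tone: {tone}. Length: at most {max_words} words.\n"
            f"The reply must cover these points:\n{points}\n\n"
            f"Message from my colleague:\n{colleague_message}"
        )

    prompt = email_reply_prompt(
        colleague_message="Can you send me the status of the Q3 report by Friday?",
        tone="friendly but professional",
        max_words=120,
        key_points=["the draft is ready on Thursday", "the figures from finance are still missing"],
    )
    print(prompt)  # paste into ChatGPT, or send it via the API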

Secondly, ChatGPT can fulfil the role of colleague or sparring partner, he says.

‘If you get a new task or are given responsibility for a project, you can ask ChatGPT to make a plan for how to approach it. It is good both for brainstorming and for writing lists, where it divides a task into smaller steps so that it becomes more manageable. In this way, ChatGPT can almost function as a colleague if you are left alone with a task’.
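
A sketch of what such a planning prompt could look like; the project and the constraints are made up for the example.

    project = "move the department's reporting from spreadsheets to a shared dashboard"

    planning_prompt = (
        "Act as an experienced project manager.\n"
        f"I have been given this task on my own: {project}.\n"
        "Break it into 5-8 smaller steps in the order I should do them, "
        "give a rough time estimate for each step, and end with three "
        "questions I should ask my manager before I start."
    )

    print(planning_prompt)  # paste into ChatGPT and refine the answer with follow-up questions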

The third type of task that ChatGPT can help with is particularly relevant to software developers, and many of them have already discovered the possibilities.

‘ChatGPT can write code, find errors in code and comment code, precisely because it is good at language and rules. In the future, you simply cannot work as a programmer without having it as a tool, because it makes you far more efficient’, says Rolf Ask Clausen.
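
A sketch of how a developer might wrap a suspect function in a prompt and ask ChatGPT to review it; the small function below contains a deliberate bug (it skips the last element) purely as an example.

    # A small function with a deliberate bug: the loop skips the last element.
    buggy_code = '''
    def average(values):
        total = 0
        for i in range(len(values) - 1):
            total += values[i]
        return total / len(values)
    '''

    review_prompt = (
        "Review the Python function below. Point out any bugs, explain why they are "
        "bugs, suggest a corrected version, and add explanatory comments.\n\n"
        + buggy_code
    )

    print(review_prompt)  # send to ChatGPT in the web interface or via the API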

ChatGPT can hallucinate

Although ChatGPT can answer almost all questions, according to Rolf Ask Clausen, it is important that you maintain your critical sense.

‘It is neither an encyclopaedia nor an oracle’.

One of the biggest criticisms of ChatGPT is that it tends to hallucinate and fabricate information. This landed a US lawyer in trouble: he had used ChatGPT in a tort case and referred to a series of fictitious cases that the chatbot had invented.

‘You alone are responsible for double-checking the answers that ChatGPT gives you. You should not think of ChatGPT as an expert, but as an assistant who can do part of the legwork for you’, explains Rolf Ask Clausen.

You must program ChatGPT to create a response

If you don't have an understanding of how to use ChatGPT, it can quickly become a disappointing experience where the answers seem superficial or out of line with the question.

According to Principal AI Specialist Erik David Johnson, many people make the mistake of using ChatGPT as a search engine.

‘Unlike Google, ChatGPT does not find an answer when you ask it something, it creates it’, he explains.

Therefore, it is necessary to use ChatGPT actively and constantly refine your commands or questions – a process that has been dubbed prompt engineering.

At the Wharton School of the University of Pennsylvania in the US, Ethan Mollick, associate professor of innovation, has experimented with letting his students use ChatGPT to write assignments. He concluded that ChatGPT gives the best answers when the students instruct it continuously in an iterative process, and the worst results when they simply write a single sentence. According to Ethan Mollick, it is about understanding that ChatGPT is neither a search engine nor a human being, but a machine that you have to program, or prompt, with words to get a specific result.

Rolf Ask Clausen, who is an expert in new technologies at IDA, explains that it is important to give ChatGPT the necessary context for it to respond properly.

‘ChatGPT must be given a carefully defined role and context when you ask it for something. For example, you can write that it must assume the role of an experienced project manager who is to create a three-month project plan for you, or that it must write a text in an informal and humorous tone’.
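
A sketch of how that role and context can be given as a ‘system’ instruction when ChatGPT is used through OpenAI's API, followed by the kind of iterative refinement Ethan Mollick describes; the model name and the project details are assumptions made for the example.

    from openai import OpenAI

    client = OpenAI()  # assumes an API key in the OPENAI_API_KEY environment variable

    messages = [
        # The role and the context are stated up front rather than left for ChatGPT to guess.
        {"role": "system", "content": (
            "You are an experienced project manager. You write concise, practical "
            "plans for small teams in an informal, friendly tone."
        )},
        {"role": "user", "content": (
            "Create a three-month project plan for launching an internal knowledge base. "
            "The team is two developers and one editor, four hours per week each."
        )},
    ]

    first = client.chat.completions.create(model="gpt-4o", messages=messages)
    messages.append({"role": "assistant", "content": first.choices[0].message.content})

    # Prompting is iterative: refine the answer instead of settling for the first draft.
    messages.append({"role": "user", "content": "Keep only the milestones, one per two weeks."})
    refined = client.chat.completions.create(model="gpt-4o", messages=messages)
    print(refined.choices[0].message.content)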

‘You can actually also start by asking ChatGPT itself how you best use it to solve a task. It will then come back with a detailed guide’, says Rolf Ask Clausen.