ChatGPT, Google's Bard and artificial intelligence in general are new and rapidly growing work tools for IDA's members. It is nothing new that you leave digital traces and share details about yourself and your work when, for example, you search the web. But where a search is based on something you lack, using ChatGPT means handing over something you have (e.g. code or text). And you cannot withdraw material once you have provided it to ChatGPT.
On this page, IDA's legal advisers answer your questions about using AI technologies in your work. These are the questions we currently receive inquiries about, but as the technology develops, new questions will arise. That is why IDA keeps itself updated, so that we can continuously give you the best possible advice.
The short answer is: yes. Before you start using ChatGPT or other AI tools in your work, ask your manager for permission to use them for the tasks you have in mind.
It is your manager or the company's management that decides how tasks are to be solved and with which tools. In this connection, it is also very important that you examine and follow the company's overall guidelines regarding the use of AI tools such as ChatGPT.
Many companies have gradually established clear guidelines for the use of AI tools. But if this is not the case at your workplace, you can contact the company's DPO (Data Protection Officer) or HR/the legal department to ensure that you stay in line with the company's guidelines and wishes.
If you share text, code or other data with ChatGPT, it is important that you know what is permitted - both generally in relation to the legislation and in relation to the internal rules at your workplace.
If the internal guidelines are not specific enough regarding what you may share, ask management questions. It could be, for example: Can I use ChatGPT to make minutes from internal meetings? To optimise my code? To analyse and draw conclusions from collected data?
You must not share personal information about yourself or your colleagues with ChatGPT, as this information may be confidential and protected under data protection law (the GDPR).
Whether you are allowed to or not depends a lot on what kind of information it is: Is it general information, or is it specific customer information that is protected by data protection rules, confidentiality obligations and/or non-disclosure agreements?
It is important that you check and comply with your company's policies and guidelines for handling customer data, so that you ensure there is no breach of the personal data rules, confidentiality obligations or other clauses in your employment contract.
Yes, it may have consequences for your employment if you have shared information with ChatGPT that belongs to the company. This could be trade secrets, such as source code, or other internal company information.
The consequences depend very much on the situation. You could, for example, receive a warning, be dismissed or, in very serious cases, be summarily dismissed. What you may and may not share with ChatGPT will depend on company policies/internal guidelines, confidentiality agreements and any clauses in your contract, as well as the law.
If you have already shared something with ChatGPT or other AI technologies that you are not sure you should have, you should call the IDA Legal Department for advice. In addition, you must contact your immediate manager or the HR department as soon as possible.
It is important to be open about the situation and explain your concern, so that the company can take the necessary precautions if, for example, the sharing has caused a breach of the personal data rules or other obligations.