Communicating with students about acceptable AI use
The aim of this page is to provide you with annotated KI examples of how the bounds of acceptable AI use can be communicated to students. These examples should not be treated as statements that can be 'cut and pasted' into your own documents, but rather as sources of inspiration when you craft your own.
Notes about these pages
The pages in this AI and education hub provide advice and suggestions and should not be construed as guidelines or rules. Keep in mind that the information on the page is general and not adapted to any specific area, programme, or course. It is also important to know that AI technology is developing rapidly and continuously. The pages will therefore be updated regularly.
These pages have been created by the Unit for Teaching and Learning (UoL) within the framework of the project "Use of generative AI in teaching and examination" (autumn 2023), on behalf of the Committee for Higher Education (KU).
The last significant revision was done in December 2025.
Students often say they are unclear about what counts as acceptable AI use versus cheating. This is a genuine communication challenge, as both academic writing standards and AI technology are evolving rapidly. Before you read this page, we therefore recommend that you read the university library's superb guide 'AI for students', which should give you a grounding in the student perspective.
This page provides examples of how to clearly communicate acceptable AI use boundaries. You'll find:
- Local examples: Documents and statements from within KI, adapted to our local context
- External examples: Documents, approaches and statements produced by, for example, other universities, which we think are worth highlighting
A practical approach:
It's useful to create broad guidelines at the programme or course level covering:
- How to declare AI use
- Critical perspectives on AI tools
However, you'll likely need to add specific guidelines for individual courses or assessments. This allows you to:
- Remind students of the rules
- Tailor guidelines to fit the specific assignment context, as the appropriateness of particular tools or AI use cases will vary depending on the pedagogical model being used, the specifics of the learning goals, and so on.
Local examples:
Example: Programme guidelines for the Joint Master's Programme in Health Informatics
This statement has been crafted as an AI usage statement for the whole of the Joint Master's Programme in Health Informatics.
Guided by the information on https://www.ucl.ac.uk/teaching-learning/generative-ai/using-ai-tools-assessment (accessed 5 September 2023), the students of the programme are asked to take the following concerns into account before writing assignments.
Before using generative AI, you should ensure that:
- You know whether or not it is permitted for your assignment/research in this particular course. Please see Canvas or ask your Course Director for this information.
- You understand the limitations and risks of using generative AI.
- Your assignment/research remains your work. If you are suspected of submitting a document which you have not written yourself (i.e., of having used generative AI as a ghostwriter), it will be regarded as cheating and handled as a disciplinary case.
Generative AI can be a useful starting point to gather background information on a topic, but be aware that:
- Generative AI produces information that may be inaccurate, biased, or outdated.
- Generative AI is not an original source of information: it reproduces information from unidentified sources.
- Generative AI may fabricate quotations and citations.
- You should always refer to original and credible sources of information.
If you do choose to use generative AI tools, you must always:
- Critically evaluate any output it produces.
- Carefully check any quotations or citations it creates.
- Correctly document your use of the tools so that it can be appropriately acknowledged.
How to acknowledge the use of AI systems in your academic work
The use of generative AI must be acknowledged in an ‘Acknowledgements’ section of any piece of academic work where it has been used as a functional tool to assist in the process of creating that work. This acknowledgement should be written on a separate page and accompany the written work, either as the first page after the front page or, for smaller assignments, as the last page before the references.
You need to include the following text in the acknowledgement:
Acknowledgement of the use of generative AI
During the preparation of this work, the author(s) used [NAME TOOL/SERVICE and VERSION, PUBLISHER and URL] to [REASON]. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication.
Version 1 - 230906
EXAMPLE
During the preparation of this work, the author(s) used ChatGPT 3.5 (OpenAI, https://chat.openai.com/) in order to summarise my initial notes and to proofread my final draft. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication.
Example: Programme guidelines for the Study Programmes in Biomedicine
Version 1, revision 2 – Approved by the programme committee on 16 October 2025
These guidelines cover KI courses in the following programmes:
- Bachelor’s Programme in Biomedicine
- Master’s Programme in Biomedicine
- Master’s Programme in Biostatistics and Data Science
- Master’s Programme in Molecular Techniques in Life Science
Students of the Study Programmes in Biomedicine (as listed above) are expected to read and understand the KI “AI for students” pages (https://kib.ki.se/en/ai-students). In addition, students must take the following into account if using generative AI for studies and/or assignments for KI courses. Ask your course director if you are uncertain.
Before using generative AI, you must ensure that:
- You know whether the use of generative AI is permitted for your assignment/research in the course concerned. See Canvas or ask your Course Director for this information.
- You understand the limitations and risks of using generative AI.
- Your work and submitted assignments are based on your own ideas, unless otherwise indicated (e.g., via referencing). Inappropriately taking credit for ideas or work that have been generated using AI is considered cheating and action will be taken in accordance with KI disciplinary procedures.
When using generative AI, you should be aware that:
- Generative AI produces information that may be inaccurate, biased, or outdated.
- Generative AI is not an original source of information: it reproduces information from unidentified sources.
- Generative AI may fabricate quotations and citations.
- You should always, as far as possible, refer to original and credible sources of information.
If you do choose to use generative AI tools, you must always:
- Critically evaluate any output it produces.
- Carefully check any quotations or citations it creates.
- Correctly document your use of the tools so that it can be appropriately acknowledged.
How to acknowledge the use of AI systems in your academic work
The use of generative AI must be acknowledged in an ‘Acknowledgements’ section of any piece of academic work in which it has been used as a functional tool to assist in the process of creating academic work. You must include the following text in the acknowledgement:
Acknowledgement of the use of generative AI
During the preparation of this work, the author(s) used [NAME TOOL/SERVICE and VERSION, PUBLISHER and URL] to [REASON]. After using this tool/service, the author(s) reviewed and edited the content and take(s) full responsibility for the content of the work.
EXAMPLE:
During the preparation of this work, I used GPT-5 Auto (OpenAI, https://chat.openai.com/) to summarise my initial notes and to improve the spelling, grammar, and language of my final draft, including the restructuring of sentences and paragraphs without substantially changing the underlying meaning. After using this tool/service, I reviewed and edited the content and I take full responsibility for the content of the work.
What tools are considered generative AI?
Generally, you do not need to disclose the use of tools that automate time-consuming tasks where the end result remains essentially the same. For example, you may use reference management systems such as EndNote, Zotero, or Mendeley to present your sources in the specific format required. Similarly, you may use word processing programs that help you with the spelling, grammar, level of style, and concision of your text. These suggestions are based on grammatical rules and stylistic principles, and such tools will not rewrite your text. Examples of such programs are the basic grammar functions in Word, Grammarly, and InstaText. However, the use of the generative AI features of such programs must be acknowledged. If you only use the basic functions of such programs as described above, you remain responsible for the output and must check it carefully.
In the absence of other information, such as instructions from the Course Director or Examiner of a specific course, students are allowed to employ AI tools to support their learning and improve communication and writing skills. However, it is mandatory for students to be transparent and describe in detail how they used AI tools.
Information security
(adapted from https://staff.ki.se/education-support/teaching-and-learning/generative-ai-and-teaching-advice-for-educators)
The AI learns from the data fed to it, so there is a risk that your data will end up in the hands of companies that are free to do as they please with the information. You should therefore not submit protected material, such as patient data, personal data, or research data, when you use generative AI. It is also not appropriate to submit others’ work (including group reports) to such AI services without explicit consent from everyone involved.
External examples from Sweden
The 'Perkins Matrix', 'Traffic lights', Högskolan Väst, KTH
The model that has generally become referred to as the 'Perkins Matrix' or 'Traffic Light' model is now widely used to define appropriate use of AI tools at the assignment or course level. The linked website includes the scale itself, research papers on its use, and general commentary.
Building on this, Högskolan Väst have translated the scale into Swedish. KTH have used a broadly similar approach, outlined on this page, though their version includes four stages instead of five.
Jönköping University
Jönköping University has taken a rather different approach, creating a selection of statements about different AI use cases which can be carefully stitched together to create a statement that is appropriate for a specific assignment or course.
You can find these statements here.
Lund
Lund have produced a template for their teachers, which appears (at the time of writing) to be available only in Swedish; see step 4, 'Skapa en policy' ('Create a policy'), on this page. What is notable about the Lund document is that it gives guidance not only on the levels of AI use but also on the scope of the document; more specifically, their formulation covers learning activities as well as assessment.
External examples from abroad
University College London
Similarly, UCL in the UK (linked here), which has been mentioned as a source of inspiration for some of the KI examples above, uses a 'traffic light' approach, this time with three levels. One notable difference is that UCL allow for the possibility of an exception from their strictest category for students with special educational requirements, though such exceptions need to be decided by the examiner.
Note also that UCL has a specific page on the subject of communicating with students about AI.
