What has driven the development of AI application rules within SBE? Was there a specific moment when you thought, 'OK, now it has to happen'?
There wasn't necessarily a specific moment. It happened quite gradually, in parallel with the rise of an abundance of AI tools increasingly used by students. It became more and more clear that a shift was taking place. During a course taught in period 4 of 2023, analysis of reports made it evident that the use of AI was increasing exponentially. This was demonstrated by texts and references that appeared to be generated by AI but were not picked up by standard plagiarism detection software; teachers only found out by reading closely. We also conducted a survey among teachers in March 2023. It turned out that many teachers were not very aware of the developments around AI, and few had actually experimented with it. The combination of these factors led us, as programme directors, to consider it urgent to regulate and guide the use of AI.
What are the core principles of the AI application rules, and how are they applied concretely within SBE?
The core principle is that our measures encourage and ‘compel action’ from students and teachers. This was not about punishing, sanctioning, or taking a legal route, but about raising awareness of the possibilities and opportunities, as well as the dangers. That awareness is our priority: by actively involving students and teachers and prompting them to take action, we achieve a deeper understanding than if we were to simply send an email.
What do these actions entail exactly?
For teachers, for example, it is mandatory to include a section in the course manual. Within this section, they must choose from a predetermined ‘AI menu’. For each assignment, they select one of the following options from this menu: (1) none, (2) light, (3) medium, (4) heavy.
If teachers select option 1, AI is not allowed, and a protected environment for testing is also required. With option 2, AI may be used for support, allowing students to use it for tasks such as editing and information retrieval. Option 3 actively encourages the use of AI tools, while option 4 indicates that the use of AI is an integral part of the assignment. With options 3 and 4, AI prompts and output must be included in the final report.
Making this AI menu mandatory is performative in the sense that it requires action: it forces teachers to think about how to deal with AI. Even if they do not want to delve into it and simply select option (1), they still have to consider how to prevent the use and abuse of AI tools. In addition, we asked every teacher to evaluate their course, in terms of both the learning process and the assessment, in the light of AI. This allowed us to discuss the design of their course with the teachers.
What about students?
As for the students, we ask them to sign and submit an Own Work Declaration (OWD). In this declaration, they confirm that the final product is their own work and that they have completed the assignment in accordance with the guidelines in the menu. A student who does not sign and submit an OWD will not receive a grade for the course. Again, the core principle applies: the student is compelled to ‘take action’.