Education

The first step in reviewing content is to look at the education students receive around AI and academic offences, and what that looks like in practice. Education forms one layer of the ‘Swiss cheese’ strategy described by Rundle et al. (2020), which highlights that students and staff need to be educated as part of a wider environment that helps them understand what acceptable use of AI is and the potential challenges that can arise in relation to academic offences.

Rundle, K., Curtis, G. and Clare, J. (2020) ‘Why students choose not to cheat’, in T. Bretag (ed.) A Research Agenda for Academic Integrity.

Test and check

To check for AI vulnerabilities, it can be useful to ask an AI tool to complete your assessment. Try several different prompts to see what is produced. This is only an indicator, and you may need to try multiple approaches before potential vulnerabilities become visible. The results can then inform small adaptations to the questions or approaches you intend to use.
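For staff who are comfortable with scripting, the same check can be run programmatically so that several prompt variants are tried in one pass. The sketch below is a minimal illustration only, assuming the openai Python package (v1.x) and an API key set in the environment; the model name, sample question, and prompt wordings are placeholders rather than recommendations.

```python
# Minimal sketch: running one assessment question through several
# prompt variants to see what an AI produces. Assumes the `openai`
# package (v1.x) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

question = "Explain the main causes of coastal erosion."  # your assessment item

# Different phrasings can expose different vulnerabilities.
prompt_variants = [
    question,
    f"Answer this exam question in about 300 words:\n{question}",
    f"You are a second-year undergraduate sitting an exam. Answer:\n{question}",
]

for i, prompt in enumerate(prompt_variants, start=1):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- Variant {i} ---")
    print(response.choices[0].message.content)
```

Reviewing the outputs side by side gives a quick, informal sense of which questions an AI answers convincingly and which it struggles with.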

Question-based tests

When looking at your assessments, especially those focused on examination-based questions, it can be useful to include multi-step problems in the range of questions available to students. This is because AI tools work on pattern recognition and can struggle to link back to previous questions. Other elements, such as varying data sets and adding distractors, can also help; a simple way of varying data sets is sketched below. For more information, please visit our question-based testing and AI page (web).
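As an illustration of varying data sets, a short script can give each student a different set of input values, so a single memorised or AI-generated answer does not transfer between papers. This is a minimal sketch under stated assumptions: the question wording, value ranges, and the idea of seeding from a student ID are illustrative choices, not part of any University system.

```python
# Minimal sketch: generating per-student variants of a numeric question
# so each paper uses different data. Wording and ranges are illustrative.
import random

def make_variant(student_id: str) -> dict:
    # Seeding from the student ID means each student reliably
    # receives the same variant if the paper is regenerated.
    rng = random.Random(student_id)
    principal = rng.randrange(1000, 5000, 100)   # e.g. a sum between £1,000 and £4,900
    rate = rng.choice([2.5, 3.0, 3.5, 4.0])      # annual interest rate (%)
    years = rng.randint(3, 8)
    answer = principal * (1 + rate / 100) ** years  # compound interest
    return {
        "question": (
            f"£{principal} is invested at {rate}% compound interest "
            f"for {years} years. What is the final value?"
        ),
        "answer": round(answer, 2),
    }

for sid in ["A123", "B456", "C789"]:
    v = make_variant(sid)
    print(sid, "|", v["question"], "->", v["answer"])
```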

Reflective/Personal accounts

The University of Lincoln has noted in earlier guidance (AI in assessment | web) that adding personal reflective accounts to assessments can be useful, as they link to real experience and are more personalised. This personalised approach is harder for AI tools to replicate because it is based on students’ lived experiences. When linked directly into the assessment, it can provide a more robust output.

Varying ways of presenting information

Varying the ways in which students access the information that informs their answers can help make assessments more robust. For example, a two-part question might ask students to build on information provided in the first part. This approach is usually seen in question-based tests (TCAs, MCQs) and makes the task more difficult for AI tools to analyse. If this approach is taken, accessibility requirements still need to be considered to ensure students are not disadvantaged.

Optimise a mix of lower-order and higher-order thinking skills

Any test that focuses on recall of information is more vulnerable to AI, since factual information is available from a variety of sources. It is instead recommended that you take the ‘flip the question’ approach (web), using Bloom’s taxonomy (web) (Oregon State University, 2023), and focus on higher-order thinking skills. These are more difficult for AI to interpret.

Oregon State University (2023) Advancing Meaningful Learning in the Age of AI. Oregon: Oregon State University. Available at: https://ecampus.oregonstate.edu/faculty/artificial-intelligence-tools/meaningful-learning/