
Discussion forums: the key to AI-proof assessment?

Providing a supportive learning culture for students will make them less likely to cheat – and discussion forums, with a few tweaks, may be the way to do it

Edward Palmer
6 Aug 2025

Created in partnership with The University of Adelaide


Like many educators, I have revisited assessments in my courses multiple times over the past few years, in response to the rise of artificial intelligence. My focus has been on student learning rather than academic integrity, and apart from discussing ethical use of AI with students, I approach my courses from the perspective that students won’t cheat if we can provide a meaningful and supportive learning culture for them to work in. 

Underlying this is an admittedly optimistic faith in human nature, a belief that it is impossible to reliably detect and prove inappropriate use of AI, and a confidence that strong curriculum design (not just assessment design) will lead students down a path of learning, rather than one based on a fear of consequences. Discussion forums, a now relatively basic technology, can provide the core of a reliable learning environment.

Discussion forums have been a staple of many courses over the past 30 years. They provide opportunities for students to share opinions and engage in robust conversation with their peers under the expert eye of an educator. But their use, and the time commitment expected, is often not rewarded in marks, and students treat them accordingly. 

As academics, we model the value we attribute to tasks by the weighting, and a “participation grade” of 10 per cent does not often communicate value to students. They may choose to take the hit on tasks with little grade value and decide not to engage. 

Additionally, discussion forums are often poorly implemented in learning management systems, offering a vastly inferior experience to that available on social media platforms. This doesn’t support casual engagement. Yet we know they can have value to both active participants and casual engagers (lurkers).

One novel way of implementing discussion forums in courses is to harness a continual assessment approach. Weekly activities, typically taking an hour to complete, can be designed to scaffold key concepts from each week and encourage discussion of those concepts. My implementation of this starts from week one, where students are expected to contribute a post to the forum and respond to three other posts in a meaningful way. The posts are designed to scaffold learning towards a final task, which varies from course to course. A weighting of 30 to 40 per cent signals the value attached to the task, and students typically respond accordingly by investing time.

By providing rapid feedback each week, we help students see where their weaknesses lie, and they can also identify these quickly from the activity of other students in the forums. A rubric that clearly identifies the requirements for the tasks supports this. An example might be one major post that answers the question posed (worth 40 per cent), with one to three references (20 per cent) and replies to others that develop the conversation further (40 per cent).

So how effective is this approach? In my courses, when discussion forums were weighted less heavily or were optional, there was little activity. Unsurprisingly, a more heavily weighted series of discussion tasks engaged students more strongly, with the number of weekly posts for a 50-student class ranging between 250 and 400. Grades for discussions aligned strongly with the grades for the final tasks and provided superior discrimination (the ability to differentiate performance levels among students) compared with previous tasks, such as lower-weighted discussions or group presentations.

This approach provided meaningful learning outcomes at both first-year and master’s level, with the overall quality of the final tasks moving up half a grade level since implementation.

Questions often asked by colleagues start with marking workload and scalability. For a class of 50 students, marking typically takes three hours a week, with the final task requiring 20 minutes per student. Assuming additional staff are allocated to marking, this approach could scale easily. One educator could manage 80 students and stay on top of the required feedback.

More important is the design of the courses. The educator must curate material to provide a clear pathway to the weekly learning outcomes and to ensure discussion forum tasks are relevant. My classes have no lectures, relying on readings, short videos and interactive activities to drive engagement. The core learning occurs in weekly three-hour workshops, where discussion activities for the week are always part of the class and, depending on the nature of the task, may have dedicated time attached to them to ensure understanding. 

Blunt the power of AI

But can’t students do this task with AI? The answer is that they can, but there are a few ways to reduce these concerns. The first is that we should be willing to trust the students. I don’t believe students enter higher education thinking about the best ways to cheat their way to a degree. I do believe that circumstances can lead to bad decision-making, which might drive students to use methods that aren’t in their best interests to try to pass. But by providing meaningful support, we can minimise these stresses. 

Second, providing strong scaffolding around the ethical use of AI can support improved decision-making. Many institutions have resources and courses on AI, which can help students understand where they should and shouldn’t use AI to support learning. This is an area where we may not currently devote enough time – or worse, assume that it has been addressed elsewhere. It probably hasn’t been, and it is every educator’s responsibility.

But if a student uses AI to answer well-designed forum tasks, they will hit a few hurdles. The first is that many of the forum questions can be personalised. Asking students how particular concepts relate to their own experiences or to course-specific discussions is one way of making AI prompting a bit more challenging (but not impossible). Creating references that will pass scrutiny is also challenging, and replying to others means gathering information from the learning management system and putting it into an AI tool. There are some time savings to be had, but they may not be as significant as students imagine.

If the supporting materials and overall design of the course are clearly articulated to students, if they can see the value of each task and if they have attended the workshops, they will have the skill set required to answer the question. In short, the efficiencies gained from AI may not be significant enough to be worthwhile, especially when the current quality of AI output will likely yield, at best, a pass on well-designed tasks. There are times when it will be clear that a student has taken a superficial approach to answering the question or has used AI. With clear feedback provided early on, students will recognise the cause of poor results – if appropriate weighting is attached to the forums, the negative consequences will become apparent rapidly.

So, good design and the development of a supportive learning culture can provide a meaningful pathway to learning, using simple technologies – despite the presence of AI. There is no need to fear new technologies; we just need to understand how they can best be used, if at all, for optimal learning outcomes.

Edward Palmer is professor of educational technologies and director of the Digital Education and Society hub in the School of Education at Adelaide University.

