Response to What Gen AI Exposes About Written Assessment

Opinion

Mark Bassett’s recent opinion piece argued that the current focus on detecting gen AI use in student work is misplaced and that there have long been problems with the validity of written assessment.

He suggests universities should recognise the urgent need to change our assessment practices. Highlighting the historical problem is important, but it is now time to think about meaningful solutions. Disciplines such as science, with their strong emphasis on experiential learning, are well placed to lead here.

The Australian Council of Deans of Science (ACDS) agrees with the problem Mark Bassett identified but sees the current challenge as an opportunity – we must capitalise on the need for change to improve learning outcomes, and consequently assessment, within science degrees. Our proposed strategy is to focus on valued discipline-specific practices and to consider how we can assess the unique learning outcomes that arise.

In science, this means re-evaluating how we assess practical work, where students are in the laboratory or field learning techniques of data collection and analysis. Practical work reinforces theory and inspires curiosity in students while introducing them to the techniques of their discipline. While the traditional assessment – a report modelled on a scientific paper – is now vulnerable to AI, the in-person nature of practical work means that there are opportunities for new approaches that re-centre our learning outcomes and assessment to emphasise process over product.

The Australian science community has a strong history of innovative educational practices and now has a growing body of education-focussed academics. ACDS has drawn on the work of this community to identify evidence-based best practice that can be applied to practical assessment and has recently released a position paper, Science practical work and its assessment: an opportunity to improve learning in a post-AI world.

The position paper recognises the unique learning environment that students experience during practical work. The paper suggests (in line with TEQSA recommendations) that assessment re-focus from the product, or the experimental results, to the process of obtaining them. Ideally, the process would encompass the many skills and techniques that students need to master to generate and analyse results, as well as their ability to work safely and productively with others. Done well, process assessment would evaluate a much wider range of learning outcomes than the traditional laboratory report.

Our search of the literature identified three groups of assessment strategies that could be usefully applied to practical work. These are briefly summarised below but the position paper includes references and resources for each strategy. While some are specific to science, others can be applied to other disciplines.

  1. Practical exams. Students can be directly assessed and graded on their ability to carry out an experimental procedure or on the quality of the results obtained. An alternative mechanism to assess practical ability is mastery learning or competence assessment, where students must achieve a defined level of competence to pass. They are given opportunities to practise skills and have multiple attempts at demonstrating competence.
  2. Oral assessment. Interactive oral assessments have recently risen in popularity as a response to AI challenges to assessment integrity. These are well suited to the laboratory environment as students can be questioned on the theory underlying techniques, as well as interpretation of experimental results. Group or individual posters or presentations, followed by questions, have also been successfully used to assess experimental work.
  3. Structured integration of assessment. Students can be given many small assessments that focus on different stages of the scientific process, for example, pre-laboratory quizzes, assessed experimental activities, lab book completion, results presentation, post-lab quizzes. These can be scaffolded to guide students through all steps of the scientific method, assessing the process from start to finish. Peer review can be incorporated into these activities, such as generating results, presenting them graphically and then peer-reviewing with a simple template. While some of these steps are vulnerable to AI, as a package, they can provide a comprehensive assessment of a student’s understanding.

It is clear from these examples that we do have options to re-design assessment. However, Mark Bassett points out that institutional inertia can result in a failure to innovate because of a commitment to improved surveillance of written work.

But this ignores the many positive steps that individual academics are taking towards developing and testing more secure and innovative forms of assessment. Leadership from within disciplines can generate improved assessment through a sharper focus on what is important, authentic and unique to each discipline.

None of this is easy! In science, it means we need to re-structure practical work to incorporate assessment without compromising time for learning. Leadership and resourcing are badly needed, but perhaps bottom-up initiatives can help drive change.

The opportunity we have now is to rigorously test new and old assessment options to develop strategies that meet current needs. As has always been the case, using multiple and different methods of assessment provides the best assurance of learning and security. The way forward will be a community-based approach with best practice shared through professional development opportunities.

The ACDS position paper is a first step in this process. ACDS also fosters sharing of practice through an annual forum for Associate Deans (Education) from science faculties and the Australian Conference on Science and Mathematics Education. We look forward to continued assessment discussions both within and across disciplines.

Susan Howitt is Emeritus Professor at the Research School of Biology, ANU and until the end of 2025, was Director of the Australian Council of Deans of Science Teaching and Learning Centre.

Michelle Harvey is Associate Dean Teaching and Learning, Faculty of Science, Engineering and Built Environment, Deakin University and is the current Director of the Australian Council of Deans of Science Teaching and Learning Centre.
