
Who’d have thunk it! When students think about using AI, they “prioritise their developing values and moral positions over institutional messaging.”
Amid all the angsting that students will use artificial intelligence to destroy the process of learning, Margaret Bearman (Deakin U) and colleagues took a novel approach – they asked 79 of them, in 20 online focus groups.
Insofar as this self-selecting sample is representative, people who think learning and teaching are doomed should read this – the students quoted are autonomous and ethically engaged.
“In students’ accounts, almost all grappled with how much to trust GenAI and with the appropriateness of their relationship with GenAI. Many distrusted GenAI outputs, but this concern went beyond frustrations with inaccuracy. Rather, there was a sense that doing the right thing went beyond institutional rules, and was a matter for the students themselves,” Professor Bearman and colleagues comment.
They found students focused on three themes:
Studying with/without AI: some use it “to simplify, to summarise and to create outputs.” Others use it for “valuable alternate insights.” And others, across disciplines and demographics, don’t. “There were frequent references to other students using GenAI to ‘write the whole assignment’ but no participants declared this about themselves.”
“Mixing message and assumptions:” there is “considerable institutional prescription” on using AI, which does not always help. “Students described contradictory guidance as well as their own contradictory responses.” Universities’ “mixed positioning” leads to students making their own moral judgements, while worrying how these relate to “the often vague institutional landscape.”
“Trusting self and resisting dependency:” students grapple with how far to trust AI outputs, but also with using it according to their own ethics. “For many, GenAI interactions appeared to always entail a values-based position or a moral judgement about themselves as students. Sometimes, the right thing, or what students should do, seemed to be more prudentially calculated in terms of reward and punishment, but was ultimately understood in relation to the perceived purpose of education.”
The take-out: “many emphasised that students themselves were responsible for GenAI's impact on their learning and education.”
“Our study suggests that students’ own moral compasses are, often, already aligned to an understanding that the work must come from them – that the students must own what they produce.”