Over the last few months, I have worked with the University’s Library team to uncover how students feel about generative AI and how best to frame University guidance. AI has permeated almost every field of academia, so it is crucial that students understand the perks and pitfalls of using it. To do this, we carried out a number of focus groups, aimed first at understanding how students currently interact with AI, and later at how they experience specific AI tools. Empathy mapping, while a novel technique for many of us, proved an effective way for students to project thoughts and feelings onto an imagined character. Participants were encouraged to conjure up a character in their first year at university and describe how they feel about AI, how they use it, and what messaging they may already have received. In each focus group, a sense of AI as a double-edged sword quickly emerged; participants described their characters as weighing up “pros and cons” and feeling both “excitement” and “anxiety” about using AI. Participants supposed that while the character may fear “missing out” on AI or falling behind peers who are using it, they may also fear the consequences of misusing AI or committing accidental plagiarism. A key takeaway from these discussions, then, is that clear guidance on AI use is essential in minimising the anxiety students may feel about accidental plagiarism.
Focus groups involved participants using two generative AI systems: CoPilot and ChatGPT. Participants were guided through tasks such as translating phrases, constructing campaign slogans, and writing and referencing short essays. Throughout the process they were asked about their observations, and across groups it was noticed that ChatGPT did not always provide accurate references; participants investigated some of the alleged sources and found that several were inaccurately referenced or did not exist at all. Participants also noted that the sources were not consistently academic or recent; as one remarked, “I wouldn't be using any references they've given”. It was also discovered that neither CoPilot nor ChatGPT was able to write to a specified word count. It was brilliant to see participants thoroughly engaged in critiquing AI; by taking an active role in the tasks, they saw for themselves the possible dangers of using AI-generated content unchecked. Furthermore, some had “never heard” of CoPilot, so exercises comparing the two tools may be useful in widening student perspectives on AI. Students also shared that AI was useful for writing and checking the tone of emails, so perhaps one of its most exciting applications could be in supporting students to thrive in their lives outside of academia. For students interested in using or learning more about generative AI, it is worth reading the University’s guidance and exploring different tools to find one that best supports your academic or extracurricular needs.
Find out more about AI and how it can be used at University.
