United Kingdom

Education: Getting smart with artificial intelligence

Generative artificial intelligence (AI) tools such as ChatGPT and Google Bard have the potential to transform higher education. But, as this technology also brings risks, Matt James, client director – head of education at Aon, says that universities and colleges must be prepared.

With predictions ranging from the extinction of the human race to a cure for cancer, it’s clear that AI has the potential to be a force for both good and evil. Although it may be a few years before these predictions are tested, the higher education sector is already considering how to manage AI’s use among students and staff to ensure it brings all the benefits and none of the downsides.

Generative AI tools such as ChatGPT and Google Bard present several challenges to universities and colleges. While it’s important for students to understand how to use the technology as these skills will be valued by future employers, there is also a risk of undermining academic integrity if it’s used inappropriately to create coursework.

Where it is used appropriately, AI has the potential to change the way students learn. As well as supporting neurodivergent students and those for whom English isn’t their first language, it could also transform teaching and assessments.

Unsophisticated script

Although the technology has huge potential, it is not a major issue at the moment because current generative AI tools aren’t hugely sophisticated. Rather than deliver the original thought expected of university students, these tools are much more likely to package up a series of facts, much like an advanced internet search.

For example, in a feature on the BBC news website1, academics at the University of Bath explored whether students could pass using AI tools. Although they found the tools handled multiple choice questions well, there were tell-tale signs that written answers had been generated by AI. These included repetition of the question, GCSE-level reasoning and citing a completely fictitious academic paper as a reference.

But few expect this to be the case for long. As these tools become more sophisticated, it will become less obvious that a student has used them to help write a dissertation.

University challenge

Universities and colleges must act now to ensure students and staff understand how these tools can be used and there are several issues to consider:

  • AI is a valuable tool for employment
    Employers will expect graduates to understand these tools and how they can be applied in the workplace. By not allowing students to use AI, universities and colleges could potentially hinder their students’ opportunities and progression within the future workplace. This strategy therefore runs the risk of legal action if students do not feel they have received a rounded education that prepares them for the workplace.
  • Students must be able to compete on a level playing field
    Where some students use AI tools and others don’t, there is the potential for complaints that marking or award classification was unfair. As an example, consider a law dissertation requiring potentially hundreds of case law references relating to the tort of negligence. If half the class used AI to generate these references and the others didn’t and received lower marks (perhaps because they were unaware of whether AI software was permitted), there would be potential grounds for complaints.
  • Inappropriate use of AI and academic misconduct
    Students risk being accused of academic misconduct if they rely on AI tools to generate their work. It may be easy to identify AI-generated work at present (ironically, by using AI/machine learning software to scan for plagiarism), but as these tools become more sophisticated, staff will need to be able to identify their use in student work. Universities and colleges also need to ensure students are aware of the ramifications of using the technology inappropriately.

Advice and guidance

To help universities and colleges prepare for greater use of generative AI tools, the Quality Assurance Agency for Higher Education (QAA) has published guidance and advice (Maintaining quality and standards in the ChatGPT era: QAA advice on the opportunities and challenges posed by Generative Artificial Intelligence2).

It recommends these institutions have clear and consistent policies that outline what is and isn’t acceptable; how students should reference any AI-generated content; and the steps that will be taken if these tools are used inappropriately.

These policies must be clearly communicated to staff and students to ensure everyone understands how they can use AI. It’s also imperative that these policies are regularly reviewed and updated to keep abreast of the rapid developments in this technology.

Having a consistent, transparent and well-communicated policy in place ahead of the new academic year – the first full year in which these tools will be readily available – will ensure that the benefits of AI can be accessed by students and staff, without compromising academic standards or increasing the risk of legal action.

Be curious

In addition to distributing a clear and consistent policy, be curious about this new technology and ensure that staff within the institution understand what AI tools are and what they can do. Similar advice is given to parents when their children take their early steps with technology: play around with the device, learn the apps and games that are suitable for your child’s age, and be aware of the technology so that you can develop conversations with your child and ultimately enable them to use it safely.

The same approach applies to AI. There will be new software coming to market; ‘paid-for’ services that significantly enhance the content created; and future improvements and changes that wider society has not yet considered.

We are all, right now, living through the Technology and Information Age. Encouraging academic curiosity about ChatGPT and other generative AI tools now, so staff are aware of the limitations – and potential – they offer, will ensure that, as this technology evolves, your institution will be able to manage its application effectively, and for the benefit of students and staff.

More information

To discuss the steps your organisation can take to minimise the risks associated with generative AI tools, speak to your Aon account manager or contact Matt James at [email protected].

 

1 ChatGPT: Can students pass using AI tools at university? - BBC News

2 QAA publishes additional advice on Generative Artificial Intelligence tools

 

Whilst care has been taken in the production of this article and the information contained within it has been obtained from sources that Aon UK Limited believes to be reliable, Aon UK Limited does not warrant, represent or guarantee the accuracy, adequacy, completeness or fitness for any purpose of the article or any part of it and can accept no liability for any loss incurred in any way whatsoever by any person who may rely on it. In any case any recipient shall be entirely responsible for the use to which it puts this article.

This article has been compiled using information available to us up to 06/07/23.

Aon UK Limited is authorised and regulated by the Financial Conduct Authority. Registered in England and Wales. Registered number: 00210725. Registered Office: The Aon Centre, The Leadenhall Building, 122 Leadenhall Street, London EC3V 4AN. Tel: 020 7623 5500.