AI vs academic integrity: threats and opportunities for higher education

By Sreethu Sajeev, 23 May 2024
Large language models might be an obvious challenge to academic integrity, but with the right approach, policy, and faculty and student engagement, can academics make AI a net win for their institution?

Universities can teach AI, they can use it to augment learning or try to ban its use on campus, but what they cannot do is ignore it. The 21st-century university needs a cogent strategy for accommodating AI-driven technologies such as large language models (LLMs).

As student use of AI in assessments has become more prevalent and harder to detect, concerns about academic integrity have grown across the academic community. Inviting open dialogue with students and involving them in assessment and feedback mechanisms can improve student success and academic quality. At a round-table discussion hosted in partnership with Turnitin at the 2024 THE Europe Universities Summit, higher education leaders discussed the impact of AI on academic integrity and the broader policy implications of a revolutionary technology becoming commonplace.

The round-table participants discussed how generative AI tools are changing student engagement. They also explored how intuitive grading and feedback tools can help universities standardise assessment and facilitate personalised interaction between educators and students, both to prevent academic misconduct and to promote skills development.

Stefanie Seewald, academic director at Northern Institute of Technology Management, argued that more fundamental questions had to be answered before worrying about academic integrity. “We have to see what we are teaching and what we are assessing,” she said. “Why are we checking things that ChatGPT can answer? We have to take a huge step back and re-evaluate everything.”

Assessment formats might be considered obsolete if assessment outcomes can be gamed by an algorithm, but educators can turn the tables on the technology by incorporating LLM prompts in assessment design, such as tasking students with evaluating LLM-produced content. Those who had taken this approach reported better engagement from students and said it taught a valuable lesson about evaluating the veracity of information – a core component of digital literacy.

As chief entrepreneurship and innovation officer at Constructor University, Ali Alam advocates for the use of AI in education, reasoning that employers will expect students to know how to use it. LLMs can remove the blank page and offer a launch pad for innovation. “We encourage the use of ChatGPT and AI, for example, using it to build case studies or ideations at a base level,” he said.

The key consideration for educators is to ensure students are using AI to develop creative thinking, not replace it. This is one of Carol Damm’s concerns. As head of digital education at Constructor University, Damm wants to see students offering ideas that are from their own perspective. “I find it a great tool for brainstorming new ideas,” Damm said. “But I have a lot more life experience that I can pull on and evaluate those ideas.”

Technology can help support academic integrity. Turnitin's platform includes a tool for identifying content likely created by AI tools such as ChatGPT. But student experiences with AI, combined with university policies on where and when to introduce it in teaching and assessment contexts, will shape a culture of responsible use.

“The sooner we embrace it and understand what it is we want to utilise it for, as well as its misuses, the better,” said Muhammad Ashfaq, academic director of international and course director of digital business at IU International University of Applied Sciences. 

There are many opportunities for academics to be creative with AI in their pedagogies and to exploit its efficiencies. The more the sector engages with this rapidly developing technology, the better prepared it will be to tackle the difficult questions surrounding its use.

The panel:

  • Muhammad Ali Alam, entrepreneurship and innovation coordinator, Constructor University
  • Florian Becker, managing director, Bard College Berlin
  • Charlotte Coles, director of global events, Times Higher Education
  • Carol Damm, head of the Digital Education Unit, Constructor University
  • Helmut Kern, deputy chairman, University Council of Vienna University
  • Kunal Saigal, director of academics and strategic partnerships, IU International University of Applied Sciences
  • Nadine Schroeder, head of industry partnerships, Hasso Plattner Institute
  • Christian Schuchardt, vice-dean, International Graduate Centre, Hochschule Bremen
  • Stefanie Seewald, academic director, Northern Institute of Technology Management
  • Krista Somers, senior manager of sales, Turnitin
  • Ramón Spiecker, managing director, Graduate and Professional School, Hochschule Bremen

Find out more about Turnitin.
