What Role Will Artificial Intelligence Play In Academia? Here's What Experts Have To Say
As the world of artificial intelligence continues to evolve, so does its impact on critical sectors such as education. While concerns have been raised that AI could replace nearly half of the modern workforce in the years to come, others have pointed to the enormous growth that embracing AI could unlock.
To get a better perspective on the current and future role of AI in academia, International Business Times UK interviewed subject-matter experts who are currently working on AI through research, teaching and studying at King's College London.
Dr Caitlin Bentley is a researcher and lecturer in AI education within the Department of Informatics. She says that AI in academia is in "a moment of incredible flux", as its role is different depending on the discipline, university and country. For some areas of study such as computer science, medicine and engineering, AI has been a core aspect of education for some time.
She says the biggest change now is driven by a specific type of technology, generative AI, which uses data to produce new and original content and which has been introduced into education without much forethought.
She highlights that the main concern with students' use of generative AI tools such as ChatGPT is that they can generate coursework and essays. But generative AI has many uses beyond cheating, says Bentley, such as supporting research processes, acting as a personal tutor and, in some cases, even serving as a friend.
For the most part, the focus of AI education has been introducing students to the technology, what it does and how to use it. However, this is changing as discussions are increasingly taking place about the risks associated with certain AI technologies going to scale.
Bentley teaches in the professional practice stream, which aims to help computer scientists design and develop technologies more responsibly by thinking about the impacts those technologies can have in the real world. While computer science typically focuses on math, algorithms, data structures and theoretical frames, her approach is to always put the technology into context.
She does this by encouraging students to think about where a given technology has come from, who the developers of the technology were, and those who might have been excluded from the development process.
She also has students look at technology from different perspectives, such as how users might engage with the technology in different ways depending on their needs. She says that evaluating the impact of AI is important, especially the unintended consequences of its use.
"I not only teach people how to use it [AI] but also teach them more broadly and the problems it creates, especially in terms of social inequalities and environmental impact," notes Bentley.
Because ChatGPT draws on existing information from the internet, which is dominated by US-based English-language sources, its output tends to be culturally biased. In one case she cited, an AI-generated response made the gender-stereotyped assumption that the coder was not female.
She also gave the example of someone who becomes dependent on ChatGPT to write emails without understanding that every use of the technology produces carbon emissions. Without this context, they cannot make an informed choice to use the technology more responsibly.
"We are thinking about the impact on jobs, but we are not thinking about the environmental costs," said Bentley, who suggests there are viable alternatives to these technologies that would be less harmful to the environment, including by taking a few more minutes and drafting the email yourself.
In thinking about the responsible use of AI in academia, Bentley notes that the approach of King's College London is to be clear with students about when and how AI can be used. For example, there are several different models developed by the university that departments and lecturers can adopt based on their needs.
Some may tell students they are not allowed to use it, while others may add a reflection component to an assignment, requiring students to think about how using AI supported their learning. Some may allow the use of AI for research but not for writing assignments. She says the university is open to seeing how these tools can support learning, especially because they will transform many professions and students will therefore need to know how to use them.
To prepare students for a world that is increasingly driven by AI, Bentley believes the most important skill to teach is critical thinking. Students must be able to question the outputs they get when using AI technology, verify the information provided, and pick up on potentially biased results.
She says that universities should be creating professionals who can think about how to solve complex problems, which is a multidisciplinary effort that requires creativity and collaboration. Further, she says that using AI to better understand it and its issues is the only way for us as a society to steer the technology in a direction that is better for the public good.
In terms of training those outside the university setting to responsibly use AI, Bentley says not enough is currently being done in the UK, in part because businesses don't want to invest in training for their employees anymore. She believes this gap could be addressed by universities offering shorter programs to those who are already in the world of work to help level the playing field.
Moving forward, Dr Bentley is interested in developing practical solutions that would help those from marginalized communities to understand and engage with AI to reduce the impact of the technology on existing inequalities.
Modest Lungociu is currently a master's student in Computer Science. He says that the use of AI has changed greatly since he did his undergraduate studies a few years ago, especially with the release of ChatGPT. He has noticed that ChatGPT has greatly increased the accessibility of AI for students in all subject areas. While some students use AI to better understand course material or make their research more efficient, others rely on it to complete assignments.
He says that universities are aware that, like the internet, students' use of generative AI is inevitable. As such, the goal is not to avoid the technology but to create rules so that it is used responsibly. Amid concerns about academic integrity, universities are trying to keep up with the transformative role of AI in the classroom by implementing policies to regulate its use.
However, detecting plagiarism in assignments completed using generative AI can be very difficult, especially in more technical fields. Lungociu says this is because technical assignments, such as writing code, often have only a small number of possible solutions, so the probability that several people produce the same answer is high.
On the other hand, assignments such as essays are much easier for plagiarism software to catch because of the formulaic way in which ChatGPT writes. Students who use AI in violation of their university's academic regulations risk losing credit for the assignment, suspension or even expulsion.
The implications of misusing AI extend beyond the university campus. Lungociu notes that students who use AI to avoid doing their assignments risk becoming dependent on the technology, which is likely to affect the quality of their work in the professional world. They are more likely to struggle to analyse new problems and take steps to solve them, because doing so requires critical thinking, something AI cannot easily replace. He believes it will take time for curricula in a variety of fields to catch up with developments in AI and focus more on these fundamental skills.
Looking to the future, Lungociu sees AI as a potential tool for making research more efficient, which will result in important discoveries for the common good, especially in the fields of medicine and science.
© Copyright IBTimes 2024. All rights reserved.