Artificial intelligence is expanding rapidly. This article looks at the strengths and weaknesses of ChatGPT and other generative AI tools in nursing education.
Artificial intelligence (AI) refers to the application of algorithms and computational models that enable machines to exhibit cognitive abilities including learning, reasoning, pattern recognition and language processing that are similar to those of humans. By analysing vast amounts of data (text, images, audio and video), sophisticated digital tools, such as ChatGPT, have surpassed previous forms of AI and are now being used by students and educators in universities worldwide. Nurse educators could use these tools to support student learning, engagement and assessment. However, there are some drawbacks of which nurse educators and students should be aware, so they understand how to use AI tools appropriately in professional practice. This, the first of two articles on AI in nursing education, discusses the strengths and weaknesses of generative AI and gives recommendations for its use.
Citation: O'Connor S et al (2023) Artificial intelligence in nursing education 1: strengths and weaknesses. Nursing Times [online]; 119: 10.
Authors: Siobhan O'Connor is a senior lecturer and Emilia Leonowicz is a nursing student, both at the University of Manchester; Bethany Allen is a digital nurse implementer, The Christie NHS Foundation Trust; Dominique Denis-Lalonde is a nursing instructor, University of Calgary, Canada.
Artificial intelligence (AI) comprises advanced computational techniques, including algorithms, that are designed to process and analyse various forms of data, such as written text or audio and visual information like images or videos. These algorithms rapidly evaluate vast quantities of digital data to generate mathematical models that predict the likelihood of particular outcomes. Such predictive models serve as the foundation for more advanced digital tools, including chatbots that simulate human-like conversation and cognition.
AI tools have the potential to improve decision making, facilitate learning and enhance communication (Russell and Norvig, 2021). However, it is important to note that these AI systems are not sentient or conscious; they lack understanding or emotional response to the inputs they receive or the outputs they generate, as their primary function is to serve as sophisticated predictive instruments.
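To illustrate how such predictive models are built, the short sketch below trains a tiny text classifier. It is a hypothetical example, using the scikit-learn library and invented data; it is not a description of any specific AI tool mentioned in this article.

```python
# A minimal sketch of a predictive model: example data are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: short notes labelled by topic
notes = ["patient reports chest pain", "wound dressing changed today",
         "shortness of breath on exertion", "pressure ulcer on the heel"]
labels = ["cardiac", "tissue viability", "cardiac", "tissue viability"]

# The pipeline converts text into numbers and fits a statistical model to them
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

# The output is a prediction of the most likely label for new text, based on
# patterns in the training data; the model has no understanding of the words
print(model.predict(["new onset of chest pain"]))
print(model.predict_proba(["new onset of chest pain"]))
```

Even this toy example shows the key point made above: the model only estimates the likelihood of an outcome from patterns in its training data, which is why the quality of that data matters so much.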
AI technology has existed for some time in many everyday contexts, such as recommendations for content on social media platforms, finding information and resources via internet search engines, email spam filtering, grammar checks in document-writing software, and personal virtual assistants like Siri (iPhone) or Cortana (Microsoft), among others. The latest evolution of AI is a significant leap from these previous versions and warrants additional scrutiny and discussion.
The Joint Information Systems Committee (JISC), the UK's digital, data and technology agency that focuses on tertiary education, published a report on AI in education in 2022. JISC (2022) explains how AI could help improve different aspects of education for teaching staff and learners. For example, AI could be used to create more adaptive digital learning platforms by analysing data from students who access educational material online. Data on whether students choose to read an article, watch a video or post on a discussion forum could be used to predict the kind of support and educational resources they need and prefer. This type of learning analytics could be used to improve the design of a digital learning platform, and of curricula on different topics, to suit each individual student.
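As a rough illustration of the learning analytics described above, the sketch below summarises invented engagement data and flags students with low activity. The column names, threshold and flagging rule are assumptions made for the example; they are not taken from JISC's report.

```python
# A hypothetical learning-analytics sketch using invented engagement events
import pandas as pd

events = pd.DataFrame({
    "student": ["A", "A", "B", "B", "B", "C"],
    "activity": ["read_article", "watch_video", "read_article",
                 "forum_post", "watch_video", "read_article"],
})

# Count how often each student engages with each type of activity
summary = events.pivot_table(index="student", columns="activity",
                             aggfunc="size", fill_value=0)

# An illustrative rule: students with very low overall engagement are flagged
# so educators can offer targeted support and resources
summary["flag_for_support"] = summary.sum(axis=1) < 2
print(summary)
```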
JISC also set up a National Centre for AI to support teachers and students to use AI effectively, in line with the government's AI strategy (Office for Artificial Intelligence, 2021). The centre holds a range of publications and interactive demonstrations on different applications of AI, such as chatbots, augmented or virtual reality, automated image classification and speech analysis.
JISC also has an interactive map of UK institutions that are piloting AI in education in practical ways. In addition, there is a blog to follow and a range of free-to-attend events that focus on AI in education; recordings of these events are available on the JISC website (JISC, 2023).
A cutting-edge type of AI is generative AI, which uses algorithms and mathematical models to create text, images, video or a mixture of media when prompted to do so by a human user. One promising application of generative AI is a chatbot or virtual conversational agent that is powered by large language models.
Chatbots can generate the sequence of words that a typical human response is likely to contain, and they can do this surprisingly accurately because they have been trained on large datasets of text. Chatbots have been trialled in university education for a range of teaching and learning purposes (Okonkwo and Ade-Ibijola, 2021).
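To make the mechanism concrete, the sketch below shows one way a chatbot built on a large language model can be queried programmatically. It assumes the openai Python package (version 1 or later) and a valid API key; the model name, system message and question are placeholders rather than details from this article.

```python
# A minimal sketch of querying a chatbot powered by a large language model
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a study assistant for nursing students."},
        {"role": "user", "content": "Explain the stages of wound healing in plain language."},
    ],
)

# The reply is a statistically likely sequence of words, not verified fact,
# so its accuracy should always be checked against trusted sources
print(response.choices[0].message.content)
```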
Despite the benefits of these chatbots, they are not yet widely used in universities as they have several limitations. Some problems include: the accuracy of responses they provide; the privacy of inputted data; and negative opinions of the technology among teachers and students, who prefer face-to-face interactions and fear the potential implications of AI (Choi et al, 2023; Wollny et al, 2021).
A chatbot called ChatGPT (version 3.5) was launched in November 2022 by a commercial company called OpenAI. GPT stands for generative pre-trained transformer, the family of large language models that powers the chatbot. ChatGPT went viral in early 2023, with millions of users around the world (Dwivedi et al, 2023). The dataset for ChatGPT 3.5 came from websites such as Wikipedia and Reddit, as well as online books, research articles and a range of other sources. This caused concern about how much trust to place in the chatbot's responses, as some of these data sources may contain inaccuracies or gender, racial and other biases (van Dis et al, 2023).
Understandably, educators and students at schools and universities have been conflicted about the use of generative AI tools. Some institutions have tried to ban the use of ChatGPT on campus, fearing students would use it to write and submit essays that plagiarise other people's work (Yang, 2023).
In an attempt to identify AI use, detection tools, such as GPTZero, have been created, alongside tools from educational technology companies, such as Turnitin and Cadmus (Cassidy, 2023). These could be integrated into learning management systems, like Blackboard, Canvas or Moodle, to detect AI writing and deter academic misconduct. However, detection tools may not be able to keep up with the pace of change as generative AI becomes ever more sophisticated. Relying on software to spot the use of AI in students' written work or other assessments may prove fruitless, and trying to determine where the human ends and the AI begins may be futile (Eaton, 2023).
In March 2023, a more advanced chatbot, GPT-4, was released. It is currently available as a paid subscription service and has a waiting list for software developers who want to use it to build new digital applications and services. Other technology companies have promptly released similar AI tools, such as Bing AI from Microsoft and Bard from Google. Other types of generative AI tools, including image, audio and video generators, have also emerged.
These types of AI tools could be used in many ways in education. The UK's Department for Education (DfE, 2023) has published a statement on the use of generative AI in education, which sets out key messages on its appropriate use.
The DfE (2023) also highlighted that generative AI tools can produce unreliable information or content. For example, an AI tool may make up titles and authors of seemingly real papers that are entirely fictitious; as such, critical judgement is needed to check the accuracy and quality of any AI-generated content, whether it is written text, audio, images or videos. Humans must remain accountable for the safe and appropriate use of AI-generated content, and they are responsible for how AI tools are developed (Eaton, 2023).
The use of AI in nursing education is just starting. A recent review by O'Connor (2022) found that AI was being used to predict student attrition from nursing courses, academic failure rates, and graduation and completion rates.
Nurse educators and students in many countries may have already started using ChatGPT and other generative AI tools for teaching, learning and assessment. However, they may be hesitant or slow to engage with these new tools, especially if they have a limited understanding of how they work and the problems they may cause. Developing guidelines on how to use these AI tools could support nurse educators, clinical mentors and nursing students in university, hospital and community settings (Koo, 2023; O'Connor and ChatGPT, 2023).
Nurses should leverage the strengths and weaknesses of generative AI tools (outlined in Box 1) to create new learning opportunities for students, while being aware of, and limiting, any threats they pose to teaching and assessment (O'Connor, 2022).
Box 1. Strengths and weaknesses of generative AI tools
Strengths
Weaknesses
AI = artificial intelligence. Sources: O'Connor and Booth (2022); Russell and Norvig (2021)
As generative AI tools can process large amounts of data quickly, they could be used in nursing education to support students in a number of ways. For instance, AI audio or voice generators, which create speech from text, could be used to make podcasts, videos, professional presentations or any media that requires a voiceover more quickly than people can produce them. This could enrich online educational resources because a diverse range of AI voices is available to choose from in multiple languages. Some tools also allow users to edit and refine the pitch, speed, emphasis and interjections in the voiceover. This could make digital resources easier for students to listen to and understand, particularly those who have learning disabilities or are studying in a foreign language.
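As a simple illustration of this kind of text-to-speech workflow, the sketch below uses the gTTS Python package as one possible library; the script text and file name are invented, and more sophisticated tools would be needed to adjust pitch or emphasis as described above.

```python
# A brief text-to-speech sketch using the gTTS package; content is invented
from gtts import gTTS

script = ("Welcome to this week's revision podcast on medicines management. "
          "Today we cover the five rights of medication administration.")

# Generate a spoken voiceover from the script and save it as an audio file
tts = gTTS(text=script, lang="en", slow=False)
tts.save("revision_podcast.mp3")
```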
A chatbot could, via interactive conversations on students' smartphones, encourage them to attend class, speak to a member of faculty or access university services, such as the library or student support (Chang et al, 2022). One designed specifically for nursing students could also be beneficial during a clinical placement, directing them to educational resources, such as books and videos, while training in hospital and community settings. This may be particularly useful to support learning in clinical areas in which nurses are very busy or understaffed, or where educational resources are limited or inaccessible.
As generative AI can adjust its responses over time, a chatbot could provide tailored advice and information to a nursing student that aligns with their individual needs and programme outcomes.
Another way nurse educators could support students would be to highlight a key weakness of generative AI: its tendency to confabulate, that is, to fill in knowledge gaps with plausible, but fabricated, information. Nursing students should be taught about this weakness so they can develop the skills needed to find, appraise, cite and reference the work of others, and critique the outputs of generative AI tools (Eaton, 2023).
Simple exercises comparing the outputs of a chatbot with scientific studies and good-quality news articles from human authors on a range of topics could help students appreciate this flaw. As an example, a chatbot could be asked to explain up-to-date social, cultural or political issues affecting patients and healthcare in different regions and countries. The AI-generated output could be cross-checked by students to determine its accuracy. They could also discuss the impact the AI output could have on nurses, patients and society if it were applied more broadly and assumed to be completely factual and unbiased.
Nurse educators could also use AI-generated text, image, audio or video material to help students explore health literacy. As group work in a computer laboratory, students could use a generative AI tool to create diverse customisable patient education about a health problem and how it might be managed through, for example, diet, exercise, medication and lifestyle changes. Students could be asked to design and refine text prompts to ensure the content that is generated is appropriate, accurate and easy for patients to understand.
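One way to run the prompt-design exercise described above is to give students a simple, parameterised template to adapt. The sketch below is a hypothetical example; the function name, parameters and wording are invented for illustration, and the resulting prompt would be pasted into a generative AI tool and its output appraised by the group.

```python
# A hypothetical prompt template for the patient-education exercise
def patient_education_prompt(condition, reading_age, language, focus):
    return (
        f"Write a short patient information leaflet about {condition}. "
        f"Use plain {language} suitable for a reading age of {reading_age}. "
        f"Focus on {focus}, avoid medical jargon, and end with three "
        f"questions the patient could ask their nurse."
    )

# First draft of the prompt; student groups then refine it and compare outputs
print(patient_education_prompt(
    condition="type 2 diabetes",
    reading_age=11,
    language="English",
    focus="diet, exercise and when to seek help",
))
```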
Chatbots can also be used to create interactive, personalised learning material and simulations for students. Box 2 illustrates how generative AI has been used in simulation education. Given this example, it is easy to imagine combining realistic text-to-speech synthesis (which already exists) with high-fidelity simulation laboratory manikins. This could support learning by providing engaging and interactive simulations that are less scripted or predetermined than traditional case study simulations.
Box 2. Use of generative AI in simulation education
Context: A two-hour laboratory session with first-year nursing students.
Objective: To create opportunities for students to trial relational communication skills to which they have previously been exposed in lectures.
Simulation: Nursing students were put into small groups and a chatbot was used as a simulated patient in a community health setting. Using relational communication techniques, each group interacted with the chatbot in a scenario it had randomly generated. The patient responded based on what the students typed, with no predetermined storyline. The chatbot allowed several conversational turns, then provided students with a grade and constructive feedback.
Prompt used (GPT-4): Let's simulate relational practice skills used by professional registered nurses:
Results: Students enjoyed the novelty of this activity and the opportunity to deliberately try different question styles in a safe and low-risk context. They thoughtfully and collaboratively put together responses to develop a therapeutic relationship with the patient and their chatbot-assigned grade improved with each scenario tried.
Considerations: Although not a replacement for in-person interaction, this activity provided space for trial and error before students engaged with real patients in clinical contexts. It is important for nursing students to be supervised during an activity like this, as the chatbot occasionally became fixated on minor issues, such as its inability to detect students' eye contact and other body language. When this occurred, the chatbot needed to be restarted in a new chat or context window to function correctly. It is also critical that students be instructed not to input any personally identifiable data into the chatbot as this information may not remain confidential.
AI = artificial intelligence
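For educators who want to trial something similar to the activity in Box 2, the sketch below shows one possible way to script a simulated-patient conversation with a chat-based model. It assumes the openai Python package (version 1 or later); the system prompt, model name and number of turns are illustrative assumptions, not the exact set-up used in Box 2.

```python
# A simplified, hypothetical simulated-patient chatbot loop
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4"  # placeholder model name

messages = [{
    "role": "system",
    "content": ("Role-play a patient in a community health setting. Invent a "
                "realistic scenario. Respond only as the patient. After five "
                "exchanges, stop the role-play and give the students a grade "
                "and constructive feedback on their relational communication."),
}]

for _ in range(5):
    student_turn = input("Student: ")
    messages.append({"role": "user", "content": student_turn})
    reply = client.chat.completions.create(model=MODEL, messages=messages)
    patient_says = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": patient_says})
    print("Patient:", patient_says)
```

As noted in Box 2, students should be supervised while using a script like this and must not enter any personally identifiable information.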
Nurse educators could leverage another weakness of generative AI to create innovative lesson plans and curricula that teach nursing students about important topics. Bias that is present in health and other data is an important concept for students to understand, as it can perpetuate existing health inequalities. AI tools work solely on digital data, which may contain age, gender, race and other biases if certain groups of people are over- or under-represented in text, image, audio or video datasets (O'Connor and Booth, 2022). For example, an AI tool was trained to detect skin cancer using a dataset of images that were mainly from fair-skinned people, meaning that people with darker skin tones (such as Asian, Black and Hispanic people) may not get an accurate diagnosis from this tool (Goyal et al, 2020). A case study like this could be used to teach nursing students about bias and the limitations of AI, thereby improving their digital and health literacy.
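A classroom exercise could even audit a model's performance by subgroup, as in the brief sketch below. The data, labels and group names are entirely invented and serve only to show how a single overall accuracy figure can hide a disparity.

```python
# An invented sketch of a subgroup bias audit
import pandas as pd

results = pd.DataFrame({
    "skin_tone": ["light", "light", "light", "dark", "dark", "dark"],
    "true_label": ["cancer", "benign", "benign", "cancer", "cancer", "benign"],
    "predicted":  ["cancer", "benign", "benign", "benign", "cancer", "cancer"],
})

# Overall accuracy hides the disparity...
overall = (results["true_label"] == results["predicted"]).mean()

# ...whereas per-group accuracy reveals which group the model fails most often
per_group = (results.assign(correct=results["true_label"] == results["predicted"])
             .groupby("skin_tone")["correct"].mean())

print(f"Overall accuracy: {overall:.2f}")
print(per_group)
```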
Finally, nursing students will need to be vigilant with their use of AI tools to avoid accusations of plagiarism or other academic misconduct (O'Connor and ChatGPT, 2023). They should be supported by nursing faculty and nurses in practice to disclose and discuss their use of generative AI as it relates to professional accountability. This could help reduce the risks of inappropriate use of AI tools and ensure nursing students adhere to professional codes of conduct.
The field of AI is evolving quickly, with new generative AI tools and applications appearing frequently. There is some concern about whether the nursing profession can, or should, engage with these digital tools while they are in the early stages of development. However, the reality is that students have access to AI tools and attempts to ban them could well do more harm than good. Furthermore, as patients and health professionals will likely start using these tools, nurses cannot ignore this technological development. What is needed during this critical transition is up-to-date education about these new digital tools as they are here to stay and will, undoubtedly, improve over time.
A curious, cautious and collaborative approach to learning about AI tools should be pursued by educators and their students, with a focus on enhancing critical thinking and digital literacy skills while upholding academic integrity. Wisely integrating AI tools into nursing education could help to prepare nursing students for a career in which nurses, patients and other professionals use AI tools every day to improve patient health outcomes.
Cassidy C (2023) College student claims app can detect essays written by chatbot ChatGPT. theguardian.com, 11 January (accessed 6 September 2023).
Chang CY et al (2022) Promoting students' learning achievement and self-efficacy: a mobile chatbot approach for nursing training. British Journal of Educational Technology; 53: 1, 171-188.
Choi EPH et al (2023) Chatting or cheating? The impacts of ChatGPT and other artificial intelligence language models on nurse education. Nurse Education Today; 125: 105796.
Department for Education (2023) Generative Artificial Intelligence in Education. DfE.
Dwivedi YK et al (2023) So what if ChatGPT wrote it? Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy. International Journal of Information Management; 71: 102642.
Eaton SE (2023) 6 tenets of postplagiarism: writing in the age of artificial intelligence. drsaraheaton.wordpress.com, 25 February (accessed 6 September 2023).
Goyal M et al (2020) Artificial intelligence-based image classification methods for diagnosis of skin cancer: challenges and opportunities. Computers in Biology and Medicine; 127: 104065.
Joint Information Systems Committee (2023) National Centre for AI: accelerating the adoption of artificial intelligence across the tertiary education sector. beta.jisc.ac.uk (accessed 6 September 2023).
Joint Information Systems Committee (2022) AI in Tertiary Education: A Summary of the Current State of Play. JISC (accessed 17 April 2023).
Koo M (2023) Harnessing the potential of chatbots in education: the need for guidelines to their ethical use. Nurse Education in Practice; 68: 103590.
O'Connor S (2022) Teaching artificial intelligence to nursing and midwifery students. Nurse Education in Practice; 64: 103451.
O'Connor S et al (2022) Artificial intelligence in nursing and midwifery: a systematic review. Journal of Clinical Nursing; 32: 13-14, 3130-3137.
O'Connor S, Booth RG (2022) Algorithmic bias in health care: opportunities for nurses to improve equality in the age of artificial intelligence. Nursing Outlook; 70: 6, 780-782.
O'Connor S, ChatGPT (2023) Open artificial intelligence platforms in nursing education: tools for academic progress or abuse? Nurse Education in Practice; 66: 103537.
Office for Artificial Intelligence (2021) National AI Strategy. HM Government.
Okonkwo CW, Ade-Ibijola A (2021) Chatbots applications in education: a systematic review. Computers and Education: Artificial Intelligence; 2: 100033.
Russell S, Norvig P (2021) Artificial Intelligence: A Modern Approach. Pearson.
van Dis EAM et al (2023) ChatGPT: five priorities for research. Nature; 614: 7947, 224-226.
Wollny S et al (2021) Are we there yet? A systematic literature review on chatbots in education. Frontiers in Artificial Intelligence; 4: 654924.
Yang M (2023) New York City schools ban AI chatbot that writes essays and answers prompts. theguardian.com; 6 January (accessed 16 April 2023).