Using Chatbots Wisely in Education

By: Ken Purnell, Skye Playsted, and Justin Kennedy

ChatGPT and Microsoft Bing headlines abound on mainstream and social media. Such AI chatbots will change the way we teach, assess, and work. Indeed, the “genie is out of the bottle” and so we must learn how to use it effectively and wisely. Some recent examples from mainstream media include:

“The teacher is dead. Resistance is futile. What sets ChatGPT apart is its exceptional performance and potential to disrupt education. Since its launch, there has been a plethora of online discussion from education professionals sharing examples and opinions about how students will use ChatGPT to cheat on assessments, such as writing essays or coding programs, and how AI could soon replace the teacher.” (McMinn, 2023)

“Schools Shouldn’t Ban Access to ChatGPT. . .Blocking access to ChatGPT is a mistake. There is a better way forward. Students need now, more than ever, to understand how to navigate a world in which artificial intelligence is increasingly woven into everyday life. It’s a world that they, ultimately, will shape.” (Lipmen & Distler, 2023)

“90% of online content could be generated by AI by 2025.” (Garfinkle, 2023) 

“There’s no consensus . . . with teachers expressing both optimism and hesitation regarding how content generation AI will forever change the classroom. Judging from the reaction on TikTok, teachers on the app see ChatGPT as a tool to be treated the same way calculators and cell phones are used in class—as resources to help students succeed but not do the work for them.” (Townsend, 2023)

“Schools right now face a choice—fight the wave of ChatGPT, or surf it.” (Voight, 2023)

“Research Summaries Written by AI Fool Scientists: Scientists cannot always differentiate between research abstracts generated by the AI ChatGPT and those written by humans.” (Else, 2023)

These quotes are indicative of the polarised views on AI chatbots such as ChatGPT and Microsoft Bing. Chatbots will be transformative for education, business, and many other aspects of life, and they are here to stay. They will only get better at what they do, becoming more sophisticated and human-like. We should recognise that chatbots have the potential to make very positive contributions to, for example, education.

So, why might a person use AI bot-generated works? Perhaps to cheat? After all, used well, chatbots can write a PhD thesis, a resume, a magazine article, and just about anything else you care to name. A student, for example, may “beat the system” and pass a school or college essay, or even get a degree, but of what value is that if they don’t develop the requisite knowledge and skills, and the ability to apply them?

Failure to use your brain’s natural plasticity to learn has potentially severe ramifications. You can’t cheat the rigours of life. If an engineer can’t build a bridge that holds up, a pilot can’t fly a plane safely, or a surgeon can’t perform surgery, the gap in their knowledge and skills makes it obvious that they are not the expert they claimed to be. That may be socially embarrassing, but it could also cost money to rectify, and possibly lives.

Getahun (2023) describes ChatGPT as:

The artificial intelligence chatbot that generates eerily human-sounding text responses, is the new and advanced face of the debate on the potential—and dangers—of AI. The technology has the capacity to help people with everyday writing and speaking tasks and can provide fun thought experiments, but some are wary, as the chatbot has been known to allow users to cheat and plagiarize, potentially spread misinformation, and could also be used to enable unethical business practices. What’s even more alarming: Like many chat bots before it, it is also rife with bias.

Despite these dangers, the effective use of chatbot AI can improve our thinking and problem-solving abilities. Chatbots have great potential to assist most writers to improve the quality and clarity of their written communication whether in schools, businesses or the community. 

Are works created by chatbots a friend or foe? Both—depending upon how they are used, and to what degree readers of the works understand how they were produced. Users and readers of such AI-produced works need to have an informed view of its biases, potential misuse, and positive applications to human learning.

Now that you have our conclusion, let’s go on a short journey to see how we arrived at that. As the authors have a special interest in education, we will use that as our context. 

Chatbots and our brain’s plasticity

Educators change brains. That is our chief job. We can do it better by becoming expert Neuroplasticians—educators who apply advanced neuroscientific knowledge and skills that are brain-friendly. That includes the positive use of technologies such as chatbots.

Indeed, Neuroplasticians seek chiefly to improve the achievements and wellbeing of people and organisations. They consistently apply their rich toolkit of evidence-based strategies and seek to help others understand the neuroscience of learning (Willis, 2021). Neuroplasticians work with and alongside people and organisations to make learning brain-friendly and maximise performance (Purnell, 2023a).

There is an even greater need for expert Neuroplasticians who engage in the science of teaching and learning in our evolving and expanding AI chatbot world. No doubt there will soon be a proliferation of short courses to do that (see, for example, Udemy, 2023).

Like technology such as computers, tablets, and social media, ChatGPT, Microsoft Bing and similar AI platforms are themselves neutral. However, how we use them is not. As educators, we need to lead our students to develop skills in using AI chatbots ethically. As an example of how to use chatbots effectively and ethically, The University of Queensland (2023) states that generative AI tools may be able to:

    • help you study and prepare for exams e.g. generate quizzes or flashcards.
    • summarise information on a topic to help you get started.
    • recommend authoritative sources on a topic for you to follow up.
    • help you improve your grammar, sentence construction and other language skills.
    • explain the solution to different types of problems to increase your understanding e.g. mathematical problems, coding errors, formulas.
    • help you analyse data e.g. create spreadsheets, tables and organise information.
    • restore low quality images or video.
    • provide creative inspiration or suggestions that you can build on.
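The first suggestion on that list (generating quizzes or flashcards) can be made concrete with a small sketch. The Python snippet below is our own illustration, not part of any chatbot or university tool: it turns a few lines of “term: definition” study notes, the kind of summary a chatbot can readily produce, into cloze-style flashcards for self-testing. The function name and the sample notes are assumptions for illustration only.

```python
# Minimal sketch: turn "term: definition" study notes into cloze-style
# flashcards, the kind of quiz a student might ask a chatbot to generate.
# The function name and sample text are illustrative assumptions.

def make_flashcards(notes: str) -> list[tuple[str, str]]:
    """Each line 'term: definition' becomes a (question, answer) pair."""
    cards = []
    for line in notes.strip().splitlines():
        term, _, definition = line.partition(":")
        term, definition = term.strip(), definition.strip()
        if not term or not definition:
            continue  # skip malformed or blank lines
        # Blank out the term (a cloze deletion) so the student must recall it.
        question = f"____: {definition}"
        cards.append((question, term))
    return cards

notes = """
neuroplasticity: the brain's ability to reorganise its neural connections through learning
working memory: the limited cognitive workspace used for active thinking
"""

for question, answer in make_flashcards(notes):
    print(question, "->", answer)
```

A student could paste chatbot-generated summaries into such a tool and quiz themselves, so the retrieval practice (the part that changes the brain) is still done by the learner.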

AI chatbots can greatly assist student learning. For example, students may ask ChatGPT to define a term such as neuroplasticity, then have ChatGPT rewrite the definition once or twice, and compare and contrast the versions (Ankucic, 2021). Compare and contrast, like spotting patterns and having multi-sensory inputs, is a well-known strategy for positive neuroplastic change: it strengthens memories through increased neural firing, blood flow, protein activity, and the release of neurochemicals, dopamine and serotonin in particular. Such strategies result in the student changing their neural structure, i.e., learning. Similarly, providing different access routes to the same or similar information (Schallert, 1980), or multisensory experiences such as reading aloud and hearing, results in new memory formation (see, for example, Killian, 2021).
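That compare-and-contrast step can itself be scaffolded with a few lines of code. The sketch below uses Python’s standard-library `difflib` to surface word-level differences between two definitions; the definitions are illustrative placeholders standing in for a chatbot’s original answer and its rewrite, not real ChatGPT output.

```python
import difflib

# Two definitions of neuroplasticity, standing in for a chatbot's original
# answer and its rewrite (illustrative placeholders, not real ChatGPT output).
definition_1 = ("Neuroplasticity is the brain's ability to change its "
                "structure and function in response to experience.")
definition_2 = ("Neuroplasticity refers to the capacity of the brain to "
                "reorganise its neural connections as a result of learning.")

# Word-level diff: lines starting with "- " or "+ " mark the wording that
# differs, giving the student concrete points to compare and contrast.
diff = difflib.ndiff(definition_1.split(), definition_2.split())
changes = [token for token in diff if token.startswith(("- ", "+ "))]
for token in changes:
    print(token)

# A similarity ratio (0.0 to 1.0) gives a rough sense of how close the two
# wordings are overall.
ratio = difflib.SequenceMatcher(None, definition_1, definition_2).ratio()
print(f"similarity: {ratio:.2f}")
```

The point is not the tooling but the cognitive work: the student still has to explain what each highlighted difference means, which is the activity that drives the neuroplastic change described above.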

The student must do the work to change their neurological networks. Just reading the work by ChatGPT will form weak memory traces at best. Interacting with the content in multiple ways using different strategies results in deep learning and changes the brain. Like exercise in the gym, such brain workouts contribute to their ability to do similar hard work regularly. This results in better achievement and improvements in self-efficacy and wellness.

As educators, we seek to help learners build more capable and creative neural pathways for thinking through messy and complex real-world problems with their many variables and constraints. Chatbots can support this: asking them relevant questions offloads aspects of more basic thinking, freeing users’ cognitive resources for higher-order and creative thinking.

In analysing complex problems, chatbots can help students to identify First Principles, whereby a complex problem is broken down into its most basic, foundational elements. That facilitates a decision-making approach that focuses on the most obvious facts and solutions. ChatGPT and similar chatbots are very useful at this. So, by freeing up those valuable cognitive resources in working memory, learners can focus on higher-order thinking and creative solutions more readily (Purnell et al., 1991).

As another example, Playsted’s language teaching and learning research has found that language teachers spend a lot of time looking for and creating resources. Creating the needed individualized resources is an area of teaching where AI can help. For example, some of the features of OpenAI’s “Playground” tool have the potential to reduce preparation time. Automatic Speech Recognition tools such as OpenAI’s “Whisper” are being used in language classrooms as teaching tools; for example, to provide feedback on speech. Other technology-assisted oral language learning apps can also be used by students both in class and at home, thus providing multiple opportunities to improve fluency in writing and speech as well as reading.

It pays researchers and educators to keep an open mind when new technology is introduced into education. Chatbots and similar AI have the potential to improve the learning of students and reduce teacher preparation time in diverse teaching contexts in ways we are only beginning to discover.

So, from the positive neuroplastic changes that chatbots may assist us with, we now turn to the “elephant in the room”: assessment requirements. As educators, we have heard clearly how chatbots can be used by students and others to do their work for them. So, how do we “chatbot-proof” our assessment requirements?

Assessing student (not chatbot) achievement plus improving educators' assessment literacy

ChatGPT and similar AI chatbots can produce responses to assessment requirements for students at any level. For example, a school student might ask for a response to an assignment such as “Write an 800-word report on . . .” or “Get references for the theory of evolution.” Used properly, chatbots can help high school students do such work better. Villasenor (2023) argues: “I’m encouraging my students to become responsible, aware users of the AI technologies that will play a profoundly important role over the course of their careers.” The AI writing, so to speak, is on the wall.

So, chatbots in education are here to stay. An analogy from a half-century ago: politicians and others wanted to ban Elvis Presley, or at least film his movements only from the waist up—if at all. ChatGPT is getting a similar reception, with some jurisdictions banning its use in schools—good luck!

The big question is: Is the use of chatbots the issue, or the design of the assessment instrument? We argue that it is the latter. So, how do educators address this?

There are well-established design principles in the assessment literature that can help. A valuable example is from the Queensland Curriculum and Assessment Authority (QCAA, 2023). We encourage educators to use these in their assessment practices to minimise the likelihood of chatbots producing responses to assessment requirements that are not the student’s work.

Assessment instruments should be:

    • aligned with the curriculum (content and what is taught) and pedagogy (how the content is taught)
    • equitable for all students
    • evidence-based, using established standards/continua to make defensible and comparable judgments about students’ learning (not that of an AI chatbot)
    • ongoing, with a range and balance of evidence compiled over time to reflect the depth and breadth of students’ learning (to reduce the possibility of end-of-term essays that may be AI chatbot produced)
    • transparent, to enhance professional and public confidence in the processes used, the information obtained, and the decisions made
    • informative of where students are in their learning (at the point of time of the assessment in a particular subject).


High-quality assessment instruments are characterised by three attributes:

    • validity, through alignment with what is taught, learned, and assessed
    • accessibility, so that each student is given opportunities to demonstrate what they know and can do
    • reliability, so that assessment results are consistent, dependable, or repeatable (by the students in a range of contexts).

In addition to such checks on the assessment instruments that we set, it is critical that educators continuously improve their assessment literacy. That is, “the skills and knowledge teachers require to measure and support student learning through assessment” (QCAA, 2022a). Assessment literacy is arguably the weakest area of educators’ professional knowledge and practice. As an example of addressing this shortfall, QCAA (2022b) has three modules on high-quality assessment literacy worth looking at:

    • Attributes of quality assessment
    • Developing valid and accessible assessment
    • Making reliable judgments

An example of “guaranteeing” the authenticity of student work is given by Purnell (2023b) in his ten-minute recording on YouTube. There, he describes how students create a reflective journal on how their learning in educational neuroscience informs their practice. The content from which the reflective journal is created is locked down on the university Moodle website, requiring a student ID to access it. Students are required to apply their learning to their current or intended workplace by making a ten-minute video in which they must be visible throughout (invigilated by design). In that video, the student provides high-quality professional development for colleagues—a very personal and detailed story. So, in this example, the content is accessible only to enrolled students and instructors, not to chatbots, and the students must create a personal reflective journal and video that tells the story of how they are implementing ideas in their current or intended workplace. Try that, chatbot!


Chatbots have enormous potential to help people in new and innovative ways. In using AI-produced works we need to have an informed view of their biases, potential misuse, and positive applications to human learning.

Chatbots and similar AI are here to stay and will have an ever-increasing impact on our lives, including in education. As educators, we need to have informed views to maximise the positive use of chatbots as we lead teaching and learning. Like computers with sophisticated purpose-built software, calculators, and other technologies that were once considered the end of learning, we now have to adapt and use chatbots powerfully in our repertoire of technology-aided instructional strategies to support learning in new and innovative ways.

For example, we see a continued reduction of preparation and marking time for educators and greater use of chatbots in administrative tasks. This should be welcome at a time when many educators are burnt out from high workloads. However, as we noted, chatbots have exposed educators’ assessment literacy shortfall. That can actually be a good thing, as we have to rethink aspects of our assessment practice in light of what chatbots can be used to do to circumvent assessment. Chatbots have clear implications for many of our existing assessment practices. Their use will mean that, as educators, we have to continuously get smarter at our assessment practices by, for example, creating instruments that necessitate true human interaction and production, and that can be supported by (not replaced by) chatbots.


…marks a brand-new world of chatbot-related transformations that will impact many areas of life. Teachers, students, parents, business organisations, and government departments—everywhere—will need to be cognizant of our emerging chatbot world and approach it wisely, in an informed way. There is great potential for the judicious use of chatbots to contribute positively to many aspects of our lives, not least in education.


Professor Ken Purnell is the Head of Educational Neuroscience at CQUniversity Australia. Ken focuses on translating the implications of evidence-based neuroscientific findings for education into brain-friendly classrooms.

Skye Playsted is a part-time lecturer at the University of New England and a Ph.D. researcher at the University of Queensland, in Australia. She has taught music, German, and TESOL in schools, vocational colleges, and universities for more than 20 years. Her research is in professional learning for teachers of preliterate adults who are learning English.

Professor Justin Kennedy, an adjunct at CQUniversity Australia, is an expert in organizational neuroscience. He is Professor of wellbeing neuroscience and organisational behaviour at UGSM-Monarch Business School in Hagedorn, Zug, Switzerland, and is a Ph.D. Professor of applied neuroscience with Canterbury Christ Church University as well as several other U.K. universities, including Middlesex and Chester.
