Imagine you had an app on your phone that could do your homework for you. I don’t mean it would help you with your homework. I mean it would do it all for you. For example, if your teacher asks you to find out about climate change in Australia and write a 3-page report about it, you type into the app “Write a 3-page report about climate change in Australia,” and a few seconds later your report is written and ready to send to the teacher. Wouldn’t that be wonderful?
Well, maybe not exactly wonderful. You would miss the opportunity to learn by doing the research and writing the report, but it would certainly save you some time and energy. From the teacher’s point of view, it would certainly not be wonderful: they wanted you to learn by doing the assignment, and they also wanted to look at your writing and see what you are good at and what you still need to study more—but you didn’t write that report, so it isn’t your writing!
That’s why some teachers are very worried about a new software service called “ChatGPT.” It is free* to use and it does just what I said in my example: it will produce a report on any topic in any style (in English) in just a few seconds.
Some people say that ChatGPT is the latest kind of Artificial Intelligence that could take over more and more of the things that humans have used their intelligence to do until now. In fact, though, ChatGPT is not intelligent at all. It is just very, very good at doing one thing: it can predict what word (in a sentence or a report) is likely to come next. You can do this, too! Look at the beginning of this sentence:
She went into the room and closed the ____.
What word do you think will come next? If you think it is “door,” your answer is very likely to be correct. “Window” is a likely answer, too, but maybe not as likely as “door.” “Store” is even less likely, and “elephant” is probably impossible. So, you not only know what words are possible, you also know how likely those words are. This is exactly what ChatGPT knows.
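The idea of "knowing how likely each next word is" can be sketched in a few lines of code. This is only a toy illustration: the probabilities below are made up by hand for our example sentence, whereas a real system like ChatGPT learns numbers like these from enormous amounts of text.

```python
# A toy sketch of next-word prediction. The probabilities are invented
# for the sentence "She went into the room and closed the ____".
next_word_probabilities = {
    "door": 0.70,      # very likely
    "window": 0.20,    # possible, but less likely
    "store": 0.05,     # unlikely
    "elephant": 0.00,  # (almost) impossible
}

# Pick the word with the highest probability, as the article describes.
best_word = max(next_word_probabilities, key=next_word_probabilities.get)
print(best_word)  # prints "door"
```

All the "prediction" here is simply choosing the word with the biggest number, which is essentially what happens, one word at a time, when a report gets written for you.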
How does ChatGPT know this? In the same way you know it: you and ChatGPT have both had a chance to read (and hear) sentences like:
She went into the room and closed the ____.
and you have had chances to see how these sentences end. Your knowledge about what word can come next comes from your experience of these sentences. It’s the same for ChatGPT: it has had a chance to experience millions and millions of sentences. Where do these sentences come from? From the Internet, of course. Millions of sentences on thousands of topics, already in electronic form, so ChatGPT can “read” them. Just like you, it uses this knowledge to predict what word is likely to come next, and how likely it is. That’s the one thing ChatGPT is good at. That’s how it can write reports for you: by knowing what word will probably come next.
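Where do those probabilities come from? One simple way to get them is just to count. Here is a toy sketch with a made-up three-sentence "Internet". (ChatGPT does not literally count like this; it uses a neural network. But the basic idea, turning experience of many sentences into predictions, is the same.)

```python
# A toy sketch: learn next-word probabilities by counting what follows
# the phrase "closed the" in a tiny, made-up corpus.
from collections import Counter

tiny_corpus = [
    "she went into the room and closed the door",
    "he walked into the room and closed the door",
    "she came into the room and closed the window",
]

counts = Counter()
for sentence in tiny_corpus:
    words = sentence.split()
    for i in range(len(words) - 2):
        if words[i] == "closed" and words[i + 1] == "the":
            counts[words[i + 2]] += 1

total = sum(counts.values())
probabilities = {word: count / total for word, count in counts.items()}
print(probabilities)  # "door" appeared twice, "window" once
```

With only three sentences of experience, the predictions are rough; with millions of sentences, they become remarkably good, and that is the whole trick.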
The difference between you and ChatGPT is that, as well as having experience of words, you also have experience of seeing people going into rooms and closing something. ChatGPT does not. It does not know what a room is, or a door, or closing. It doesn’t know what climate change is either, or what or where Australia is. It only knows what words are likely to be used, in what order, when people write about the topic of climate change in Australia. That’s why I say it is not intelligent: it doesn’t know anything except the words.
People around the world are beginning to imagine ways they can use ChatGPT to help them with their work. Some teachers are finding ways to use it in their classrooms. Some students are beginning to think it might be good to have ChatGPT do their writing assignments for them. But is it really a good idea? If ChatGPT writes your 3-page report, you still won’t know anything about climate change in Australia. In fact, because it only knows the words, it might make crazy mistakes that people who know about Australia and climate change would never even think of writing, like “If Australia continues to get hotter, it might start a fire and burn its neighbours’ house.” If you use ChatGPT, your teacher might think that your English writing suddenly got better, but it hasn’t got better at all. We need to think very carefully about ways that software like ChatGPT can help us.
There is one interesting point, though: the way ChatGPT learns about words is the same way that we think humans learn, too. Both use their experience to make predictions about what is likely to happen next. More experience means better predictions. You can use this idea when you think about how you learn English: the more English you experience, the better your predictions about what to say or write next will be. Time to start reading for your report!
*You might want to think about why it is free, although it clearly cost a lot of money to make this software. As usual with free Internet software, it is probably because the owners of ChatGPT want to use the information you give them.