NOTE: This is an announcement I shared with my online English students, just so you know the audience.
I'm sure most of you have heard talk of ChatGPT, which in its own words describes itself thus:
"ChatGPT is an AI language model developed by OpenAI, which is capable of generating human-like text based on the input it is given. The model is trained on a large corpus of text data and can generate responses to questions, summarize long texts, write stories and much more. It is often used in conversational AI applications to simulate a human-like conversation with users."
In other words (and per the Mashable story linked above), it's an online service that we can use to write love letters, help debug computer code, and even write essays.
I've played around with it a bit myself, and have used it to write an unsourced informative essay, some rather convincing poems in the style of e.e. cummings, and a rather terrible poem in the style of Edgar Lee Masters. I've also tried some of the new services popping up to help people detect text written by artificial intelligence.
I'm also aware of a local professor who has had students write an essay using ChatGPT and then develop strategies to evade the services meant to distinguish "machine-written" language from "human-written" language.
A few of you have sent me articles about ChatGPT, so I'd be a fool to think the majority of you haven't heard of it -- it's been given a fair amount of press lately.
What do I *think* about ChatGPT, and its potential uses, inside and outside the classroom?
I'll be honest, the jury is still out.
The knee-jerk reaction is to say using such services is cheating. And I suppose, in a way, it is. I was able to generate the essay linked above in only about twenty minutes. As I explain in my blog post, it's not perfect. But, by Grabthar's hammer, what a time savings.
Similar debates about "cheating" cropped up when the calculator was invented, and were revived when graphing calculators entered the scene, making it much easier for morons like me to "do math," even though that's something I avoid doing as much as humanly possible.
Do I want to see machine-generated text pop up in class?
Probably the wrong question.
Have I already seen machine-generated text pop up in class and missed it?
That's a better question.
Understand I'm not lobbing accusations at anyone in this course, and I am not running your text through AI-detection websites. Artificially created writing could actually be an important step forward in many industries, if we learn how to use and master it correctly. I've read enough poorly written instructions to know that if there's a better way to produce clear instructions, we ought to be taking advantage of the technological developments coming our way.
But let me share this:
In 2016, comedienne Carol Burnett published "In Such Good Company," a memoir of her eleven years on The Carol Burnett Show, a comedy-music variety show that Americans of a certain age are likely familiar with. She records a conversation with Larry Gelbart, a well-known television writer of the same vintage:
Burnett: I don't know, but when I watch a comedy show on TV today, I know exactly what's coming as far as the writing goes. No surprises. No originality. Usually, it's the 'setup' first, and then comes the obvious joke, and then you hear that awful laugh track. It's as if all the shows are alike and repeating themselves.
Gelbart: I think it's because most of the writers today grew up watching television. That was their childhood, so they're writing about life once removed.
Burnett: What do you mean?
Gelbart: They never played stickball in the street.
Calculators are great for getting the correct answer. With one in hand, I can get the correct answer almost every time. But the Cs and Ds I got in algebra tell the real story: I remain terrible at math because I never had the discipline to sit down and practice running through equations enough times to really understand what I was doing.
ChatGPT may indeed be great for generating text, and as the artificial intelligence learns, that text will likely be good enough to fool us poor English instructors into thinking our students are brilliant (and, based on your change essays, you are that). But such tools, as Gelbart says, represent learning the art of writing "once removed." If we allow ChatGPT to produce words for us without pausing long enough to consider how the words were put together and how we could put them together on our own, we are writing "once removed." We're not learning a skill. If we don't sit down with a blank page, with our research by our side, with ideas bubbling in our heads, waiting to come out onto paper or the screen, we're not playing stickball in the street.
ChatGPT is a tool, and one that should not be dismissed as useless or morally suspect. But at the same time, it should not become a crutch, a substitute for learning.