First, he says that through judicious use of ChatGPT, he's been able to unlock "skills and opportunities that might not have been possible without it." He quickly learned that when he encountered a gap in his programming knowledge, ChatGPT was a fast way to get hints on how to fill it, and those hints prompted him to learn more on his own. Because the suggestions worked in tandem with the knowledge and experience he already had as a programmer, he could see how and why ChatGPT's suggestions worked.
So far, a good use of AI.
He goes on to say: "Where I think it is not as healthy is where you are relying on it to substitute for large gaps in knowledge."
He illustrates these "traps" as follows:
When confronted with a programming problem he had some familiarity with, but not nearly enough knowledge to tackle, he presented it to ChatGPT -- which provided a solution that didn't work. When he pressed ChatGPT on the failure, the AI suggested a different approach that would have required sweeping changes to the overall program and customer interface.
It was then he realized he had more context for the problem, and for the impact of the suggested solutions, than ChatGPT did. He set to work without AI and found a solution that worked without the sweeping changes ChatGPT had suggested. When he told ChatGPT about the solution he came up with and asked why the AI hadn't proposed it: "It said essentially that my approach is tailored to this specific situation, whereas ChatGPT was generally going to go for a more general solution."
"Having context of the problem and the system in general allows you to search for solutions where a coding assistant might miss them," he adds.
His experience applies to the use of AI in English composition. We already have artificial intelligence assistants in place (spellcheck, Grammarly, predictive text) that can help us fill in the little gaps. Often enough, as we use these assistants, we decide it's easier to learn the logic behind the suggestions and get things right the first time. I used to misspell "antenna" on a consistent basis, with spellcheck helping me out. It wasn't until I wrote the word out a hundred times longhand that I trained my brain to spell it correctly. So using these simple AI tools to fill the little gaps is useful, and it leads us to learn so we stop making the mistakes in the first place.
When it comes to using AI to write outlines or to help brainstorm, I see that as an acceptable application. We still have to check the outline against the assignment requirements, or weigh the brainstormed suggestions to see what makes sense in context. But when we or our students rely on artificial intelligence to bridge bigger gaps in our knowledge, we see ChatGPT and the like stumble, and often stumble bigly. Even worse -- relying on AI to fill those larger writing gaps robs us of the more elegant solutions we could devise if we put in the effort ourselves. That means researching, vetting the research, brainstorming, outlining, writing, and revising on our own. That can take a lot of time, and maybe we don't have the time. But if we rely on AI as a time boost, we lose in the learning department: we don't become better writers, the kind who grow familiar with common pitfalls, problems, and solutions, and can work through them on our own.
Kudos to YouTuber Joshua Morony for this content, and for the inspiration as I look to fill my tiny knowledge gaps while trying to help my students -- and myself -- avoid those big traps.