Short Note: ChatGPT and Slide Rules
Students are using ChatGPT to write or help them write school assignments. Remember slide rules being replaced by calculators? That transition offers some perspective.
A relative of mine, whom I know only through my wife’s genealogical research, posted an interesting comment on Facebook about yesterday’s article on ChatGPT and Bing. He’s a high-school teacher and says, “We’ve been having an extended conversation about ChatGPT and AI in general at my school. We’ve had dozens of students turn in papers written fully or in part by AI.” The issue, of course, is whether it is cheating for a student to “write” using ChatGPT.
This reminds me of my college days (early 1970s). My roommate’s parents gave him a TI calculator for Christmas. None of us had ever seen a calculator before.
At the time, we and many of our friends were nerdy kids taking physics and other science classes. We calculated with slide rules, and most of us were pretty quick with them.
But holy cow, that calculator was way faster than even the best of the slide rule jocks. And slide rules didn’t keep track of the decimal point, so you always had to estimate the answer to make sure you put the decimal point in the right place. The calculator handled the decimal point for you.
Only a few people had calculators, and they were expensive, so a debate ensued: Is it cheating to use a calculator while doing homework? Is it cheating to use a calculator while taking an exam?
My memory’s a little foggy, but I think the outcome was that calculators were OK for homework but not for exams. By the time I finished college, though, there was no more debate — people used calculators in all their science classes, including on exams.
By the time my kids were in high school (early 2000s), they were required to purchase and use a graphing calculator in several of their classes. They still learned about mathematical concepts and learned to graph functions, etc., by hand, but they were also expected to be able to use technology to help.
Whenever a new tool is introduced, there’s always a debate about how it should be used and even whether it should be used. You can no doubt think of many examples.
We are at that stage with ChatGPT and similar AI technologies. I would guess that in ten years students will still be expected to be able to write well but will also be expected to be able to use ChatGPT-like technology to augment their own abilities.
It’s not clear what this might mean. Telling ChatGPT to write an essay on the assigned topic and then submitting the output directly would be akin to searching with Google, finding a good essay online, and submitting it verbatim. That is plagiarism.
But “conversing” with ChatGPT (sometimes called prompt engineering) to guide its writing based on your own thoughts might become a standard way of writing more rapidly and perhaps even better. The dividing line between plagiarism and acceptable augmentation of human capability is fuzzy, and it will take time to discern.
There’s no stopping progress.