Mar 6, 2023 · Liked by Lee R. Nackman

And ChatGPT responded the following to my essay (sounds like she agrees with me :-) ):

You make several interesting points regarding the use of technology in education, specifically in relation to ChatGPT. It's true that the context in which technology is being considered for use is important to understand, and that different subjects or assignments may call for different approaches.

Regarding the use of calculators in math, it's true that students need to learn arithmetic and practice using it. However, as they progress in their studies, they may encounter more complex problems that require the use of a calculator. Similarly, in writing, students need to learn the elements of composition, but they may also benefit from tools that help them refine their writing skills, such as grammar checkers or thesauruses.

Regarding ChatGPT, you raise an important question about what the program "knows." While it's true that ChatGPT has an impressive facility for language understanding and production, it's also true that its knowledge is largely limited to the English language. Additionally, while it may possess some degree of "world knowledge," this is likely based on patterns it has learned from large amounts of text data rather than any real understanding of the world.

As for the question of whether the program is doing some of the "thinking" in college essays, it's a valid concern. While the student may be coming up with the ideas and arguments, ChatGPT may be assisting with the actual phrasing and sentence structure. However, this is not necessarily a bad thing if the goal is to produce well-written essays that effectively communicate the student's ideas.

In conclusion, the use of technology in education is a complex issue that requires careful consideration of the context in which it is being used. While tools like ChatGPT can be incredibly useful in certain situations, it's important to be aware of their limitations and potential drawbacks. Ultimately, the goal should be to use technology in a way that enhances the learning experience for students and helps them achieve their educational goals.

Mar 6, 2023 · Liked by Lee R. Nackman

I think it's important to understand the context in which technology (whether slide rule, calculator, or ChatGPT) is being considered for use. Let's take the calculator as an example. If you believe that it's important for kids to learn arithmetic, you teach the rules of arithmetic to them and then have them practice their use through homework and tests. In such a setting, I feel it would be inappropriate for a kid to use a calculator on their homework and tests, since it obviates the need to learn and practice arithmetic. On the other hand, if you're teaching kids physics, where you present facts and laws and they practice with homework and exams, I think it's OK to use calculators, since their capability (arithmetic, etc.) doesn't clearly overlap with, say, F = ma.

Now, by this reasoning, if you want kids to learn expository writing (as opposed to literature), then you teach them the elements of composition and they practice with essays. In this case (say, high school English), using ChatGPT would, again, obviate the purpose of the teaching, so to my mind, it shouldn't be allowed. On the other hand, if you're talking about using writing skills to convey information in some non-composition subject (say, an essay on why Putin's invasion of Ukraine is a violation of international norms), then the student's product is ideas about political science, economics, ethics, etc., and I'm less inclined to object to a tool that helps map those ideas into verbal form.

But... Let's talk about ChatGPT. Is it in fact a tool that maps ideas into verbal form? Well, yes, it is that, but I think it's more, too. I believe that ChatGPT and other Generative Pretrained Transformers (hence, GPT) have incorporated (through learning) an amazing facility for language understanding and production, in terms of spelling, grammar, semantic constraints, etc. But is that all? This point has really been itching me since I learned about ChatGPT: what does the program actually "know"? As I said, it really knows language (English only?), but does it really know anything else, what I'd call "world knowledge"? It sure seems to, as suggested by ChatGPT's understanding of clouds (and more generally, non-solid objects), and I'd suggest that it would be very difficult to teach *just* the language part without imparting some world knowledge in the process. Then, there's a third *possible* layer of understanding, which is synthesizing solutions to problems, what I'd call "general reasoning". Again, it's hard to separate this from the other two layers of knowledge that I would grant ChatGPT possesses, but ChatGPT's ability to write simple script programs seems to embody such a synthesis process (full disclosure: this is a programmer trying to justify his skills). However, I think we're an awfully long way from Artificial General Intelligence, which is the point where I think we have to start worrying a little bit.

I think the balance among these various layers (presumably changing even as we write) makes the use of ChatGPT for, say, writing college essays, *somewhat* uncomfortable for me. The question is: is the student coming up with the ideas and arguments while the program is "rendering", or is the program doing some of the "thinking", too? On the other hand, for topics where the "world knowledge" and "general reasoning" are pretty tough, like technical papers and instruction manuals, I'd be pretty strongly in favor of using ChatGPT, given the poor quality of language (but excellent world knowledge) typical of such writing.


Excellent analogy, Lee. I appreciate being challenged to consider things from a new perspective.
