Struck by how some of the folks defending ChatGPT don't seem to recognize which parts of the writing process are the useful, important ones, and also have very little sense of how those skills are taught.
"AI can incompetently perform the important parts" — not a strong sell!
A lot of it is missing that the process of outlining and organizing is a key part of the analysis, and that the analysis, the thinking behind the writing, is actually what is valuable in a written product.
The goal is to think and then communicate thoughts.
ChatGPT cannot think or analyze; the reason it "hallucinates" is that it (and I keep stressing this) doesn't know anything about its topic and is just stringing words together in plausible arrangements.
"The AI lets me write something without thinking about it" (by say, feeding it some facts and a predetermined conclusion, letting it sort out the rest) defeats the purpose of writing analysis where someone is supposed to have thought about something.
Ask yourself: how valuable to you is an essay written by an idiot who knows nothing about a topic and has given it no thought at all?
Now, how about the same but by an expert who knows many things about it and has spent time thinking hard about it?
ChatGPT is the first one.
That first essay is worse than useless: at best it wastes your time, and at worst it misleads you.
The second one is valuable (if you are interested in the topic) because it may inform you. But everything that makes it valuable is precisely what ChatGPT is incapable of doing.
The process of organizing those facts, working out the smaller conclusions they support and how those conclusions build to a final, useful one: that is where the actual work of analytical thinking happens.
The writing process is a vehicle for it.
But ChatGPT cannot do that thinking — it doesn't have the capability to do so. Maybe some future AI will be able to, but not this one. The "low reliability" of its answers isn't a small problem; it isn't a bad thinking engine, but a babble engine that cannot think at all.
Meanwhile, the idea that ChatGPT can replace the essay confuses all of this. In published form, the purpose of the essay is to convey the thoughts that a thinking person has, clearly and persuasively.
ChatGPT can produce "copy"; it can churn out lots of low-quality "content". But it cannot analyze or explain, because those tasks are all about communicating the thoughts of a mind it does not have.
That some journalists think this is valuable might tell us something about their process.
If I ask ChatGPT to do, say, @KofmanMichael's job and explain the Ukraine War to me, it will babble confidently. But because that babble is not and cannot be based on any knowledge or analysis of the topic, it is entirely worthless except in its power to deceive.
Alternatively, we use essays to train these skills: analysis, organization and communication. ChatGPT replaces none of these, save perhaps the last, and only in trivial ways.
When I assign an essay in a class, it isn't because I want the "copy"/"content" — that's actually almost wholly worthless to me. If my goal was to get One Unit of Essay, I could write a far better one than my students do and in far less time than it takes to sort/grade them.
The essay is a learning exercise. By analogy, I am asking my students to forge some nails not because I want nails — their nails will mostly be quite poor — but because I want them to learn how to smith things.
I want them to learn how to analyze, organize and communicate.
If the student uses ChatGPT to automate their analysis, well, it can't do that, so the result is bad. Likewise for organization: it doesn't understand any of these concepts, so it doesn't know how they fit together, and beyond rote formula it organizes badly.
That leaves communication, which is something it can sort of do. But here's the thing: students will need, in a variety of contexts and genres throughout their lives, to be able to communicate the ideas they have clearly and convincingly, based on sound reasoning.
They cannot rely on a chatbot to do that for them in all cases, because some of those cases are going to be effectively live: a conversation, an interview, a meeting, a talk delivered from notes, a Q&A.
If they've learned with a crutch, like taking a ChatGPT product, editing it into a less insane shape and fixing all of its facts, they're not going to have learned the communication skills needed to frame an entire response live.
So a ChatGPT-assisted assignment wouldn't teach the things I want my assignments to teach, because, again, they exist to teach, not because I want One Unit of Essay as some kind of final product.
(Grades, by the by, are signals, not products either.)
Now I can imagine more tailored chatbots still being useful for producing some things: highly formulaic documents, for instance. But as for making a machine that can analyze, understand its material and output an idea from that understanding, ChatGPT doesn't even attempt that.
So many of the defenders of this technology see a carriage rolling downhill and declare that the creators have invented the automobile.
There is no engine in there, and the engine is the hard part. ChatGPT does the easy part … badly … and the hard part not at all.