ChatGPT and the future of essay assignments

It's the singularity!  AI is taking over!  Or maybe not, but you should definitely put all of your money into any company that uses AI buzzwords, because that investment strategy has never gone wrong.  The markets are doing strange things, as markets are wont to do, and those prone to freakouts are freaking out about either SkyNet or, at the very least, the latest iteration of technology-will-put-everyone-out-of-work, a story humanity has been hearing since the industrial revolution, or possibly earlier.  Yet the industrial revolution was also why Malthus was wrong.  Either way, there are apocalyptic predictions, and the world just keeps not ending.  Bad shit keeps happening on smaller scales, but the world only ends once, which means my ability to grouse about failed predictions is a direct function of the failure of those predictions.  What will the newest coming thing bring us?  Fuck if I know.  For the most part.

Here's what I can tell you.  The college essay is dead, dead, dead.  Behold, a parrot with lovely plumage.

Have you played around with ChatGPT, Google's "Bard," or any of the other emerging technologies?  They serve no real function for me, but then, I don't need to write college papers anymore.  I'm on the other side.  I've been messing around with some of these programs, giving them essay prompts.  What kind of prompts?  Example:  Write a 1000-word essay on blasphemy in Salman Rushdie's The Satanic Verses.

I teach a course on banned books.

I plugged that one into Google's Bard.

Google got a B-.  However, Google was willing to revise in response to my comments, and indeed, it was more responsive to my requests for revision than students who submit drafts and then ignore my advice.

Case Western Reserve University does not have pluses or minuses on course grades, and there are freebie points for participation, so if you translate that to a course grade, you're looking at a B, at least.  Of course, I did not stop there, because I take my life lessons from Ferris Bueller.  You can never go too far.  Also, if I'm going to get caught, it is not going to be by a guy like me.  I teach about topics like political ideology, so what happens if I prompt these programs with topics like Friedman vs. Keynes?  Still in the B range.  Major topics stay solidly within the B range, and what it takes to trip up the programs right now is to ask them to write about obscure journal articles, to perform original quantitative analysis, or something like that.  As of right now, I can still stump these things.

However, I can only do so either by raising the bar (e.g. by requiring original empirical work) or by going into deep cuts.  That second trick will cease to work as the programs get better, and the first "trick" isn't a trick so much as a requirement for something other than a mere 1000-2000 word essay.  We can still require more rigorous assignments, appropriate for higher-level courses.  However, the basic college essay?  That's done.

One may note several things about ChatGPT and Bard's ability to get a B from me.  First, it really isn't that hard to get a B on a Political Science paper.  Once upon a time, I tried to defend the existence of the field, but I'm done with that.  It's a joke major.  Grades have been inflating across many disciplines, but not at the same rates, and the social sciences and humanities have seen more grade inflation than, for example, math.

Get out of that defensive crouch, social science and humanities majors.  It's true, we all know it, and we have the data to show it.

It is not that the current AI programs write startlingly brilliant essays.  Rather, the bar for a B just isn't that high.  For those faculty who are pre-tenure, As are the rule.  Why?  They need to keep every student happy, so that they max out their teaching evaluations.  Grade-inflated students are happy students, and happy students give positive teaching evaluations.  That way lies tenure.  Well, that plus some publications, soon to be written by AIs, particularly in the fields that merely require you to crank out meaningless, ideological drivel.

Remember, also, that most students across most institutions are taught by faculty who are not even tenure track.  They face even more pressure to keep the students happy by handing out As like it's a bribe.

Because it is.

Which just perpetuates the cycle of grade inflation by reinforcing students' expectations of As, and then even tenured faculty decide it's easier to give out inflated grades than deal with blubbering.

The result?  A language algorithm can spit out a paper that gets a B from a guy with two thumbs who used to be known as his department's hardass.

This guy.

Can an AI make a two-thumbs joke?  Of course.  I'm useless.

Anyway, the basic problems are as follows.  Grade inflation has been building for years, creating low academic standards.  Some asshole introduces a language algorithm that can write a paper that gets Bs, given those standards, and what are we, professors, to do?

Or K-12 teachers?

This kills the essay.  I'm not sure I see a way around it, and I am not happy about it, but I do not see the essay surviving.

Consider the calculator.  Once upon a time, not too long ago, calculators were humans.  It was a job.  The job of calculating.  Then electronic calculators became affordable.  Did that end arithmetic education?  No.  Why not?  "Show your work."  Combine that with in-class tests to create the incentive, and the calculator did not end arithmetic education.  Adults seem to lose their ability to do arithmetic, which means they cannot balance their checkbooks, but that's as much irresponsibility as diminished capacity.  Still, the educational process survived.

This is different.  If I tell my students to write a paper about polarization in Congress, and a bunch of them decide that they are too busy with their drinking assignments from their drinking professors, who have assigned them so much drinking that they cannot possibly get through all of their drinking and still write my silly little paper, what happens?  They will drunkenly type CHERTGRPOT but eventually find their way to ChatGPT, and after a few miscommunications, the bot will spit out a B-level paper.

And I cannot simply demand that the students "show their work."  There is no equivalent instruction for a paper assignment.  When it comes to research papers based on quantitative analysis, there will be a data set, output from the statistics program, and other materials.  That would be the work, shown.

An essay?  That is both the work, shown, and the end product.  Who could ask for anything more?

Well, I could.  I could demand that the students write not 1000-word essays but full research papers.  That seems to make people really unhappy, though, and with increasing numbers of students taught by faculty off the tenure track-- remember the As-for-evals bribe-- it's not a system-level fix.

And how long will it be before ChatGPT can download the National Election Studies data set, regress Y on X, and write a bonehead paper on it?

With grade inflation, it'll still get a B, and I can't stop it.
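If you're wondering what the quantitative core of that bonehead paper even looks like, here is a minimal sketch in Python, assuming the data set has already been downloaded.  The file name and variable names are placeholders I made up for illustration, not actual ANES codebook entries.

```python
# A minimal sketch, not a real workflow: assumes an ANES extract has already
# been downloaded and saved as "anes_extract.csv". The file name and the
# variable names below are hypothetical placeholders, not real codebook entries.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("anes_extract.csv")

# "Regress Y on X": a feeling-thermometer score on party identification,
# with a couple of throwaway controls.
model = smf.ols("therm_opponents ~ party_id + age + education", data=df).fit()

# The "work, shown": the regression table a student would paste into the paper.
print(model.summary())
```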

Am I worried about ChatGPT taking my job?  If universities figure out how to break tenure, then yes, but then again, if bots can be professors, then why bother with college?  At that point, are we all fucked?

Stop it.  We're not doing a futurism-freakout.  Technology never seems to work that way.  Factories and automation didn't create mass poverty; they just shifted the jobs, as an example.  Anyone telling you what these programs will do to the economy or the structure of society is bullshitting you.  We don't know, so calm the fuck down.

But as of right now, these things can write B-level papers, and professors are acting like nothing is happening.  From now on, I'm just going to assume that some proportion of whatever gets submitted to me was written by an AI because the students were lazy.  Which ones?  I won't know.  That's not quite the Turing test, but it's getting there.

So here's the big question.  Why bother?  Why bother teaching writing, or learning to write?

My advice to students has always been that they need to develop marketable skills.  Their major is less important than the skills they develop.  That has meant math, writing, or sector-specific technical knowledge.  Writing is going the way of arithmetic.  Math still matters, in that your ability to comprehend calculus, differential equations, linear algebra (which isn't what you think it is), and such will help you get a job.  Learning statistics will help get you a job.  Being able to crunch numbers in your head?  You might get a job as a "mathemagician" at a kids' party, but that's about it.

Will writing still be a marketable skill?

Not at the level that it is now.

Right now, people are having stupid debates about if/when AIs will be able to create good-to-great art.  You can tell an AI to compose art within specific parameters, and your role is the construction of the parameters.  How good are the products?  Right now, they suck.  (Remember, a B on a college paper isn't really that hard to get.)  Will there ever be a program that can approximate the infinite number of monkeys and infinite number of typewriters required to produce Hamlet?

I am skeptical, but I rule out nothing.  Here is the basic issue, as I see it.  The way the algorithms work right now is that they scour existing materials and perform "I'm not touching you" plagiarism.  That's my terminology.

If you grew up with siblings, you recognize the phrase, "I'm not touching you."  It comes from being in the back seat of a car, on a long car trip.

Stop touching me!

...

...

[Puts finger as close as possible without contact.]

I'm not touching you!  [in singsong voice]

"I'm not touching you" plagiarism occurs when a student finds existing materials and alters the wording just enough to avoid a plagiarism charge.  According to the technical rules of the academic integrity code, that's still plagiarism, but the AIs are more sophisticated because they can scour more sources and recombine them in different ways.  When students do it, they usually use techniques like what I call "the ole' comma switcharoo."  That refers to the technique of taking a sentence with multiple clauses and changing the order of the passages separated by commas.  That way, it is no longer an exact quote.  The student then claims it as an original statement.  The ole' comma switcharoo!
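If you want to see just how cheap the trick is, here is a toy sketch in Python.  It is my own reconstruction for illustration; the example sentence is made up, and a real student would touch up the capitalization by hand.

```python
# A toy reconstruction of "the ole' comma switcharoo," for illustration only:
# split a multi-clause sentence on its commas, rotate the clauses, and rejoin,
# so the result no longer matches the source verbatim.
def comma_switcharoo(sentence: str) -> str:
    clauses = [c.strip() for c in sentence.rstrip(".").split(",")]
    reordered = clauses[1:] + clauses[:1]  # move the first clause to the end
    return ", ".join(reordered) + "."

# A made-up source sentence; the student then claims the result as original.
source = "The bill passed the Senate, over the objections of the minority, after weeks of negotiation."
print(comma_switcharoo(source))
# -> "over the objections of the minority, after weeks of negotiation, The bill passed the Senate."
```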

The thing about the ole' comma switcharoo is that it is a play on one source.  The AIs can take and recombine materials from far more sources, making them harder to catch with plagiarism checkers.

(So we're really fucked.)

But you know what they are not doing?  Anything original.

For most people, and most purposes, that's fine.  For most businesses, this will work.

All the entertainment industry does is remake last year's remake anyway, so who cares if AIs put those hack writers out of business?  Fuck 'em.

I have nothing but contempt for pop music, and in a few years, the AIs will be writing all pop music.  You won't notice, and it's all just background for underwear models anyway, so again, who cares?

What can an "I'm not touching you" plagiarism algorithm not do?  Anything original.  I have been gushing lately about S.A. Chakraborty's Daevabad Trilogy, which is fucking good.  There are interesting and original ideas in it.  An AI Chakraborty?  An AI Stanislaw Lem (who prompted me with this concept a few weeks back), an AI Neal Stephenson, an AI Fyodor Dostoevsky, an AI Shakespeare, an AI Thomas Pynchon, an AI Salman Rushdie...

I won't hold my breath waiting for that, as I think about the AI John Keats in Dan Simmons's Hyperion.  Great book, by the way.

There is AI-generated music.  There are YouTube musicians playing around with it.  When will there be the AI Miles Davis?

Of course, originality is a high bar, and most writing in the economy need not be that.

So why bother?

Writing is the best way to think.  Writing forces you to put disordered thoughts into linear form.  It thereby forces you to confront gaps and fallacies.  Sure, if your goal is to blather your way to an A in the grade-inflated college system, you can do that, but what if your goal is actually something else?  Like solving a problem?

You write.  You write because the application of the written form orders your ideas.

I cannot make you do this.  I cannot make my students do this.  Not anymore.  I am concerned about a bifurcation to come, one that will be both self-imposed and self-reinforcing in some potentially troubling ways.

Those who are self-motivated always have done better at everything, and always will.  Whine to me about unfairness and inherited privilege and all that.  I don't care, or at least, not much, because the real question is this.  What are you going to do?

Why are Americans fat and lazy?  It's the Heathers answer.  Because we can be.  However, it is individually and socially destructive.  You know who does better?  Those who are sufficiently self-motivated.

Yes, I can still do arithmetic in my head.  Why?  Because I still bother.  I can still look at maps and find my way around.  Why?  Because I never used GPS.  (No, I am not pulling over to ask my iPhone for directions!  That's not how you fold an iPhone!  Gimme that!)

I could have a chatbot write posts on this blog, and no one would notice because no one reads it except CWRU lawyers trying to find something to use (go play in a toxic waste dump, kids), so why bother?

The point is to think.  Those who choose to write will be those who think more clearly.

There are obviously people who write and have no clarity of thought.  Yet find me people who cannot write but still think coherently.

That is a more difficult task.

Socrates, of course, warned that the written word would dull the mind by diminishing memory, and he was almost certainly right about that.  He was also a very smart guy, although one might question his taste in beverages.

Coffee, sir.  Coffee.

Yet Socrates was also wrong.  We gain something through the act of writing.  I cannot make anyone take advantage of the gift of writing.  Until now, I have been able to structure incentives in such a way as to reward the hard thing, for those not otherwise inclined towards it.  No more.  The gift remains for anyone inclined to take it.

Also, EVERYONE FREAK OUT WE'RE ALL GONNA DIE BECAUSE SKYNET IS COMING AHHHHH!!!!!!!!!!!

Chris Whitley, "New Machine."  Here's a live version.  The studio version is from Din of Ecstasy.

