ChatGPT and Plagiarism


Plagiarism is a serious transgression for writers for the same reason joke theft is a cardinal sin for comedians - creative theft hurts.

Creative theft is so hated because it steals the fruits of our labor even more fundamentally than office politicking or project credit-taking. It erases the identity of the original author (of the joke, insight, analysis, etc.) and creates in its place a lie: that the perpetrator of the theft is the true source, and deserves the accolades, financial and social, associated with it.

Creative theft also cheats the thief - they miss the formative opportunity to make the argument that might change their own way of thinking. Skipping that paper on A Tale of Two Cities sure saves you time, but you risk impoverishing your own intellect if you never engage with great literature, and only become adept at aping (or stealing) others’ takes.

That’s all to say there are good reasons plagiarism is such a sin, and plagiarising another author can make you persona non grata at places like the New York Times and the Washington Post. It makes everyone poorer, including the plagiariser.

But - honest question: Is using ChatGPT plagiarism?

For advocates, the question is a category error - ChatGPT is a tool like any other. Is using Google Search or any of the investigative powers of the modern Internet plagiarism? ChatGPT is just another tool that will be at people’s disposal in the future (and, increasingly, the present). Using ChatGPT won’t be seen as cheating or plagiarising any more than it’s seen as cheating to use a calculator on the SAT Math section. Society has simply decided that’s an advantage you’re allowed to have. Besides, there’s no “harm” in “stealing” from ChatGPT because (Google engineers’ opinions notwithstanding) ChatGPT isn’t sentient, intelligent, or even alive.

For critics, the answer is a little murkier. Sure, maybe there’s no “there” there - ChatGPT isn’t a being that can feel slighted or suffer from being out of work, so there’s no question of “stealing” from it.

But what about the second harm - the one that befalls the thief?

I’ve seen multiple responses to professors’ fear of students turning in ChatGPT-written papers that amount to a glorified “So?” The argument goes that if ChatGPT can pass a test, then that test must now be obsolete. It’s something a tool can do now, and so no longer the domain of human intelligence (or something).

But what that take completely ignores is that writing, ultimately, is a way of structuring and communicating thought. By learning how to write in certain ways we also learn how to order our thoughts in those ways, and to build arguments out of logical contentions instead of scattershot emotional appeals. Writing is thinking, thinking is writing. The two are inextricably tied.

That means that when we cheat with ChatGPT, we cheat ourselves - in the same way it has always been cheating ourselves to take credit for another’s work and to substitute another’s reasoning for our own.

So what does this mean - should ChatGPT be given a byline or otherwise acknowledged by the author of a piece? Or should it be seen as just an uber-powerful autocorrect, not worth actually calling out? And what’s the answer to the question posed above - is using ChatGPT plagiarism?

I’m not entirely sure on any of the above counts. But I do lean towards thinking ChatGPT should be called out when used. And that we should pay attention to the harm of substituting AI-generated pseudo-thought for human thought. Writing is more than the ideas and concepts it conveys - it’s also an essential step in the art of logical thinking.

We shouldn’t cede it so easily.