AI is here and damn is it exciting. Maybe, like me, you’ve messed around with ChatGPT and dreamt of the implications this will have on education. When I asked it to write me a college-level history thesis outline on Ralph J. Gleason’s impact on Bay Area politics in the 1960s and 70s, it spit out a fully competent and logical outline in about 15 seconds. Wish I had that tool in 2001. Sure beats the countless hours I spent with notecards strewn on my kitchen floor, agonizing over structure and flow. And, as you’ve figured, kids are going to start using this yesterday. Tons of students are already popping prompts into ChatGPT and submitting the answers as their own. Unlike snagging pre-written essays off the internet (the modern version of copying the summary on the back of the book), ChatGPT synthesizes existing works and produces original content, meaning these pieces will pass standard algorithmic plagiarism checks. But is it plagiarism? Well…
Using GPT-3 or ChatGPT to generate text is not automatically considered plagiarism. However, it is important to properly attribute any text that is generated using these tools and to ensure that the text is not used in a way that passes it off as the original work of someone else. If you use GPT-3 or ChatGPT to generate text, it is a good idea to clearly indicate that the text was generated using these tools and to provide a link to the source. It is also important to ensure that the text is not copied and passed off as your own original work. Plagiarism is the act of using someone else’s work or ideas as your own without proper attribution. It is important to always give credit to the original source of any information or ideas that you use in your own work.*
*All essays on this subject are required to have a portion written by ChatGPT, and the above paragraph was written by the bot (in 7 seconds) in response to this prompt: “How is using ChatGPT for writing not plagiarism?” If you were reading carefully, you probably noticed how bland and slightly off this paragraph felt. That’s pretty standard for ChatGPT’s output – it typically moves from general to specific, ending with a tie-in to another general theme, along with caveats and warnings. We see that here: a succinct, general response to the prompt (not necessarily plagiarism), followed by guidelines to avoid plagiarism, then a definition of plagiarism, then a connection to an immutable law of writing (credit your sources). You’ll also notice that it doesn’t really answer the question. In this regard, it’s strikingly similar to many students’ work…! Lots of words and tangential ideas, but the actual question was never really answered. The bot told me how to avoid plagiarism, but not why using it wasn’t “automatically” considered plagiarism. It didn’t support its own thesis.
The big question for all of us is not if we should use ChatGPT, but how we should use it. Just as a calculator assists our math or a backhoe assists our digging, AI can assist our writing. It’s still a tool, though, and not a replacement for thought. For a calculator, you need to know what to ask of it and how to interpret and contextualize the answer. A backhoe needs to be told where to dig. For ChatGPT, we need to know how to prompt it, and then how to refine that prompt to get at useful information.
I told ChatGPT I was unhappy with its answer, and after an apology (always with the apologies!!) it gave me something pretty useful: “Using GPT-3 or ChatGPT to generate text is not considered plagiarism because it is not someone else’s work that is being used. These tools generate text based on the input that they receive, so the resulting text is not copied from someone else’s work.”
Ahh, ok, now we’re getting somewhere! “So it’s about copying vs not copying, yeah? Are you suggesting the work is in the prompt itself?”
“Yes, that’s correct. Using GPT-3 or ChatGPT to generate text is not considered plagiarism because it does not involve copying someone else’s work. The text that is generated using these tools is based on the input that is provided to them, which can include a prompt or other information. So, in a sense, the “work” that goes into generating the text is in the prompt or input that is given to the tool, rather than in the text itself.”
The tool does the work it’s directed to do. (Not included here in my back-and-forth are the numerous admonishments ChatGPT gave me to make sure I credit others’ work and never pass it off as my own.) Coming soon to a classroom near you: AI-assisted essays. If I were going to submit an answer to the above prompt using an AI assist, I’d probably start by mashing together the various answers I got from the AI. Something like this:
Using GPT-3 or ChatGPT to generate text is not automatically considered plagiarism, which is the act of using someone else’s work or ideas as your own without proper attribution. It’s not plagiarism because it does not involve copying someone else’s work. The text that is generated using these tools is based on the input that is provided to them, which can include a prompt or other information. So, in a sense, the “work” that goes into generating the text is in the prompt or input that is given to the tool, rather than in the text itself. However, it is important to properly attribute any text that is generated using these tools and to ensure that the text is not used in a way that passes it off as the original work of someone else.
Then I’d expand on the answer, since “plagiarism” in the strictest sense doesn’t capture the essence of the question, which is really about dishonesty. Next I’d comb through and try to put it in my voice, getting rid of the mechanical, computerese style that ChatGPT employs. For this I’d actually re-type stuff, just because that’s how I can best get a feel for the language. Many students will likely skip this stage. My final draft would be “my” work, despite my use of ChatGPT for both ideas and structure, and would read something like this:
Using ChatGPT to help with writing is never plagiarism, because that would require copying someone else’s work, and that’s not how ChatGPT functions. It’s a tool, and like all tools, it can be helpful or harmful, depending on how it’s used. A lot of the benefit of ChatGPT is in formulating coherent, well-structured answers to specific questions. This means that there has to be a clear, well-formulated question to answer. If a student uses the bot to generate ideas, examines the responses for flaws and gaps, adjusts the prompt, and then uses those responses to help structure an answer, that’s a tool well-used. If, on the other hand, the student simply types in a prompt from a teacher, copies and pastes the response, slaps their name on it and submits it, that’s academic dishonesty. It’s not plagiarism, but it’s not the student’s own work, and it defeats the point of the assignment. As AI becomes more prevalent in our lives, we’ve got to figure out what “our own work” really means. Glasses help us see, computers help us navigate and calculate. AI can help us write.