
Students Are Using AI to Write Their Papers, Because Of Course They Are

[Image: An overhead shot of students using laptops in a library.]

innovate_rye’s professors know them as a first-year biochemistry major and an “A” student. What their professors don’t know is that they’re using a powerful AI language model to finish most of their homework assignments.

“It would be simple assignments that included extended responses,” innovate_rye, who asked to use their Reddit handle to avoid detection by their college, told Motherboard. “For biology, we would learn about biotech and write five good and bad things about biotech. I would send a prompt to the AI like, ‘what are five good and bad things about biotech?’ and it would generate an answer that would get me an A.”


Without AI, innovate_rye says the homework they consider “busywork” would take them two hours. Now homework assignments like this take them 20 minutes.

“I like to learn a lot [and] sometimes schoolwork that I have done before makes me procrastinate and not turn in the assignment,” innovate_rye explains. “Being able to do it faster and more efficient seems like a skill to me.”

innovate_rye isn’t alone. Since OpenAI unveiled the latest application programming interface (API) for its widely used language model, GPT-3, more students have begun feeding written prompts into OpenAI’s Playground and similar programs that use deep learning to generate text. The results continue the initial prompt in a natural-sounding way and often can’t be distinguished from human-written text.
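The workflow students describe amounts to a single API call: a prompt goes in, a completion comes back. Below is a minimal sketch, assuming the legacy openai Python package (pre-1.0 Completion API) and a GPT-3-era model; the model name, parameters, and environment variable are illustrative assumptions, not details confirmed by the students quoted here.

```python
# Minimal sketch of the prompt-to-text workflow described above.
# Assumes the legacy openai Python package (pre-1.0) and a GPT-3-era
# completion model; names and parameters are illustrative only.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # hypothetical env var

response = openai.Completion.create(
    model="text-davinci-002",  # GPT-3-era completion model (assumed)
    prompt="What are five good and bad things about biotech?",
    max_tokens=400,            # room for a short extended response
    temperature=0.7,           # mild randomness for natural-sounding prose
)

# The API returns a continuation of the prompt as plain text.
print(response["choices"][0]["text"].strip())
```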

When AeUsako_ was a high school senior last spring, they used OpenAI’s text generator to produce an entire essay about contemporary world affairs. They told Motherboard that, while they didn’t ace the assignment—they lost points for failing to cite outside sources—they did learn that plagiarism-checking algorithms wouldn’t flag the AI-generated text.

“Because I used Open AI I didn’t feel the constant anxiety of needing to focus all my time on writing it,” AeUsako_, who also asked to use their online pseudonym, told Motherboard.

George Veletsianos, Canada Research Chair in Innovative Learning & Technology and associate professor at Royal Roads University, says this is because text generated by systems like OpenAI’s API is technically original output, produced inside a black-box algorithm.

“[The text] is not copied from somewhere else, it’s produced by a machine, so plagiarism checking software is not going to be able to detect it and it’s not able to pick it up because the text wasn’t copied from anywhere else,” Veletsianos told Motherboard. “Without knowing how all these other plagiarism checking tools quite work and how they might be developed in the future, I don’t think that AI text can be detectable in that way.”

It’s unclear whether the companies behind the AI tools have the ability to detect or prevent students from using them to do their homework. OpenAI did not comment in time for publication.

Peter Laffin is a writing instructor and founder of the private tutoring program Crush the College Essay. He says that tools like OpenAI’s are emblematic of other compensation techniques that technology has produced in the last decade, such as cloud-based typing assistants that are meant to help struggling writers.

“In literacy education, particularly for developing writers, instructors are looking for the level of desirable difficulty, or the point at which you are working yourself just as hard so that you don’t break but you also improve,” Laffin told Motherboard. “Finding the right, appropriate level of desirable difficulty level of instruction makes their capacity to write grow. So if you are doing compensation techniques that go beyond finding that level of desirable difficulty and instructing at that place, then you’re not helping them grow as a writer.”

Veletsianos notes that it’s probable that we are past the point of no return with AI-generated text, and that students aren’t the only ones being courted.

“We can also begin to see where this technology might generate a lecture on the fly and all sorts of questions around the lecture,” he said. “I’m not saying that the system we have is the best system but I am saying these are conversations we need and ought to have to see how we can use these tools to improve not just efficiency of teaching, but its effectiveness and engagement as well.”

While Laffin acknowledges that a reevaluation of effective education is necessary, he says it can start with the kinds of prompts educators assign, noting the difference between regurgitating facts and discovering information. Still, he worries that products like OpenAI’s text generator will render essay writing moot.

“We lose the journey of learning,” said Laffin. “We might know more things but we never learned how we got there. We’ve said forever that the process is the best part and we know that. The satisfaction is the best part. That might be the thing that’s nixed from all of this. And I don’t know the kind of person that creates more than anything. Beyond academics, I don’t know what a person is like if they’ve never had to struggle through learning. I don’t know the behavioral implications of that.”

Meanwhile, innovate_rye eagerly awaits GPT-4, which is rumored to be trained on 100 trillion machine learning parameters and may go beyond purely textual outputs. In other words, they aren’t planning to stop using AI to write essays anytime soon.

“I still do my homework on things I need to learn to pass, I just use AI to handle the things I don’t want to do or find meaningless,” innovate_rye added. “If AI is able to do my homework right now, what will the future look like? These questions excite me.”