University students recruit AI to write essays for them. Now what?
Teachers need to work harder to get students to write and think for themselves
Feature As word of students using AI to automatically complete essays continues to spread, some lecturers are beginning to rethink how they should teach their pupils to write.
Writing is a difficult task to do well. The best novelists and poets write furiously, dedicating their lives to mastering their craft. The creative process of stringing together words to communicate thoughts is often viewed as something complex, mysterious, and unmistakably human. No wonder people are fascinated by machines that can write too.
Unlike humans, language models don't procrastinate, and they create content almost instantly with a little guidance. All you need to do is type a short description, or prompt, instructing the model on what to produce, and it'll generate a text output in seconds. So it should come as no surprise that students are now beginning to use these tools to complete school work.
Students are the perfect users: they need to write often and in large volumes, and they are internet savvy. There are many AI-writing products to choose from, most of them easy to use and pretty cheap too. All of them lure new users with free trials, promising to make them better writers.
A subscription to the most popular platform, Jasper, costs $40 per month for 35,000 generated words. Others, like Writesonic or Sudowrite, are cheaper at $10 per month for 30,000 words. Students who think they can use these products and get away with doing zero work, however, will probably be disappointed.
And then there's ChatGPT...
Although AI can generate text with perfect spelling, grammar, and syntax, the content often isn't that good beyond a few paragraphs. The writing becomes less coherent over time, with no logical train of thought to follow. Language models also struggle to get their facts right – meaning quotes, dates, and ideas are liable to be false. Students will have to inspect the writing closely and correct mistakes for their work to be convincing.
Prof: AI-assisted essays 'not good'
Scott Graham, associate professor at the Department of Rhetoric & Writing at the University of Texas at Austin, tasked his pupils with writing a 2,200-word essay about a campus-wide issue using AI. Students were free to lightly edit and format their work with the only rule being that most of the essay had to be automatically generated by software.
In an opinion article on Inside Higher Ed, Graham said the AI-assisted essays were "not good," noting that the best of the bunch would have earned a C or C-minus grade. To score higher, students would have had to rewrite more of the essay using their own words to improve it, or craft increasingly narrower and specific prompts to get back more useful content.
"You're not going to be able to push a button or submit a short prompt and generate a ready-to-go essay," he told The Register.
The limits of machine-written text force humans to read the copy carefully and edit it. Some people may consider using these tools cheating, but Graham believes they can help people become better writers.
Don't waste all your effort on the first draft...
"I think if students can do well with AI writing, it's not actually all that different from them doing well with their own writing. The main skills I teach and assess mostly happen after the initial drafting," he said.
"I think that's where people become really talented writers; it's in the revision and the editing process. So I'm optimistic about [AI] because I think that it will provide a framework for us to be able to teach that revision and editing better.
"Some students have a lot of trouble sometimes generating that first draft. If all the effort goes into getting them to generate that first draft, and then they hit the deadline, that's what they will submit. They don't get a chance to revise, they don't get a chance to edit. If we can use those systems to speed write the first draft, it might really be helpful," he opined.
Whether students can use these tools to get away with doing less work will depend on the assignment. A biochemistry student claimed on Reddit they got an A when they used an AI model to write "five good and bad things about biotech" in an assignment, Vice reported.
AI is more likely to excel at producing simple, generic text across common templates or styles.
Listicles, informal blog posts, or news articles will be easier to imitate than niche academic papers or literary masterpieces. Teachers will need to be thoughtful about the essay questions they set, and make sure students' knowledge is really being tested, if they don't want them to cut corners.
Ask a silly question, you'll get a silly answer
"I do think it's important for us to start thinking about the ways that [AI] is changing writing and how we respond to that in our assignments -- that includes some collaboration with AI," Annette Vee, associate professor of English and director of the Composition Program at the University of Pittsburgh, told us.
"The onus now is on writing teachers to figure out how to get to the same kinds of goals that we've always had about using writing to learn. That includes students engaging with ideas, teaching them how to formulate thoughts, how to communicate clearly or creatively. I think all of those things can be done with AI systems, but they'll be done differently."
The line between using AI as a collaborative tool or a way to cheat, however, is blurry. None of the academics teaching writing who spoke to The Register thought students should be banned from using AI software. "Writing is fundamentally shaped by technology," Vee said.
"Students use spell check and grammar check. If I got a paper where a student didn't use these, it stands out. But it used to be, 50 years ago, writing teachers would complain that students didn't know how to spell so they would teach spelling. Now they don't."
Most teachers, however, told us they would support regulating the use of AI-writing software in education. Anna Mills, who teaches students how to write at a community college in the Bay Area, is part of a small group of academics beginning to rally teachers and professional organizations like the Modern Language Association into thinking about introducing new academic rules.
Critical thinking skills
Mills said she could see why students might be tempted to use AI to write their essays, and simply asking teachers to come up with more compelling assessments is not a convincing solution.
"We need policies. These tools are already pretty good now, and they're only going to get better. We need clear guidance on what's acceptable use and what's not. Where is the line between using it to automatically generate email responses and something that violates academic integrity?" she asked The Register.
"Writing is not just outputs. Writing and revising is a process that develops our thinking. If you skip that, you're going to be skipping that practice, which students need.
"It's too tempting to use it as a crutch, skip the thinking, and skip the frustrating moments of writing. Some of that is part of the process of going deeper and wrestling with ideas. There is a risk of learning loss if students become dependent and don't develop the writing skills they need."
Mills was particularly concerned about AI reducing the need for people to think for themselves, considering language models carry forward biases in their training data. "Companies have decided what to feed it and we don't know. Now, they are being used to generate all sorts of things from novels to academic papers, and they could influence our thoughts or even modify them. That is an immense power, and it's very dangerous."
Lauren Goodlad, professor of English and Comparative Literature at Rutgers University, agreed. If they parrot what AI comes up with, students may end up more likely to associate Muslims with terrorism or mention conspiracy theories, for example.
Computers are already interfering with and changing the ways we write. Goodlad referred to one incident in which Gmail suggested she change the word "importunate" to "impatient" in an email she wrote.
"It's hard to teach students how to use their own writing as a way to develop their critical thinking and as a way to express knowledge. They very badly need the practice of articulating their thoughts in writing and machines can rob them of this. If people really do end up using these things all the way through school, if that were to happen it could be a real loss not just for the writing quality but for the thinking quality of a whole generation," she said.
Rules and regulation
Academic policies tackling AI-assisted writing will be difficult to implement. Opinions are divided on whether sentences generated by machines count as plagiarism, and there is also the problem of accurately detecting writing produced by these tools. Some teachers are alarmed at AI's growing technical capabilities, whilst others believe it's overhyped. Some are embracing the technology more than others.
Marc Watkins, lecturer, and Stephen Monroe, chair and assistant professor of writing and rhetoric, are working on building an AI writing pilot programme with the University of Mississippi's Academic Innovations Group. "As teachers, we are experimenting, not panicking," Monroe told The Register.
"We want to empower our students as writers and thinkers. AI will play a role… This is a time of exciting and frenzied development, but educators move more slowly and deliberately… AI will be able to assist writers at every stage, but students and teachers will need tools that are thoughtfully calibrated."
Teachers are getting together and beginning to think about these tools, Watkins added. "Before we have any policy about the use of language models, we need to have sustained conversations with students, faculty, and administration about what this technology means for teaching and learning."
"But academia doesn't move at the pace of Big Tech. We're taking our time and slowly exploring. I don't think faculty need to be frightened. It's possible that these tools will have a positive impact on student learning and advancing equity, so let's approach AI assistants cautiously, but with an open mind."
Regardless of what policies universities may decide to implement in the future, AI presents academia with an opportunity to improve education now. Teachers will need to adapt to the technology if they want to remain relevant, and incentivise students to learn and think on their own with or without assistance from computers. ®