OpenAI is developing software to detect text generated by ChatGPT
Plus: Apple using fake AI voices to help indie publishers release audiobooks
In brief OpenAI is building software capable of detecting whether text was generated by its ChatGPT model, after New York City education officials announced they were blocking students from accessing the tool in public schools.
Reports of students using AI to do their homework have prompted teachers to think about how such tools affect education. Some have raised concerns that language models can plagiarize existing work or allow students to cheat. Now OpenAI is reportedly developing "mitigations" to help people detect text automatically generated by ChatGPT.
"We made ChatGPT available as a research preview to learn from real-world use, which we believe is a critical part of developing and deploying capable, safe AI systems. We are constantly incorporating feedback and lessons learned," a company spokesperson told TechCrunch.
"We've always called for transparency around the use of AI-generated text. Our policies require that users be up-front with their audience when using our API and creative tools… We look forward to working with educators on useful solutions, and other ways to help teachers and students benefit from AI."
Being able to distinguish between writing produced by a human and writing produced by a machine would change how these tools are used in academia. Schools could enforce bans on AI-generated essays more effectively, or they might be more willing to accept such papers once they can see how the tools help their students.
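OpenAI hasn't said how its detector will work. As a purely illustrative sketch, and not OpenAI's method, one common heuristic is to score how statistically predictable a passage is to an off-the-shelf language model, since machine-written text tends to be unusually predictable. The snippet below uses the open GPT-2 model via the Hugging Face transformers library for that purpose; the sample text and any interpretation of the score are assumptions for the example.

```python
# Illustrative only: flag text by how "predictable" it is to a language model.
# Low perplexity (every word is unsurprising) is a weak signal the text may
# have come from a similar model. This is NOT OpenAI's actual detection tool.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    """Return the perplexity of `text` under GPT-2."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing the input ids as labels makes the model report the
        # average negative log-likelihood of the sequence.
        loss = model(enc.input_ids, labels=enc.input_ids).loss
    return torch.exp(loss).item()

sample = "The quick brown fox jumps over the lazy dog."
print(f"perplexity: {perplexity(sample):.1f}")  # lower = more 'model-like'
```

Heuristics like this are easy to fool with light editing, which is one reason purpose-built classifiers and watermarking schemes are also being explored.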
Yes, generative language models can be good, but they don't know what they're talking about
As impressive as AI-generated writing may seem, and with academic conferences and schools banning machine-written papers, here's a reminder that the software lacks the comprehension, understanding, and critical thinking we'd like to think human writers, and people in general, possess.
When tools like GPT-3 or ChatGPT surprise us with shockingly good responses, experts say that merely proves the model can encode and regurgitate knowledge. Don't be fooled by this very artificial intelligence, argued Gary Smith, a professor of economics at Pomona College: it's just an illusion of reasoning and judgement. There's no real brainpower behind it.
In an op-ed published in Salon, he showcased a few examples where GPT-3 fails to reason and answer questions effectively.
"If you play around with GPT-3 (and I encourage you to do so) your initial response is likely to be astonishment… You seem to be having a real conversation with a very intelligent person. However, probing deeper, you will soon discover that while GPT-3 can string words together in convincing ways, it has no idea what the words mean," he wrote.
"Predicting that the word down is likely to follow the word fell does not require any understanding of what either word means – only a statistical calculation that these words often go together. Consequently, GPT-3 is prone to making authoritative statements that are utterly and completely false."
- Alphabet reshuffles to meet ChatGPT threat
- OpenAI predicts biz can break a billion in revs by 2024
- Apple taps brake on self-driving cars, now aims for 2026
- OpenAI opens doors to ChatGPT, another AI to fill the world with kinda-true stuff
Last November OpenAI released ChatGPT, a newer model designed to be an improvement on GPT-3, but it still suffers from the same issues – like all existing language models.
Apple is publishing audiobooks narrated by AI bots
Apple is seeking to partner with indie writers and publishers to help them narrate their books using voices synthesized by AI.
Authors who want to turn their work into audiobooks are advised to reach out to Draft2Digital and Ingram CoreSource, two companies that produce and publish e-books on the Apple Books app. They are only accepting submissions written in English in the romance and fiction genres; other genres are not yet supported.
"More and more book lovers are listening to audiobooks, yet only a fraction of books are converted to audio – leaving millions of titles unheard," Apple said in a blog post.
"Many authors – especially independent authors and those associated with small publishers – aren't able to create audiobooks due to the cost and complexity of production. Apple Books digital narration makes the creation of audiobooks more accessible to all, helping you meet the growing demand by making more books available for listeners to enjoy."
Compared to the robotic, tinny sounds computers used to make when mimicking humans, synthetic AI voices have vastly improved. They now sound fairly natural and are less monotonous.
The new feature will allow self-published writers to expand their audiences and give them another source of revenue. As always, Apple will take a cut of up to 30 percent of purchases made through apps available on its App Store. ®