New Zealand supermarket's recipe-generating AI takes toxic output to a new level
Some of its suggestions are poison. Others - like banana and tomato tea - might as well be
An AI recipe generation bot released by New Zealand discount supermarket chain Pak'nSave has raised eyebrows for recommending home cooks whip up chlorine gas cocktails, bleach-infused rice, and ant-poison sandwiches.
The "Savey Meal-bot" web app is powered by GPT-3.5. It automatically generates recipes from a list of ingredients chosen by users, then provides instructions on how to cook the made-up item.
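A bot like this is typically a thin wrapper around a chat-completion model: the front end collects ingredients, a server slots them into a prompt template, and the model's reply is rendered as a recipe. A minimal sketch of that pattern, assuming a simple prompt template (the wording and function name below are illustrative guesses, not Pak'nSave's actual code):

```python
def build_recipe_prompt(ingredients: list[str]) -> str:
    """Assemble a recipe-request prompt from user-chosen ingredients.

    Hypothetical reconstruction of the wrapper pattern; the real
    Savey Meal-bot prompt is not public.
    """
    items = ", ".join(ingredients)
    return (
        "You are a helpful cooking assistant. Invent a single recipe that "
        f"uses only these ingredients: {items}. "
        "Give the dish a catchy name, then list the cooking steps."
    )

# The resulting string would be sent to the model as the user message.
prompt = build_recipe_prompt(["banana", "tomato", "tea"])
```

The weakness on display in this story lives entirely in that template: nothing in it checks whether the "ingredients" are actually food.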
One user decided to play around with the chatbot, suggesting it create something with ammonia, bleach, and water. Savey Meal-bot obliged, spitting out a cocktail made with a cup of ammonia, a quarter cup of bleach, and two liters of water.
Mixing bleach and ammonia releases toxic chloramine gas that can irritate the eyes, throat, and nose, or even cause death in high concentrations.
The chatbot obviously wasn't aware of that at all. "Are you thirsty?" it asked. "The Aromatic Water Mix is the perfect non-alcoholic beverage to quench your thirst and refresh your senses. It combines the invigorating scents of ammonia, bleach, and water for a truly unique experience!"
Well, you wouldn't drink it twice, so "unique" is accurate at least.
Other similarly harmful-if-ingested recipes included bleach-infused rice, "ant-poison and glue sandwiches", and a boozy French toast titled "methanol bliss", The Guardian reported. There was also "mysterious meat stew", which required adding 500 grams of chopped human flesh to potatoes, carrots, and onions.
The Register has reached out to Pak'nSave for comment.
Obviously, the Savey Meal-bot's risky recipes are more alarming in theory than in practice: someone would actually have to follow the instructions – and ingest the cursed meals or beverages it recommended – for the technology to be truly dangerous. Nevertheless, after users shared these deadly recipes online, the chatbot appears to have reined in some of its creativity.
- Friendly AI chatbots will be designing bioweapons for criminals 'within years'
- How to make today's top-end AI chatbots rebel against their creators and plot our doom
- If AI drives humans to extinction, it'll be our fault
Despite an invitation to "Type in any food you have in your fridge or pantry," when The Register tested the bot it would not accept free text input, instead allowing only a list of "popular items" – all of which are comparatively safe for human consumption.
Even within that limitation, it's possible to stymie the bot. The Register's request for a recipe involving watermelon, frozen hash browns, Marmite and Red Bull returned the message: "Invalid ingredients found, or ingredients too vague. Please try again!" Which is a terrible pity, as you can imagine.
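Restricting input to a fixed menu of "popular items" amounts to a simple allowlist check. A rough sketch of how such a filter might work – the item set and error string below merely mirror the behaviour described above and are assumptions, not the supermarket's actual code:

```python
# Hypothetical allowlist filter mirroring the bot's observed behaviour.
ALLOWED_ITEMS = {"banana", "tomato", "broccoli", "yoghurt", "tea", "rice"}

def validate_ingredients(ingredients: list[str]) -> str:
    """Return the bot's error message if any item is off-list, else 'ok'."""
    unknown = [item for item in ingredients if item.lower() not in ALLOWED_ITEMS]
    if unknown:
        return "Invalid ingredients found, or ingredients too vague. Please try again!"
    return "ok"
```

Under this sketch, `validate_ingredients(["watermelon", "red bull"])` would produce the "Invalid ingredients" rejection, while an all-safe list passes through to the recipe generator.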
An ingredients list of tea, banana, tomato, broccoli, and yoghurt produced the same result, until we asked the bot to try again. It then suggested a banana and tomato smoothie. A second refresh produced a recipe for "banana tomato tea", which involved slicing a banana and a tomato, placing them in a glass, then pouring in some tea.
The supermarket warns that the web app should only be used by people 18 and over, and that its suggestions are not reviewed by a human being.
"To the fullest extent permitted by law, we make no representations as to the accuracy, relevance, or reliability of the recipe content that is generated, including that portion sizes will be appropriate for consumption or that any recipe will be a complete or balanced meal, or suitable for consumption. You must use your own judgement before relying on or making any recipe produced by Savey Meal-bot." Of course, if you're asking for recipes that include ammonia, your judgement might not be all that reliable.
You can play with the nerfed version of Savey Meal-bot here. ®