
OpenAI rolls out ChatGPT plugins, granting iffy language model access to your apps

Search aspired to be the command line to the world, but ML models may get there first

Analysis OpenAI this week introduced ChatGPT plugins, a way to extend the scope of its chatbot language model beyond the slurry of internet training data to bespoke business information.

So wary is OpenAI of all the ways that ChatGPT and its other models can misfire that the company begins its announcement by reassuring readers that its cautious rollout follows from its desire to address "safety and alignment challenges."

It does so with good reason – large language models (LLMs), referred to euphemistically as artificial intelligence or just AI, are seen by some as venomous constructs that must be contained.

LLMs are also limited to whatever information can be accessed or derived from their training data. As OpenAI puts it, "This information can be out-of-date and is one-size fits all across applications. Furthermore, the only thing language models can do out-of-the-box is emit text. This text can contain useful instructions, but to actually follow these instructions you need another process."

This other process consists of third-party applications and services. Presently, the following companies are testing the waters of this ecosystem: Expedia, FiscalNote, Instacart, Kayak, Klarna, Milo, OpenTable, Shopify, Slack, Speak, Wolfram, and Zapier. More can be expected to follow as OpenAI expands access to its plugin program.

"Though not a perfect analogy, plugins can be 'eyes and ears' for language models, giving them access to information that is too recent, too personal, or too specific to be included in the training data," OpenAI explained.

"In response to a user’s explicit request, plugins can also enable language models to perform safe, constrained actions on their behalf, increasing the usefulness of the system overall."

You can rest assured these actions will be safe because OpenAI's post contains more than 20 instances of "safe" or "safety." Repetition is the new compliance.

It's worth noting that the safety testing of GPT-4 [PDF] contemplated a sort of plugin scenario in which the model tried to convince a TaskRabbit worker to solve a CAPTCHA puzzle on the model's behalf.

In practical terms, OpenAI plugins allow people to enter text commands, via typing or speech recognition, and have ChatGPT formulate a response using data from third-party services. If this can be done accurately and quickly, without excessive cost, OpenAI may have found the successor to traditional web search.

Here's the example prompt from the announcement:

Looking to eat vegan food in San Francisco this weekend. Could you get me one great restaurant suggestion for Saturday and a simple recipe for Sunday (just the ingredients)? Please calculate the calories for the recipe using Wolfram Alpha. Finally order the ingredients on InstaCart.

ChatGPT in this scenario has access to plugins from OpenTable, a restaurant reservation service, computational knowledge engine Wolfram Alpha, and Instacart, a retail delivery service. And, using a recipe from an undisclosed source (presumably not statistically derived), the chat model can chain together the steps in the compound prompt by making a reservation, fetching a recipe, calculating the calorie count, and placing an order for the ingredients.
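To make the mechanics concrete, here's a rough Python sketch of how such a compound prompt might decompose into chained plugin-style calls. It is illustrative only: the function names and stubbed responses are invented for this piece, not OpenAI's or its partners' actual plugin APIs, which the model invokes via each service's own HTTP interface.

```python
# Illustrative sketch of chaining plugin-style tool calls for the compound
# prompt above. The "plugins" here are hand-written stubs invented for this
# article; real ChatGPT plugins are called by the model through each
# service's own API, not orchestrated by Python like this.

def opentable_suggest(city: str, diet: str) -> str:
    """Stub standing in for a restaurant-suggestion plugin."""
    return f"A well-reviewed {diet} restaurant in {city} (placeholder)"

def fetch_recipe(diet: str) -> list[str]:
    """Stub standing in for whatever source supplies the recipe."""
    return ["chickpeas", "spinach", "coconut milk", "rice"]

def wolfram_calories(ingredients: list[str]) -> str:
    """Stub standing in for a Wolfram Alpha calorie query."""
    return f"Estimated calories for: {', '.join(ingredients)} (placeholder)"

def instacart_order(ingredients: list[str]) -> str:
    """Stub standing in for an Instacart ordering plugin."""
    return f"Cart created with {len(ingredients)} items (placeholder)"

def handle_weekend_prompt(city: str = "San Francisco") -> dict:
    suggestion = opentable_suggest(city, diet="vegan")   # Saturday dinner
    ingredients = fetch_recipe(diet="vegan")             # Sunday recipe
    calories = wolfram_calories(ingredients)             # calorie count
    order = instacart_order(ingredients)                 # grocery order
    return {"restaurant": suggestion, "ingredients": ingredients,
            "calories": calories, "order": order}

if __name__ == "__main__":
    print(handle_weekend_prompt())
```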

Back in 2006, the year AWS was founded and the cloud era arguably began, Google executives described the company's on-premises enterprise search capability as an "über-command-line interface to the world," based on its hardware's ability to provide access to information beyond indexed corporate documents, like shipment tracking data and weather information.

OpenAI has now realized a more user-friendly prototype for that vision – it's offering a command line, but not just for those who have memorized obscure command sequences and flags or search operators. Under this new regime, internet users of varied technical abilities can carry out queries without search keyword trial and error, limited more by the participation of third-party partners than by their own familiarity with search syntax.

Google will certainly have noted that OpenAI created a browser plugin that makes GET requests to the Bing search API. "This scopes the browsing plugin to be useful for retrieving information, but excludes 'transactional' operations such as form submission which have more surface area for security and safety issues," says OpenAI in its post.
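For a sense of what that retrieval-only scoping means in practice, here's a minimal, hypothetical sketch: a fetch helper that permits GET requests and refuses anything transactional. The endpoint handling and parameter names are placeholders, not the actual browsing plugin or Bing API contract.

```python
# Sketch of a retrieval-only fetch wrapper, in the spirit of OpenAI's
# description of its browsing plugin: GET requests are allowed, anything
# "transactional" (form submission, POST, etc.) is refused. The details
# below are assumptions for illustration, not the real plugin code.
import requests

ALLOWED_METHOD = "GET"

def retrieve(url: str, params: dict | None = None) -> str:
    """Fetch a page or API result, read-only."""
    response = requests.request(ALLOWED_METHOD, url, params=params, timeout=10)
    response.raise_for_status()
    return response.text

def submit_form(*args, **kwargs):
    """Transactional operations are deliberately unsupported."""
    raise NotImplementedError("Browsing is scoped to retrieval only")
```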

Right about now, companies should be asking what the equivalent of search engine optimization (SEO) might be in an ecosystem driven by chat models. Is there a way to ensure ChatGPT recommends your widget and not the competition, if choice is an option? And is this approach compatible with the digital ad industry – is ChatGPT better suited for referring traffic to websites or for invoking API-dependent services and whatever business model that entails?

Those contemplating OpenAI's plugin ecosystem should probably be considering the consequences of a dependency on the biz, which recently gave customers three days to shift away from its Codex API, only to relent after public outcry, and then only for researchers. There's also the possibility of being "Sherlocked" – OpenAI may choose to build its own version of popular services as its platform evolves, making partners redundant.

"We’re working to develop plugins and bring them to a broader audience," OpenAI concludes. "We have a lot to learn, and with the help of everyone, we hope to build something that is both useful and safe."

There's that word again. ®
