Here's how Microsoft hopes to inject ChatGPT into all your apps and bots via Azure

Stormy clouds ahead?

Microsoft is bringing ChatGPT, with all its promises and shortcomings, to world-plus-dog as a cloud service in Azure.

Redmond this week was "thrilled to announce" ChatGPT will be selectively available as a preview within the Azure OpenAI Service. That service is largely aimed at corporations that want to put large language models to work in their applications and workflows, such as using DALL-E 2 for generating images, GPT-3.5 for text, and Codex for something that resembles code.

By making ChatGPT available from Azure, organizations participating in the program can now tap into the software, embed it in their apps and pipelines, and generate walls of text for whatever purpose they can successfully justify, as well as have the chatty, imaginative bot interact with users.

"Developers can integrate custom AI-powered experiences directly into their own applications, including enhancing existing bots to handle unexpected questions, recapping call center conversations to enable faster customer support resolutions, creating new ad copy with personalized offers, automating claims processing, and more," Eric Boyd, corporate vice president for Microsoft's AI Platform, gushed.

"Cognitive services can be combined with Azure OpenAI to create compelling use cases for enterprises."

Then again, he would say that, wouldn't he? Microsoft is billions deep in OpenAI. Part of that investment deal includes a pact in which Microsoft gets rights to commercialize OpenAI's technology. And that includes the upstart's text-generating ChatGPT, a non-intelligent bot that predicts what humans might write from given input prompts. You can ask it to come up with a movie synopsis, for instance, and it'll take a swing at it.

Redmond has aggressively injected machine-learning features into much of its portfolio and cloud services, launching the Azure OpenAI Service in 2021. There has been some success. According to Boyd, more than 1,000 companies are using the system, with Microsoft noting such examples as Moveworks, KPMG, and Al Jazeera.

Now they and other organizations can have a crack at ChatGPT in Azure and everything that comes with it. The service, in preview mode right now, will cost $0.002 per 1,000 tokens. Pricing for OpenAI's various AI models is based on tokens, which the upstart describes as pieces of words, with 1,000 tokens being about 750 words.
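For a back-of-envelope sense of what that means in practice, here's a rough cost estimate in Python, taking the preview price and the roughly-750-words-per-1,000-tokens rule at face value (actual token counts vary with the text, so treat this as a sketch, not a billing calculator):

PRICE_PER_1K_TOKENS = 0.002  # USD, ChatGPT preview pricing on Azure
WORDS_PER_1K_TOKENS = 750    # OpenAI's rough rule of thumb

def estimated_cost(word_count: int) -> float:
    """Rough dollar cost of generating (or feeding in) word_count words."""
    tokens = word_count / WORDS_PER_1K_TOKENS * 1_000
    return tokens / 1_000 * PRICE_PER_1K_TOKENS

# A 7,500-word wall of text is roughly 10,000 tokens -- about two cents.
print(f"${estimated_cost(7_500):.4f}")  # $0.0200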

Billing for ChatGPT use begins March 13. Developers can apply for access to the software in Azure.

The bot, which is trained on mountains of text scraped from internet pages and other sources, quickly captured the world's imagination after it became available to the public via OpenAI's website in November. According to a study that was partly based on statistics from Similarweb, the bot became the fastest app to reach 100 million users, hitting that mark by the start of February.

There are problems.

In a column early this month in The Register, Alexander Hanff, a computer scientist and privacy technologist, recounted how the chatbot had told him he was dead, and doubled down with faked obits. We've also pointed out other issues.

None of that likely will slow down Microsoft from continuing to push the code into its products, which already include Bing, Edge, and Skype. The Windows 11 giant will talk even more about plans for ChatGPT and other AI technologies at an event on March 16 titled "The Future of Work with AI," which will be hosted by CEO Satya Nadella.

Microsoft's Boyd gave a nod to the challenges that come with AI tools like ChatGPT.

"We recognize that any innovation in AI must be done responsibly," he said. "This becomes even more important with powerful, new technologies like generative models. We have taken an iterative approach to large models, working closely with our partner OpenAI and our customers to carefully assess use cases, learn, and address potential risks."

When developers apply to use ChatGPT from Azure, they need to outline how they intend to use the technology before being given access, Microsoft said. It also plans to filter out abusive and offensive content.

"In the event of a confirmed policy violation, we may ask the developer to take immediate action to prevent further abuse," he added. ®
