If you're deemed cool enough, Microsoft will offer you access to Azure-based GPT-3

Text'n'code-emitting system still available from OpenAI

Ignite Microsoft will provide access to OpenAI’s text'n'code-generating GPT-3 model via an API service in the Azure cloud.

The machine-learning system will be part of Azure Cognitive Services, though it won't be generally available just yet; Microsoft is only working with select customers that have been invited to use the API. Up until now, commercial access to the model has been managed solely by OpenAI.

“We are just in the beginning stages of figuring out what the power and potential of GPT-3 is, which is what makes it so interesting,” Eric Boyd, Microsoft corporate vice president for Azure AI, said on Tuesday as Microsoft kicked off its annual Ignite conference.

“Now we are taking what OpenAI has released and making it available with all the enterprise promises that businesses need to move into production.”

Microsoft and OpenAI have a cozy relationship. Last year, Microsoft obtained exclusive rights to OpenAI's GPT-3 model. OpenAI agreed to give up the chance of hawking its large language model via rival cloud providers, such as Google Cloud and Amazon Web Services, in return for access to Microsoft's extensive cloud resources.

Microsoft previously said OpenAI could use its in-house AI supercomputer that contains “more than 285,000 CPU cores and 10,000 GPUs” for its AI experimentation. Building non-trivial machine-learning systems requires huge amounts of computing power. Developers need to repeatedly train them on large datasets to improve their performance. It costs millions of dollars to train a model as large as GPT-3.

Companies have used the text-generating software for all sorts of language tasks, from automated content moderation to building online text-based games.

“GPT-3 has really proven itself as the first powerful, general purpose model for natural language — it’s one model you can use for all these things, which developers love because you can try things very easily,” OpenAI’s CEO Sam Altman said. “For a while now, we’ve wanted to figure out a way to scale it as broadly as possible, which is part of the thing that really excites us about the partnership with Microsoft.”

Given a writing prompt, GPT-3 generates hopefully relevant text in response; it functions a bit like autocomplete. It can be used for things like code generation, translation, text summarisation, creative writing, or question answering. But it's difficult to control the model's outputs, and it has been known to generate racist, sexist, or otherwise inappropriate responses. Microsoft said it will offer customers content filters to screen GPT-3's text, and is only supporting companies that are applying the software in "well-defined use cases" that are low risk.
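Under the hood, this is a prompt-in, text-out interface. As a rough sketch of what a completions-style request looks like (the endpoint URL and parameter values below are illustrative placeholders, not Microsoft's actual service details; the parameter names follow OpenAI's public completions API):

```python
import json

# Hypothetical endpoint for illustration only; real URLs, deployment
# names, and auth keys come from your Azure or OpenAI account.
API_URL = "https://example-resource.openai.azure.com/completions"

# A completions-style request: the model continues the prompt,
# autocomplete-fashion, up to max_tokens of generated text.
payload = {
    "prompt": "Summarise the following support ticket:\n...",
    "max_tokens": 64,    # cap on the length of the generated continuation
    "temperature": 0.2,  # lower temperature -> more predictable output
}

# The payload would be POSTed as JSON to API_URL with an auth header.
print(json.dumps(payload, indent=2))
```

The same request shape serves every task listed above; only the prompt changes, which is what Altman means by "one model you can use for all these things".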

“The content filter is Microsoft’s own, which takes into account learnings from both OpenAI’s experience and Microsoft’s Office, Bing and Xbox teams. Additionally, OpenAI and Microsoft are working together on more Responsible AI tooling for customers,” a spokesperson told The Register.

“Azure OpenAI Service has its own terms of use, separate from the OpenAI API.”

The API will still be sold via OpenAI, too, a Microsoft spokesperson confirmed to The Register: “Azure OpenAI Service does not replace OpenAI’s API. OpenAI’s GPT-3 commercial service provides access to the latest model instances and technologies to help teams ideate, innovate and develop applications that will later be put in production. Azure OpenAI Service is a new Azure Cognitive Service that will give select customers access to OpenAI’s GPT-3 technology with the enterprise-ready capabilities of Microsoft Azure.”

“Invited customers will be able to access Azure OpenAI Service through their existing Azure account. We will have specifics about pricing available in the future,” they added. ®
