Microsoft Copilot for Security prepares for April liftoff
Automated AI helper intended to make security more manageable
Microsoft Copilot for Security, a subscription AI security service, will be generally available on April 1, 2024, the company announced on Wednesday.
Its arrival on April Fool's Day is purely coincidental.
As a measure of the company's commitment to software-as-revenue-generating-service, Microsoft on Tuesday invited a handful of journalists, this reporter among them, to attend a media briefing and put questions about the automation offering to enthusiastic employees and customers.
Lyft vouchers were provided to cover transit costs. A boxed lunch and tame beverages were served up.
Copilot for Security, early access to which was offered in October, provides generative AI in two modes. It's available as a standalone portal that can be integrated with third-party products. And it's also available as an embedded service within Microsoft products like Sentinel, Defender XDR, Purview, Priva, and Entra.
Based on GPT-4 and a Microsoft security-specific model, Copilot for Security takes input (prompts) from people or scripts, passes the text through an orchestrator layer, a context layer, and possibly application plugins, then returns a response from the underlying AI model. That response might summarize a document, flag a suspicious interaction, or offer recommendations to shore up security practices.
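For those who think in code, the flow Microsoft describes can be sketched roughly as follows. This is an illustrative sketch only: the class and plugin names are our own shorthand, not Microsoft's API, and the orchestration logic is heavily simplified.

```python
# Rough sketch of the described request flow: prompt in, orchestration and
# context layers, optional product plugins, response out.
# All names here are invented for illustration, not Microsoft's actual API.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Plugin:
    """Hypothetical application plugin that enriches a prompt, e.g. with Sentinel or Defender data."""
    name: str
    enrich: Callable[[str], str]


@dataclass
class CopilotPipeline:
    model: Callable[[str], str]                    # underlying GPT-4 / security model call
    plugins: List[Plugin] = field(default_factory=list)

    def handle(self, prompt: str, context: str = "") -> str:
        # Orchestrator/context layers: assemble whatever grounding the prompt needs (simplified).
        grounded = f"{context}\n{prompt}" if context else prompt
        # Application plugins: bolt on relevant product data.
        for plugin in self.plugins:
            grounded = plugin.enrich(grounded)
        # Hand the assembled prompt to the model; the answer might be a summary,
        # a flagged incident, or hardening recommendations.
        return self.model(grounded)


if __name__ == "__main__":
    echo_model = lambda text: f"[model response to] {text}"
    pipeline = CopilotPipeline(
        model=echo_model,
        plugins=[Plugin("sentinel", lambda t: t + "\n[Sentinel alerts attached]")],
    )
    print(pipeline.handle("Summarize this incident report"))
```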
Whatever the case, Copilot for Security does so through a "pay-as-you-go" licensing model tied to Microsoft Azure. Redmond has created a new billing unit called a Security Compute Unit, which is "anticipated" to be billed monthly at a rate of $4/hour.
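For anyone doing back-of-the-envelope math on that rate, the snippet below assumes a single Security Compute Unit provisioned around the clock at the quoted $4/hour; actual pay-as-you-go bills will vary with usage.

```python
# Back-of-the-envelope cost for one Security Compute Unit at the quoted rate.
# Assumes continuous provisioning; consumption-based bills depend on actual usage.
RATE_PER_SCU_HOUR = 4.00   # USD, Microsoft's quoted rate
HOURS_PER_MONTH = 730      # ~365 days * 24 hours / 12 months

monthly_cost = RATE_PER_SCU_HOUR * HOURS_PER_MONTH
print(f"1 SCU, 24x7: ${monthly_cost:,.2f}/month")   # -> $2,920.00/month
```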
"The speed, the scale, the sophistication of attacks has increased pretty dramatically over the last year," Vasu Jakkal, corporate vice president of security, compliance, identity, and management at Microsoft told reporters.
"Identity continues to be the battleground for security. We see 4,000 password attacks per second from 567 password attacks per second just two years back. Year over year, we've seen a 10x increase from 3 billion to 30 billion attacks in the same time frame for identity."
"And the time it takes attackers to get access to data is also shrinking. On average, it takes 72 minutes or less for an attacker to get access to the user's data and inbox once a user clicks on a phishing link."
And amid all this, Jakkal said, there's a shortage of security talent.
Copilot for Security, Jakkal said, is "designed to help customers and users defend at machine speed, to catch what others may miss, to reduce this talent shortage that we are facing, and to make everyone have a great outcome."
Copilot for Security was initially conceived for security operations and threat protection tasks, Jakkal explained, like threat investigation, reverse engineering malware, incident reporting, and guided incident response plans. And as of last October, the service was expanded to handle tasks related to identity, data security, and IT skills.
The primary value proposition of Copilot for Security is said to be productivity. According to Microsoft's own research [PDF] into Microsoft Defender XDR, those using the security service with help from Copilot for Security finished tasks (analyzing scripts and incident reports, and summarizing incidents) 22 percent faster on average than those without AI help.
This productivity gain was not seen for all activities, however. For response tasks, Copilot actually slowed things down by about 26 percent: "We note also that Copilot currently often takes 20+ seconds to open," the research paper says. "This necessarily slowed the Copilot users. Product improvements should reduce this duration and further increase the time savings for users with Copilot."
But overall, the company's data supports its product, citing improvements in accuracy and quality, and greater employee enthusiasm as a consequence of Copilot adoption.
Rui Correia, security operations center manager for Signode in Switzerland, told The Register that his firm has been using Copilot for Security since November for tasks like malware analysis, incident response, and alert investigations.
"Whenever something suspicious happens in the company and it generates an alert, we are utilizing Copilot to speed up the investigation," he said.
Correia said he had compared the investigation process both with and without Copilot. "I found that with each step, it was roughly between 20 and 50 percent faster for Copilot to do it, given that you do need to go into multiple portals and log in there and wait for everything to load," he said.
The latest iteration of Copilot for Security includes:

- Support for custom promptbooks, which allow customers to craft and save their own prompts for common tasks
- Company-specific knowledge base integrations
- Support for prompts and responses in eight languages, with 25 languages available via the standalone interface
- Third-party integration with partner services
- Usage reporting that shows how teams are using Copilot
"I do believe this is going to be the most consequential technology of my lifetime," said Jakkal. ®