Ignite Got governance? Microsoft reckons there is room for improvement – it should know – and has used its Ignite Florida knees-up to batter compliance with its overused AI stick.
Are you... compliant?
Protecting data is a challenge. Microsoft 365 customers can already slap classifications and labels on documents to control which users can see what, but the process is a tad manual at present and potentially hugely time-consuming. The solution for AI-happy Microsoft is, of course, machine learning, with a classification engine trained to label data automatically and apply governance policies.
Microsoft 365 senior director Alym Rayani told The Register a customer would need to feed the beast "at least 50" examples to get things started. He went on to say that training would have to go on until the model is stable, at which point it could be pointed at vast libraries of SharePoint data and left to do its thing.
It all sounds a bit worrying for the average compliance officer. And it reminded us a little of auto-tagging of photos. Rayani saw it as a way of dealing with the vast pools of data lurking in enterprises (he reckoned that the majority of data in enterprises tended to be left ungoverned by compliance policies due to the sheer effort involved in dealing with it).
Once set up, the algorithm would always be watching and labelling as new documents arrive in the cloud.
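To make the training step concrete, here is a minimal sketch of how a seed-trained document classifier of this sort works. This is emphatically not Microsoft's engine – the label names, sample texts and naive Bayes approach below are our own illustrative assumptions – but it shows the mechanics of feeding in labelled examples and then auto-labelling new arrivals:

```python
from collections import Counter, defaultdict
import math

def tokenize(text):
    return text.lower().split()

class SeedClassifier:
    """Train on labelled seed documents, then predict a governance label
    for new ones. A toy naive Bayes model, for illustration only."""

    def __init__(self, seed_docs):
        self.word_counts = defaultdict(Counter)   # label -> word frequencies
        self.doc_counts = Counter()               # label -> number of docs
        self.vocab = set()
        for text, label in seed_docs:
            words = tokenize(text)
            self.word_counts[label].update(words)
            self.doc_counts[label] += 1
            self.vocab.update(words)

    def predict(self, text):
        total_docs = sum(self.doc_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.doc_counts:
            # log prior plus log likelihood with add-one smoothing
            score = math.log(self.doc_counts[label] / total_docs)
            total_words = sum(self.word_counts[label].values())
            for word in tokenize(text):
                count = self.word_counts[label][word] + 1
                score += math.log(count / (total_words + len(self.vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Microsoft says "at least 50" examples per category; four stand in here.
seeds = [
    ("quarterly revenue figures for board review", "confidential"),
    ("payroll summary do not distribute", "confidential"),
    ("cafeteria menu for next week", "general"),
    ("office closed monday for the holiday", "general"),
]
clf = SeedClassifier(seeds)
print(clf.predict("payroll figures attached for the board"))  # confidential
```

The "train until stable" phase Rayani describes amounts to adding examples and re-testing until predictions stop flip-flopping – only then would you dare point something like this at a SharePoint library.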
For many, that will be the rub. You really have to be running in the cloud for this to work properly. Rayani pointed to the Azure Information Protection scanner, which is an option for on-premises customers but admitted that it was "not yet fully integrated" since "today, the AI stuff is in the cloud".
A further limitation lies in the data that can actually be checked. While Microsoft 365 documents can be scanned, and there is also a nod to third parties in the form of PDFs, something like the schema of a SQL database remains a no-no for the time being.
Rayani told us that the next batch of supported data sets would be dictated by feedback gathered between the public preview in December and general availability next year.
And finally, Rayani told us: "Right now… we don't learn from user behaviour or classification" – meaning that there is no getting away from the model training process for the time being.
It was the user, at the leaving do, with the USB stick
The gang has also added Insider Risk Management in private preview, which is aimed at spotting potentially iffy activities, such as a resigning user taking an interest in confidential files and popping them on a USB stick. Of course, kind old Microsoft will also keep the data anonymised in order to keep the lawyers at bay until they're needed.
Also in preview, although a public one this time, is Communication Compliance, aimed at ensuring that a company's code of conduct is maintained over the likes of Teams or Exchange Online.
The whole lot goes towards an overall Compliance Score, replete with tips on how to improve things. A user's Microsoft 365 environment is continually scanned, and the Compliance Score is calculated based on which policies have been configured around protection, governance and so on.
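Microsoft hasn't published the exact formula, but a score of this kind is typically a points-based tally of configured controls. The sketch below is purely hypothetical – the control names and point values are invented for illustration, not lifted from the product:

```python
# Hypothetical controls: name -> (points available, currently configured?)
# These names and weights are invented; Microsoft's actual catalogue differs.
controls = {
    "data-loss prevention policy": (27, True),
    "retention labels applied": (15, True),
    "insider risk policy": (20, False),
    "communication compliance policy": (10, False),
}

def compliance_score(controls):
    """Sum points for configured controls and return (earned, available)."""
    earned = sum(pts for pts, configured in controls.values() if configured)
    available = sum(pts for pts, _ in controls.values())
    return earned, available

earned, available = compliance_score(controls)
print(f"{earned}/{available}")  # 42/72 with the sample controls above
```

Switching a control's flag to True as you configure the corresponding policy is the "tips on how to improve things" loop in miniature.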
Microsoft, which has had more experience than it might like with the regulatory bodies of the world, has helpfully included an assessment view to give weary admins a hint of just how bad something like a General Data Protection Regulation audit is likely to be ahead of the actual auditor stopping by for a visit. ®