Microsoft, Nvidia extend Azure confidential computing to GPUs

Remember when orgs were worried about processing sensitive info in public cloud?

Microsoft has linked up with Nvidia to extend confidential computing in its Azure cloud to the graphics giant's GPUs. This will allow GPUs on Azure to process workloads that call for the highest level of protection, such as applications in healthcare or financial services.

With confidential computing support for Nvidia A100 GPUs paired with hardware-protected virtual machines, organizations will be able to use even sensitive datasets to train and deploy more accurate AI models without compromising security or performance, Microsoft claimed.

The move was announced by Azure CTO Mark Russinovich, who said it was intended to help both individuals and organizations derive new insights from data without having to worry about security or privacy threats.

Azure confidential computing was itself developed to allay organizations' fears about processing sensitive data in the public cloud, and was first demonstrated by Russinovich at Microsoft's Ignite conference in 2017.

Confidential computing initially relied upon a Trusted Execution Environment (TEE), which is basically a hardware-enforced secure enclave in memory. Code is placed inside the enclave, and encrypted data is passed in and encrypted results passed out. No other code is able to access the contents of the enclave – in theory.
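That in-and-out flow can be sketched in a few lines, with a toy XOR cipher standing in for real memory encryption and a Python class standing in for the hardware enclave. Everything here is invented for illustration; it is not how any actual TEE is programmed:

```python
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    # Symmetric toy XOR cipher (illustration only, not real enclave crypto).
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

class ToyEnclave:
    """Stands in for a hardware TEE: plaintext exists only inside it."""
    def __init__(self, session_key: bytes):
        self._key = session_key  # provisioned during attestation in real TEEs

    def process(self, ciphertext: bytes) -> bytes:
        plaintext = toy_cipher(self._key, ciphertext)  # decrypt on entry
        result = plaintext.upper()                     # the protected computation
        return toy_cipher(self._key, result)           # re-encrypt before leaving

key = b"shared session key"
enclave = ToyEnclave(key)
sealed_input = toy_cipher(key, b"sensitive record")
sealed_output = enclave.process(sealed_input)
print(toy_cipher(key, sealed_output))  # b'SENSITIVE RECORD'
```

Outside code only ever handles `sealed_input` and `sealed_output`; the plaintext and the computation on it live inside `process`.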

However, when AMD introduced its Epyc processors with support for encrypted memory, specifically Secure Encrypted Virtualization (SEV), which gives each virtual machine on a system its own encryption key, Microsoft used the technology to support confidential VMs. These let customers deploy workloads inside virtual machines that are protected not only from other cloud users, but also from Azure itself.
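The effect of those per-VM keys can be sketched as follows. This is a toy model, not the actual SEV mechanism: the point is simply that anything reading a VM's memory from outside, hypervisor included, sees only ciphertext:

```python
import hashlib
import secrets

def toy_mem_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy XOR cipher standing in for SEV's hardware memory encryption.
    stream = (hashlib.sha256(key).digest() * (len(data) // 32 + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))

class ToyVM:
    """Each VM gets its own memory-encryption key, as under AMD SEV."""
    def __init__(self):
        self._key = secrets.token_bytes(32)  # per-VM key held by the hardware
        self.raw_memory = b""                # what the hypervisor can observe

    def write(self, plaintext: bytes) -> None:
        self.raw_memory = toy_mem_encrypt(self._key, plaintext)

    def read(self) -> bytes:
        return toy_mem_encrypt(self._key, self.raw_memory)

vm = ToyVM()
vm.write(b"tenant secret")
print(vm.raw_memory == b"tenant secret")  # False: outsiders see ciphertext
print(vm.read())                          # b'tenant secret'
```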

To enable confidential computing on GPUs, data is encrypted for transfer between the CPU and GPU across the PCIe bus, using encryption keys securely exchanged between Nvidia's device driver and the GPU.

Once transferred to the GPU, the data is only decrypted within a hardware-protected, isolated environment inside the GPU package, where it can be processed to generate models or inference results.

Microsoft is currently inviting interested parties to sign up to test out confidential GPU support in a private preview of Azure confidential GPU VMs. Through this, users get access to a secure environment with a virtual Trusted Platform Module (vTPM) and up to four Nvidia A100 GPUs.

Users will be able to run machine learning workloads using their chosen machine learning frameworks, Microsoft said. They will also be able to remotely verify that their VM boots up with trusted code and the Nvidia device driver for confidential GPUs, and that data remains encrypted as it is transferred to and from the GPUs.
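Conceptually, that remote verification boils down to comparing measurements of what actually booted against an allow-list the tenant trusts. The component names and images below are invented for illustration; real attestation checks cryptographically signed vTPM quotes rather than a plain dictionary:

```python
import hashlib

def measure(component_image: bytes) -> str:
    # A measurement is just a hash of the component as loaded.
    return hashlib.sha256(component_image).hexdigest()

# Hypothetical allow-list of measurements the tenant trusts.
trusted_measurements = {
    "bootloader": measure(b"trusted bootloader image"),
    "kernel": measure(b"trusted kernel image"),
    "nvidia_driver": measure(b"confidential GPU driver image"),
}

def verify_attestation(report: dict) -> bool:
    """Accept the VM only if every measured component matches the allow-list."""
    return all(report.get(name) == digest
               for name, digest in trusted_measurements.items())

# A report from a genuine boot passes; a tampered driver does not.
good_report = dict(trusted_measurements)
bad_report = {**trusted_measurements,
              "nvidia_driver": measure(b"patched driver image")}
print(verify_attestation(good_report))  # True
print(verify_attestation(bad_report))   # False
```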

Microsoft said it is already working with several customers, such as Bosch and Royal Bank of Canada, on implementing applications using confidential computing with GPU support. ®
