Confidential computing is an emerging paradigm in today’s data-centric era that helps maintain privacy and confidentiality regardless of where data is processed. Intel is working to make it possible for customers to maintain control of their data, even when it’s in the cloud. That is why I’m very excited that Microsoft has announced the general availability of the Microsoft Azure DCsv2-Series, featuring a hardware-based Trusted Execution Environment (TEE) built on Intel® Software Guard Extensions (Intel® SGX). Following private and public preview phases, Microsoft is now scaling its confidential computing offering and making it broadly available to enterprise customers looking to leverage the advantages of cloud computing while adding a layer of protection for their sensitive workloads.
A Trusted Foundation
Security is only as strong as the layer below it. That is why it’s important to have trusted security technologies rooted in hardware. Intel continues to innovate with advanced security technologies built directly into Intel® Xeon® processors, promoting trust and protection at the most foundational layers. Intel SGX is the most researched, tested, and deployed application isolation technology in the market today. It allows application developers to partition their applications into private regions of memory called enclaves, designed to be protected even from more privileged software, including the OS and hypervisor. It has long been common practice to encrypt data at rest in storage and in transit over the network; now code and data can also be protected in use while being processed in memory. This helps isolate the data from other applications or tenants, the service provider, rogue administrators, and even from malicious code running with root privileges.
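To give a flavor of the programming model: applications built with the Intel SGX SDK declare the boundary between trusted (in-enclave) and untrusted code in an EDL (Enclave Definition Language) file. The fragment below is a hypothetical sketch to illustrate the idea; the function names are invented and not taken from any Azure or Intel sample.

```
/* enclave.edl -- hypothetical trusted/untrusted interface (Intel SGX SDK style).
   ECALLs run inside the enclave; OCALLs call back out to untrusted code. */
enclave {
    trusted {
        /* Sensitive processing happens inside the enclave; the buffer is
           copied into protected memory before use. */
        public int ecall_score_transaction([in, size=len] const uint8_t* record,
                                           size_t len);
    };
    untrusted {
        /* Logging stays outside the enclave and should never receive
           plaintext secrets. */
        void ocall_log([in, string] const char* msg);
    };
};
```

Everything declared `trusted` executes inside the enclave’s protected memory; the OS and hypervisor can schedule the enclave but cannot read or tamper with its contents.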
Enabling New Usages
This isn’t just about making existing workloads more secure. This is about making whole new usages possible. In addition to enabling sensitive workloads and data to migrate to the cloud, new multi-party shared compute scenarios that have been difficult to build in the past due to privacy, security, and regulatory requirements, are now possible. An example is Federated Learning, which enables parties to conduct machine learning across broader data sources, while helping to keep algorithms and data sets confidential. This extends to private blockchain, in-memory databases, and more.
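To make the Federated Learning idea concrete, here is a minimal toy sketch of the core federated averaging step, in which each party trains a model locally and shares only its parameters, never its raw records. This is purely illustrative (the parameter values and bank names are invented), not Azure or SGX code.

```python
def federated_average(client_weights):
    """Average model parameters contributed by several clients.

    Each inner list is one client's locally trained parameter vector;
    the raw training data never leaves the client.
    """
    n = len(client_weights)
    return [sum(vals) / n for vals in zip(*client_weights)]

# Hypothetical parameter vectors from two parties' local training runs.
bank_a = [0.2, 0.4, 0.6]
bank_b = [0.4, 0.6, 0.8]
global_model = federated_average([bank_a, bank_b])
```

In a confidential computing deployment, the aggregation step itself would run inside an SGX enclave, so no individual party’s update is exposed to the other participants or to the host.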
A Continuing Partnership
Scott Woodgate, Microsoft Sr. Director of Azure Security and Management Marketing, joined us at Intel Security Day last month and shared a real-world example of multi-party machine learning: “We’ve seen multiple banks around the world implement multi-party machine learning to find specific patterns of fraud and help the bottom line of these banks. To do this, banks perform machine learning to find patterns on shared datasets in Azure using Intel SGX-enabled protected enclaves without ever exposing an individual bank’s dataset to other parties, including other banks or even an administrator on the virtual machines.” Multi-party machine learning brings new possibilities to industries where data security is paramount but not all participants are trusted.
As leading members of the Confidential Computing Consortium (https://confidentialcomputing.io/), Intel and Microsoft want to empower our customers to process their data in a more secure and private cloud environment and are committed to collaborating with the industry to deliver secure compute infrastructure today and into the future.
Start today with Azure Confidential Computing: http://aka.ms/AzureCC
Written by Jason Grebe, Corporate Vice President & General Manager of the Cloud & Enterprise Solutions Group
Disclaimer: No product or component can be absolutely secure.