I had the opportunity to participate in this year's Open Confidential Computing Conference (OC3), hosted by our software partner, Edgeless Systems. This year's event was particularly noteworthy due to a panel discussion on the impact and future of confidential computing. The panel featured some of the industry's most respected technology leaders including Greg Lavender, Chief Technology Officer at Intel, Ian Buck, Vice President of Hyperscale and HPC at NVIDIA, and Mark Papermaster, Chief Technology Officer at AMD. Felix Schuster, Chief Executive Officer at Edgeless Systems, moderated the panel discussion, which explored topics such as the definition of confidential computing, customer adoption patterns, current challenges, and future developments. The insightful discussion left a lasting impression on me and my colleagues.
What is confidential computing?
When it comes to understanding what exactly confidential computing entails, it all begins with a trusted execution environment (TEE) that is rooted in hardware. This TEE protects any code and data placed inside it, while in use in memory, from threats outside the enclave. These threats include everything from vulnerabilities in the hypervisor and host operating system to other cloud tenants and even cloud operators. In addition to providing protection for the code and data in memory, the TEE also possesses two crucial properties. The first is the ability to measure the code contained within the enclave. The second property is attestation, which allows the enclave to provide a verified signature that confirms the trustworthiness of what is held within it. This feature allows software outside of the enclave to establish trust with the code inside, allowing for the safe exchange of data and keys while protecting the data from the hosting environment. This includes hosting operating systems, hypervisors, management software and services, and even the operators of the environment.
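The two properties above, measurement and attestation, can be sketched in a few lines. This is a hypothetical, simplified illustration: real TEEs use hardware-fused keys and vendor-specific report formats, and the HMAC signature here is only a stand-in for that hardware root of trust.

```python
import hashlib
import hmac

# Simulated hardware-rooted key; in a real TEE this is fused into silicon
# and never leaves the chip.
HARDWARE_KEY = b"simulated-hardware-rooted-key"

def measure(enclave_code: bytes) -> bytes:
    """Measurement: a cryptographic hash of the code loaded into the enclave."""
    return hashlib.sha256(enclave_code).digest()

def attest(measurement: bytes) -> bytes:
    """Attestation: the hardware signs the measurement so software outside
    the enclave can verify what is running inside it."""
    return hmac.new(HARDWARE_KEY, measurement, hashlib.sha256).digest()

def verify(enclave_code: bytes, report: bytes, expected: bytes) -> bool:
    """A relying party checks the report and compares the measurement
    against the code it expects before releasing data or keys."""
    m = measure(enclave_code)
    return m == expected and hmac.compare_digest(
        report, hmac.new(HARDWARE_KEY, m, hashlib.sha256).digest()
    )

code = b"trusted_workload_v1"
report = attest(measure(code))
print(verify(code, report, measure(code)))        # expected code verifies
print(verify(b"tampered", report, measure(code)))  # tampered code does not
```

Only after `verify` succeeds does the relying party exchange secrets with the enclave, which is what keeps the data protected from the hosting environment.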
Regarding what confidential computing is not: it is not other privacy-enhancing technologies (PETs) like homomorphic encryption or secure multiparty computation. It is hardware-rooted trusted execution environments with attestation.
In Azure, confidential computing is integrated into our overall defense in depth strategy, which includes trusted launch, customer managed keys, Managed HSM, Microsoft Azure Attestation, and confidential virtual machine guest attestation integration with Microsoft Defender for Cloud.
Customer adoption patterns
Regarding customer adoption scenarios for confidential computing, we see customers across regulated industries such as the public sector, healthcare, and financial services, with workloads ranging from private-to-public cloud migrations to cloud-native applications. One scenario that I'm really excited about is multi-party computation and analytics, where multiple parties bring their data together, in what is now being called a data clean room, to perform computation on that data and get back insights that are much richer than what they would have gotten from their own data set alone. Confidential computing addresses the regulatory and privacy concerns around sharing this sensitive data with third parties. One of my favorite examples of this is in the advertising industry, where the Royal Bank of Canada (RBC) has set up a clean room solution that combines merchant purchasing data with RBC's information on consumers' credit card transactions to get a full picture of consumer behavior. Using these insights, RBC's credit card merchants can then offer their consumers very precise offers tailored to them, all without RBC seeing or revealing any confidential information from the consumers or the merchants. I believe that this architecture is the future of advertising.
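The clean-room pattern can be sketched as follows. This is a hypothetical illustration of the general idea, not RBC's actual system: two parties contribute sensitive records, the join and aggregation run jointly (inside a TEE in a real deployment), and only aggregate insights leave; all names and data below are invented.

```python
from collections import defaultdict

# Party A: merchant purchase records (illustrative data)
merchant_purchases = [
    {"customer_id": "c1", "category": "travel"},
    {"customer_id": "c2", "category": "dining"},
    {"customer_id": "c1", "category": "dining"},
]

# Party B: bank card transaction records (illustrative data)
bank_transactions = [
    {"customer_id": "c1", "spend": 1200.0},
    {"customer_id": "c2", "spend": 300.0},
]

def clean_room_insights(purchases, transactions):
    """Join the two data sets on customer_id and return only aggregated
    spend per purchase category; the raw rows never leave the (simulated)
    clean room."""
    spend_by_customer = {t["customer_id"]: t["spend"] for t in transactions}
    totals = defaultdict(float)
    for p in purchases:
        totals[p["category"]] += spend_by_customer.get(p["customer_id"], 0.0)
    return dict(totals)

print(clean_room_insights(merchant_purchases, bank_transactions))
```

In a real deployment each party would attest the enclave before uploading its data, so neither party, nor the cloud operator, ever sees the other's raw records.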
Another exciting multi-party use case is BeeKeeperAI’s application of confidential computing and machine learning to accelerate the development of effective drug therapies. Until recently, drug researchers have been hampered by inaccessibility of patient data due to strict regulations applied to the sharing of personal health information (PHI). Confidential computing removes this bottleneck by ensuring that PHI is protected not just at rest and when transmitted, but also while in use, thus eliminating the need for data providers to anonymize this data before sharing it with researchers. And it is not just the data that confidential computing is protecting, but also the AI models themselves. These models can be expensive to train and therefore are valuable pieces of intellectual property that need to be protected.
To allow these valuable AI models to remain confidential while still scaling, Azure is collaborating with NVIDIA to deploy confidential graphics processing units (GPUs) on Azure based on the NVIDIA H100 Tensor Core GPU.
The challenges facing confidential computing tended to fall into four broad categories:
Availability, both regionally and across services. Newer technologies are in limited supply or still in development, yet Azure has remained a leader in bringing to market services based on Intel® Software Guard Extensions (Intel® SGX) and AMD Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP). We are the first major cloud provider to offer confidential virtual machines based on Intel® Trust Domain Extensions (Intel® TDX), and we look forward to being one of the first cloud providers to offer confidential NVIDIA H100 Tensor Core GPUs. We see availability rapidly improving over the next 12 to 24 months.
Ease of adoption for developers and end users. The first generation of confidential computing services, based on Intel SGX technology, required rewriting code and working with various open source tools to make applications confidential computing enabled. Microsoft and our partners have collaborated on these open source tools, and we have an active community of partners running their Intel SGX solutions on Azure. The newer generation of confidential virtual machines on Azure, using AMD SEV-SNP, a hardware security feature enabled by AMD Infinity Guard, and Intel TDX, lets users run off-the-shelf operating systems, lift and shift their sensitive workloads, and run them confidentially. We are also using this technology to offer confidential containers in Azure, which allow users to run their existing container images confidentially.
Performance and interoperability. We need to ensure that confidential computing does not mean slower computing. The issue becomes more important with accelerators like GPUs, where the data must be protected as it moves between the central processing unit (CPU) and the accelerator. Advances in this area will come from continued collaboration with standards committees such as the PCI-SIG, which has issued the TEE Device Interface Security Protocol (TDISP) for secure PCIe bus communication, and the CXL Consortium, which has issued the Compute Express Link™ (CXL™) specification for the secure sharing of memory among processors. They will also come from open source projects like Caliptra, which has created the specification, silicon logic, read-only memory (ROM), and firmware for implementing a Root of Trust for Measurement (RTM) block inside a system on chip (SoC).
Industry awareness. While confidential computing adoption is growing, awareness among IT and security professionals is still low. There is a tremendous opportunity for all confidential computing vendors to collaborate and participate in events aimed at raising awareness of this technology to key decision-makers such as CISOs, CIOs, and policymakers. This is especially relevant in industries such as government and other regulated sectors where the handling of highly sensitive data is critical. By promoting the benefits of confidential computing and increasing adoption rates, we can establish it as a necessary requirement for handling sensitive data. Through these efforts, we can work together to foster greater trust in the cloud and build a more secure and reliable digital ecosystem for all.
The future of confidential computing
When the discussion turned to the future of confidential computing, I had the opportunity to reinforce Azure's vision for the confidential cloud, where all services will run in trusted execution environments. As this vision becomes a reality, confidential computing will no longer be a specialty feature but rather the standard for all computing tasks. In this way, the concept of confidential computing will simply become synonymous with computing itself.
Finally, all panelists agreed that the biggest advances in confidential computing will be the result of industry collaboration.
Microsoft at OC3
In addition to the panel discussion, Microsoft participated in several other presentations at OC3 that you may find of interest:
- Removing our Hyper-V host OS and hypervisor from the Trusted Computing Base (TCB).
- Container code and configuration integrity with confidential containers on Azure.
- Customer managed and controlled Trusted Computing Base (TCB) with CVMs on Azure.
- Enabling faster AI model training in healthcare with Azure confidential computing.
- Project Amber—Intel's attestation service.
Finally, I would like to encourage our readers to learn about Greg Lavender’s thoughts on OC3 2023.
All product names, logos, and brands mentioned above are properties of their respective owners.
Source: Azure Blog Feed