What Is Confidential Computing?

Big tech companies are adopting a new security model called confidential computing to protect data while it’s in use

By Fahmida Y. Rashid

A handful of major technology companies are going all in on a new security model they’re calling confidential computing in an effort to better protect data in all its forms.

Data security rests on three pillars: protecting data at rest, in transit, and in use. Protecting data at rest means using methods such as encryption or tokenization so that even if data is copied from a server or database, a thief can’t access the information. Protecting data in transit means making sure unauthorized parties can’t see information as it moves between servers and applications. There are well-established ways to provide both kinds of protection.
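As a concrete illustration of one of those at-rest techniques, here is a minimal tokenization sketch: sensitive values are swapped for random tokens, and the mapping back to the real values lives in a separate vault. The `TokenVault` class and the sample card number are invented for this example; production tokenization services add key management, access control, and auditing on top of this idea.

```python
import secrets

# Toy tokenization vault (illustrative only): a stolen copy of the
# tokenized records reveals nothing without the vault's mapping.
class TokenVault:
    def __init__(self):
        self._vault = {}  # token -> original value, kept server-side

    def tokenize(self, value: str) -> str:
        token = secrets.token_hex(8)  # opaque random token
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]

vault = TokenVault()
# The stored record holds only the opaque token, not the card number.
record = {"name": "Alice", "card": vault.tokenize("4111-1111-1111-1111")}
assert record["card"] != "4111-1111-1111-1111"
assert vault.detokenize(record["card"]) == "4111-1111-1111-1111"
```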

Protecting data while in use, though, is especially tough because applications need to have data in the clear—not encrypted or otherwise protected—in order to compute. But that means malware can dump the contents of memory to steal information. It doesn’t really matter if the data was encrypted on a server’s hard drive if it’s stolen while exposed in memory.
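The gap the article describes can be shown in a few lines. In this toy sketch, a one-time-pad XOR stands in for real encryption; the point is only that to compute on the data, the program must first place the plaintext in ordinary memory, where anything that can read the process's memory sees it in the clear.

```python
import secrets

plaintext = b"account balance: 1000"
key = secrets.token_bytes(len(plaintext))

# "At rest": only ciphertext is stored; no plaintext bytes are present.
ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))

# "In use": to compute on the data (here, parse the balance), the
# application must first recover the plaintext into memory...
decrypted = bytes(c ^ k for c, k in zip(ciphertext, key))
balance = int(decrypted.split(b": ")[1])  # -> 1000

# ...and at this moment malware dumping memory, or a debugger,
# sees the data unprotected, regardless of the encryption at rest.
assert decrypted == plaintext
```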

Proponents of confidential computing hope to change that. “We’re trying to evangelize there are actually practical solutions” to protect data while it’s in use, said Dave Thaler, a software architect from Microsoft and chair of the Confidential Computing Consortium’s Technical Advisory Council.

The consortium, launched in August 2019 under the Linux Foundation, aims to define standards for confidential computing and support the development and adoption of open-source tools. Members include technology heavyweights such as Alibaba, AMD, Arm, Facebook, Fortanix, Google, Huawei, IBM (through its subsidiary Red Hat), Intel, Microsoft, Oracle, Swisscom, Tencent, and VMware. Several already have confidential computing products and services for sale.

Confidential computing uses hardware-based techniques to isolate data, specific functions, or an entire application from the operating system, hypervisor or virtual machine manager, and other privileged processes. Data is stored in the trusted execution environment (TEE), where it’s impossible to view the data or operations performed on it from outside, even with a debugger. The TEE ensures that only authorized code can access the data. If the code is altered or tampered with, the TEE denies the operation.
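The access rule described above — only code matching an authorized measurement may touch the data, and tampered code is denied — can be sketched in software. This is a toy analogy, not a real enclave: actual TEEs such as Intel SGX or Arm TrustZone enforce the measurement check in hardware, and the `SimulatedEnclave` class here is invented for illustration.

```python
import hashlib

# Software analogy of a TEE's code-integrity check (illustrative only):
# the "enclave" releases its secret only to code whose measurement
# (here, a hash of the function's bytecode) matches what was authorized.
class SimulatedEnclave:
    def __init__(self, secret, authorized_fn):
        self._secret = secret
        self._measurement = hashlib.sha256(
            authorized_fn.__code__.co_code
        ).hexdigest()

    def run(self, fn):
        measurement = hashlib.sha256(fn.__code__.co_code).hexdigest()
        if measurement != self._measurement:
            # Altered or unauthorized code: deny the operation.
            raise PermissionError("code measurement mismatch: denied")
        return fn(self._secret)

def count_chars(data):
    return len(data)          # the authorized computation

def exfiltrate(data):
    return data               # "tampered" code trying to read the secret

enclave = SimulatedEnclave("top-secret payload", count_chars)
assert enclave.run(count_chars) == len("top-secret payload")
try:
    enclave.run(exfiltrate)   # denied: different measurement
except PermissionError:
    pass
```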

Many organizations have declined to migrate some of their most sensitive applications to the cloud because of concerns about potential data exposure. Confidential computing makes it possible for different organizations to combine data sets for analysis without accessing each other’s data, said Seth Knox, vice president of marketing at Fortanix and the outreach chair for the Confidential Computing Consortium. For example, a retailer and credit card company could cross-check customer and transaction data for potential fraud without giving the other party access to the original data.
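The retailer/credit-card example can be sketched with a toy stand-in for the private matching an enclave would perform: both parties blind their customer identifiers with a keyed hash (a key the enclave holds, not the parties), so overlaps can be found without either side seeing the other's raw list. The key, names, and `blind` helper below are hypothetical.

```python
import hashlib
import hmac

# Hypothetical key provisioned to the enclave, not shared between parties.
SHARED_KEY = b"enclave-provisioned-key"

def blind(customer_id: str) -> str:
    # Keyed hash so identifiers can be matched without being revealed.
    return hmac.new(SHARED_KEY, customer_id.encode(), hashlib.sha256).hexdigest()

retailer_customers = {blind(c) for c in ["alice", "bob", "carol"]}
issuer_flagged = {blind(c) for c in ["bob", "dave"]}

# Only the blinded overlap is computed; raw lists never leave each party.
suspects = retailer_customers & issuer_flagged
assert suspects == {blind("bob")}
```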

Confidential computing may have other benefits unrelated to security. An image-processing application, for example, could store files in the TEE instead of sending a video stream to the cloud, saving bandwidth and reducing latency. The application may even divide up such tasks on the processor level, with the main CPU handling most of the processing, but relying on a TEE on the network interface card for sensitive computations.

Such techniques can also protect algorithms. A machine-learning algorithm, or an analytics application such as a stock trading platform, can live inside the TEE. “You don’t want me to know what stocks you’re trading, and I don’t want you to know the algorithm,” said Martin Reynolds, a technology analyst at Gartner. “In this case, you wouldn’t get my code, and I wouldn’t get your data.”