
Confidential computing and multi-party computation (MPC)

Lara Montoya Laske


The landscape of privacy-preserving computing (PPC) methods


As organizations share sensitive information, concerns about compliance violations and data leaks grow. In the following, we introduce some of the most prominent privacy-preserving computing (PPC) methods and highlight specific use cases of confidential multi-party computation (MPC).

 

Among the most commonly cited technologies in this realm is Fully Homomorphic Encryption (FHE). FHE enables calculations on encrypted data without prior decryption: the output, when decrypted, matches the result of performing the same operations on unencrypted data. The downside of FHE is its enormous computational overhead, which makes it impractical for most real-world use cases.
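
To illustrate the homomorphic property, here is a minimal toy sketch in Go (an illustration only, not FHE and not secure): an untrusted party can add two ciphertexts, and the result decrypts to the sum of the plaintexts.

package main

import "fmt"

// Toy additively homomorphic scheme over the integers mod n (illustration
// only: NOT fully homomorphic and NOT secure). Enc(m, k) = (m + k) mod n,
// so adding ciphertexts adds the underlying plaintexts.
const n = 1_000_003

func encrypt(m, key int) int { return (m + key) % n }
func decrypt(c, key int) int { return ((c-key)%n + n) % n }

func main() {
	m1, k1 := 42, 123456 // party A's value and key
	m2, k2 := 58, 654321 // party B's value and key

	c1, c2 := encrypt(m1, k1), encrypt(m2, k2)

	// An untrusted server can add the ciphertexts without learning m1 or m2.
	cSum := (c1 + c2) % n

	// Whoever holds both keys can decrypt the result: 42 + 58 = 100.
	fmt.Println(decrypt(cSum, (k1+k2)%n))
}

Real FHE schemes support arbitrary additions and multiplications on ciphertexts, which is exactly what makes them so computationally expensive.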

 

Another PPC technology is confidential computing (CC), which also enables encryption of data during processing by leveraging hardware capabilities of the latest CPUs from Intel and AMD. Equipped with Trusted Execution Environments (TEEs), these processors ensure that all data remains encrypted in memory at runtime. Additionally, CC enables workload integrity verification through remote attestation. Unlike FHE, CC is much more practical and is already used in many real-world deployments to protect sensitive workloads. Learn more about CC in our whitepaper on confidential computing or our wiki.
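
To give a flavor of remote attestation, the sketch below shows how a client could verify an enclave's attestation report before trusting it. It assumes EGo's eclient and attestation packages and uses a placeholder for the expected signer measurement; treat it as a sketch under those assumptions, not a complete verification flow, and refer to the EGo documentation for the authoritative API.

package clientside

import (
	"bytes"
	"encoding/hex"
	"errors"
	"fmt"

	"github.com/edgelesssys/ego/attestation"
	"github.com/edgelesssys/ego/eclient"
)

// verifyReport checks that a remote-attestation report was produced by a
// genuine TEE and that the attested enclave is the one we expect.
// expectedSignerHex is a placeholder for the enclave signer's measurement.
func verifyReport(reportBytes []byte, expectedSignerHex string, minSecurityVersion uint) (attestation.Report, error) {
	// Verify the hardware-signed report itself.
	report, err := eclient.VerifyRemoteReport(reportBytes)
	if err != nil {
		return attestation.Report{}, fmt.Errorf("report is not genuine: %w", err)
	}

	// Check that the report was produced by the expected enclave signer.
	expectedSigner, err := hex.DecodeString(expectedSignerHex)
	if err != nil {
		return attestation.Report{}, err
	}
	if !bytes.Equal(report.SignerID, expectedSigner) {
		return attestation.Report{}, errors.New("report was signed by an unexpected enclave")
	}

	// Reject outdated or debug-mode enclaves.
	if report.SecurityVersion < minSecurityVersion {
		return attestation.Report{}, errors.New("enclave security version is too old")
	}
	if report.Debug {
		return attestation.Report{}, errors.New("enclave is running in debug mode")
	}
	return report, nil
}

Only after such checks pass would the client establish a secure channel to the enclave and send its sensitive data.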


Finally, MPC is a branch of cryptography focused on enabling multiple parties to jointly compute a function over their inputs while preserving the privacy of those inputs. In simpler terms, it allows different entities to collaborate on computing a result without revealing their individual data.
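
To make this concrete, here is a minimal Go sketch of additive secret sharing, one of the basic building blocks of MPC (an illustration only: no networking and no protection against malicious parties). Three parties split their private inputs into random shares, exchange the shares, and only the sum of all inputs is ever reconstructed.

package main

import (
	"crypto/rand"
	"fmt"
	"math/big"
)

// Illustration only: three parties compute the sum of their private inputs
// via additive secret sharing over the integers mod p. No single party's
// view reveals anything about the individual inputs.
var p = big.NewInt(1_000_000_007)

// share splits a secret into n random shares that sum to the secret mod p.
func share(secret *big.Int, n int) []*big.Int {
	shares := make([]*big.Int, n)
	sum := new(big.Int)
	for i := 0; i < n-1; i++ {
		r, _ := rand.Int(rand.Reader, p)
		shares[i] = r
		sum.Add(sum, r)
	}
	// The last share makes all shares sum to the secret.
	last := new(big.Int).Sub(secret, sum)
	shares[n-1] = last.Mod(last, p)
	return shares
}

func main() {
	inputs := []*big.Int{big.NewInt(40), big.NewInt(25), big.NewInt(35)} // private inputs

	// Each party shares its input; party j receives one share from everyone.
	received := make([][]*big.Int, 3)
	for _, in := range inputs {
		s := share(in, 3)
		for j := 0; j < 3; j++ {
			received[j] = append(received[j], s[j])
		}
	}

	// Each party sums the shares it received and publishes only that sum.
	partial := make([]*big.Int, 3)
	for j := 0; j < 3; j++ {
		partial[j] = new(big.Int)
		for _, s := range received[j] {
			partial[j].Add(partial[j], s)
		}
		partial[j].Mod(partial[j], p)
	}

	// Adding the published partial sums reveals only the total: 40+25+35 = 100.
	total := new(big.Int)
	for _, ps := range partial {
		total.Add(total, ps)
	}
	fmt.Println(total.Mod(total, p))
}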

 

Unlike conventional cryptographic tasks, which safeguard communication or storage against external parties, MPC ensures privacy among the participants themselves. Its origins trace back to the late 1970s, and by the late 1980s techniques had emerged that allow secure computation without a trusted third party.

 

Other methods include differential privacy, which adds controlled noise to data so that aggregate analysis stays accurate while individual records remain protected, and federated learning, which lets devices collaboratively train a shared AI model without sharing their raw data. Both are complementary to MPC and are often combined with it in practice.
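
As a small illustration of the differential-privacy idea (a sketch only, not a vetted DP implementation), the Go snippet below adds Laplace noise, calibrated to the query's sensitivity and a privacy budget epsilon, before releasing a count.

package main

import (
	"fmt"
	"math"
	"math/rand"
)

// laplaceNoise draws a sample from the Laplace distribution with mean 0 and
// scale b, via inverse transform sampling.
func laplaceNoise(b float64) float64 {
	u := rand.Float64() - 0.5 // uniform in [-0.5, 0.5)
	return -b * math.Copysign(1, u) * math.Log(1-2*math.Abs(u))
}

// noisyCount releases a counting-query result with Laplace noise calibrated
// to the query's sensitivity (1 for a count) and the privacy budget epsilon.
func noisyCount(trueCount int, epsilon float64) float64 {
	const sensitivity = 1.0
	return float64(trueCount) + laplaceNoise(sensitivity/epsilon)
}

func main() {
	// Example: release how many records match some condition, with epsilon = 0.5.
	fmt.Printf("noisy count: %.2f\n", noisyCount(1234, 0.5))
}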

 

Confidential computing-powered MPC applications

 

Historically, applications requiring data-in-use protection, like secure MPC, relied solely on cryptographic techniques, which incur substantial performance overhead. The emergence of confidential computing has transformed this. By processing data within Trusted Execution Environments (TEEs) shielded by CC-enabled chips, we can now build applications that safeguard data in use with minimal performance impact. This advancement opens up a realm of new MPC use cases.

 

Confidential computing-powered MPC platforms can be used wherever joint data analytics is blocked by privacy or compliance concerns: collaborative research, data sharing among competitors, secure financial transactions, and all kinds of secure data pooling. For example, financial institutions can perform joint analytics on encrypted customer and transaction data, enabling better risk assessment, fraud detection, and anti-money-laundering investigations.

 

Real-world examples

 

Multiple applications of MPC have emerged in recent years. One of them is the so-called “data clean room”. Data clean rooms are secure collaborative environments where two or more participants (companies, publishers, teams, or other entities) come together to share and/or combine their data.

 

One example of this kind of technology is the “Privacy Data Exchange” platform from Hope for Justice, a global non-profit organization working to end modern slavery. Hope for Justice partnered with Intel and Edgeless Systems to address the data privacy problem in anti-trafficking efforts, where intelligence needs to be shared among different anti-trafficking agencies. Their collaboration led to the creation of the platform, which leverages Intel's technology and the EGo open-source software. This solution enables secure data sharing and pooling without exposing plaintext data. To learn more about the “Privacy Data Exchange”, read the full case study.

 

The financial sector can also greatly benefit from these technologies. This is why Accenture developed an insurance claim fraud detection platform, also leveraging EGo, designed to help insurers cross-query their claims databases to assess whether a claim is suspicious. The platform alerts the insurance company if it finds multiple instances of certain key indicators (e.g., phone numbers). You can read the full case study here.
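
To sketch the underlying idea (a simplified illustration, not Accenture's actual implementation), the Go snippet below normalizes and hashes key indicators from two insurers' claims, as it might be done inside the enclave, and flags values that appear in both databases. A real deployment would keep the pooled data inside the TEE and use stronger protections than plain hashing.

package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// Simplified illustration of cross-insurer indicator matching as it might run
// inside an enclave: each insurer's key indicators (e.g. phone numbers) are
// normalized and hashed, and values reported by more than one insurer are
// flagged. Not Accenture's actual implementation.

// normalize keeps only digits, so differently formatted numbers match.
func normalize(indicator string) string {
	return strings.Map(func(r rune) rune {
		if r >= '0' && r <= '9' {
			return r
		}
		return -1 // drop spaces, dashes, "+" and other formatting
	}, indicator)
}

func hashIndicator(indicator string) string {
	sum := sha256.Sum256([]byte(normalize(indicator)))
	return hex.EncodeToString(sum[:])
}

func main() {
	// Toy claim indicators submitted by two insurers.
	insurerA := []string{"+49 151 1234567", "+49 160 7654321"}
	insurerB := []string{"+49-151-1234567", "+49 170 0000000"}

	seen := map[string][]string{} // indicator hash -> insurers that reported it
	add := func(indicators []string, insurer string) {
		for _, ind := range indicators {
			h := hashIndicator(ind)
			seen[h] = append(seen[h], insurer)
		}
	}
	add(insurerA, "A")
	add(insurerB, "B")

	// Flag indicators that appear in more than one insurer's database.
	for h, insurers := range seen {
		if len(insurers) > 1 {
			fmt.Printf("suspicious indicator (hash %s...) reported by insurers %v\n", h[:8], insurers)
		}
	}
}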

 

In conclusion, there are many other use cases where such platforms can help businesses optimize their processes by leveraging more data than their own while keeping it protected. Reach out to us to discover how you can create confidential computing-powered data pooling applications, or if you have any questions about this exciting and emerging technology.


Author: Lara Montoya Laske

