Facebook: What Are Privacy-Enhancing Technologies (PETs) and How Will They Apply to Ads?

Privacy-enhancing technologies (PETs) can minimize the amount of data processed to help protect personal information. They can be used in many different contexts, like COVID-19 contact tracing, identifying city relocation trends and sending electronic payments. 

We believe that PETs will support the next generation of digital advertising, which is why we’re investing in a multi-year effort with academics, global organizations and developers to build solutions and best practices. 

Below, we explain more about how these technologies work. 

PETs and Ads

PETs involve advanced techniques drawn from the fields of cryptography and statistics. These techniques help minimize the data that’s processed while preserving critical functionality like ad measurement and personalization.

Let’s take a closer look at three kinds of PETs and how we might use them to build ad personalization or measurement solutions in the future: secure multi-party computation (MPC), on-device learning and differential privacy. 

Secure Multi-Party Computation (MPC)

Secure multi-party computation (MPC) allows two or more organizations to work together while limiting the information that either party can learn. Data is encrypted end-to-end: while in transit, in storage and in use, ensuring neither party can see the other’s data. 

MPC is useful for enhancing privacy while calculating outcomes from more than one party, such as reporting the results of an ad campaign or training a machine-learning model where the data is held by two or more parties. 

Today, this type of reporting requires at least one party to learn which specific people made a purchase after seeing a specific ad. With MPC, suppose one party has information about who saw an ad and another party has information about who made a purchase. MPC and encryption make it possible for both parties to learn insights about how an ad is performing, without entrusting a single party with both data sets. 
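As a toy illustration of the core idea (additive secret sharing, not the actual Private Lift protocol, and with hypothetical counts), each party can split its private number into random-looking shares so that only the combined total is ever revealed:

```python
import secrets

MOD = 2**61 - 1  # large modulus; any single share looks like a random number

def share(value, n_parties=2):
    """Split a private value into shares that sum to it mod MOD.
    Any individual share reveals nothing about the value."""
    shares = [secrets.randbelow(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

def reconstruct(shares):
    return sum(shares) % MOD

# Hypothetical private counts: party A (ad platform) and party B (merchant)
# each secret-share a conversion count they hold.
a_shares = share(57)   # known only to party A
b_shares = share(61)   # known only to party B

# Each share-holder adds the shares it received locally;
# the raw counts are never exchanged.
combined = [(a + b) % MOD for a, b in zip(a_shares, b_shares)]

print(reconstruct(combined))  # 118: only the aggregate is revealed
```

Real MPC protocols for ad measurement are far more involved (they must match records and multiply under encryption, not just add), but the principle is the same: each party computes on shares and learns only the final result.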

And we’ve already started putting MPC to use. Last year, we began testing a solution called Private Lift Measurement, which uses MPC to help advertisers understand performance.

While we expect Private Lift Measurement to be available to all advertisers next year, we’ve open-sourced our framework for Private Computation so any developer can now create privacy-centric measurement products using MPC. 

On-Device Learning

On-device learning trains an algorithm from insights processed right on your device without sending individual data such as an item purchased or your email address to a remote server or cloud. This technology could help us find new ways to show people relevant ads, without needing to ever learn about specific actions individuals take on other apps and websites.

For example, if lots of people who click on ads for exercise equipment also tend to buy protein shakes, on-device learning could help identify that pattern without sending individual data to a Facebook server or cloud. Then, Facebook can use this pattern to find an audience for protein shakes using ads.
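A minimal sketch of that idea, assuming a federated-averaging-style setup (this is an illustration, not Facebook's implementation): each device fits a tiny one-parameter model predicting "buys protein shakes" from "clicked an exercise-equipment ad", and only the model updates, never the raw events, leave the device:

```python
import math

def local_update(w, events, lr=0.5):
    """One on-device gradient step; `events` are private (clicked, bought) pairs."""
    grad = 0.0
    for clicked, bought in events:
        pred = 1 / (1 + math.exp(-w * clicked))  # logistic prediction
        grad += (pred - bought) * clicked
    return w - lr * grad / max(len(events), 1)

def federated_round(w, per_device_events):
    """The server averages the devices' updates; it never sees the events."""
    updates = [local_update(w, ev) for ev in per_device_events]
    return sum(updates) / len(updates)

devices = [
    [(1, 1), (1, 1), (0, 0)],  # device 1's private (clicked, bought) events
    [(1, 1), (0, 0)],          # device 2
    [(1, 0), (1, 1)],          # device 3
]

w = 0.0
for _ in range(100):
    w = federated_round(w, devices)
print(w > 0)  # the shared weight learns the click-to-purchase pattern
```

The key property is in `federated_round`: the server's only input is a per-device number (the updated weight), so individual browsing or purchase events stay on the device.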

Like autocorrect or text prediction, on-device learning improves over time. As millions of devices each make small improvements and identify new patterns, those patterns can train an algorithm to get smarter, so you may see more ads that are relevant to you and fewer that aren't.

On-device learning data can be further protected by combining it with differential privacy.

Differential Privacy

Differential privacy is a technique that can be used on its own, or layered on other privacy-enhancing technologies, to protect individuals in a dataset from being re-identified. 

Differential privacy works by adding carefully calculated "noise" to a dataset. For example, if 118 people bought a product after clicking on an ad, a differentially private system would add or subtract a random amount from that number. So instead of 118, someone using that system would see a number like 120 or 114. 

Adding that small random bit of incorrect information makes it harder to know who actually bought the product after clicking the ad, even if you have a lot of other data. As a result, this technology is often used with large data sets released for public research.
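The standard way to calibrate that noise is the Laplace mechanism. A sketch, assuming a simple count query (a count changes by at most 1 when one person is added or removed, so noise with scale 1/epsilon gives epsilon-differential privacy):

```python
import random

def dp_count(true_count, epsilon=0.5):
    """Return a differentially private version of a count using the
    Laplace mechanism. A count has sensitivity 1, so Laplace noise
    with scale 1/epsilon provides epsilon-differential privacy."""
    # The difference of two exponential samples is a Laplace(0, 1/epsilon) draw.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return round(true_count + noise)

# 118 people really purchased; each query sees a slightly different count.
for _ in range(3):
    print(dp_count(118))
```

Smaller values of `epsilon` mean more noise and stronger privacy; larger values mean answers closer to the true count. Noise drawn fresh per query also means repeated queries consume privacy budget, which real deployments have to track.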

Ensuring privacy throughout our apps while reducing the data we collect is a long-term effort. We’ll share more details about our collaborative efforts in personalization using PETs as they progress.