Privacy is essential to the sustainable success of the advertising ecosystem. This document takes up the W3C TAG's Privacy Principles [[?Privacy-Principles]] and specialises them for advertising-related situations.

This document is a draft of the Private Ad Technologies Community Group; it is intended to be contributed to an eventual Private Ad Technologies Working Group on the Note Track. This document does not yet reflect the consensus of the PATCG.

How This Document Fits In

This document elaborates on the W3C TAG's Privacy Principles [[?Privacy-Principles]]. The latter document is intended to describe privacy principles that apply across the entire Web, and therefore leaves the door open to a variety of approaches so that different use cases can be handled with some flexibility. This document is more specific, detailing how the Web's broader privacy principles are to be understood in an advertising context.

Private Advertising Principles

Advertising-specific privacy principles may address a range of issues. They are organized in the sections below by particular use case, or by common concepts that apply across different use cases.

Measurement

Measurement should be private in order to be safe for widespread use, but should always remain under user control.

Opting out should not be visible

Users may wish to opt out of participation in measurement, and to do so in a way that is not visible to the sites they visit. A visible opt-out could lead to retaliation against, or coercion of, users who do not wish to participate in measurement.
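
As a non-normative sketch (using invented function names and a made-up report format, not any specified API), a client could keep opt-out invisible by emitting a report of the same size and timing regardless of the user's choice, carrying only a dummy contribution when the user has opted out:

    # Hypothetical client-side sketch: names and report format are invented
    # for illustration and do not correspond to any specified API.
    import os

    REPORT_SIZE = 32  # fixed-size reports, so opted-out users look identical

    def build_report(contribution: bytes, opted_out: bool) -> bytes:
        """Produce a report whether or not the user opted out.

        An opted-out user's report carries random (dummy) data instead of a
        real contribution; in a real system the payload would additionally be
        encrypted toward the aggregation service, so sites cannot tell the
        two cases apart.
        """
        if opted_out:
            return os.urandom(REPORT_SIZE)
        return contribution.ljust(REPORT_SIZE, b"\x00")[:REPORT_SIZE]

    # The site receives a same-sized report either way.
    assert len(build_report(b"conversion:1", False)) == len(build_report(b"", True))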

Measurement should not significantly enable cross-context recognition

Differential privacy protections take the form of guarantees that aggregation or added noise makes participation in a particular measurement mostly indistinguishable, while also recognizing that some information (often quantified by parameters such as epsilon) is released and could be combined with other known information to learn something with some (presumably very small) probability. The threshold for "significantly" is not yet detailed here. The aggregated or noised measurement should not reasonably be usable to identify a particular user or to link a user's activity to another context.
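
As a rough, non-normative illustration of the kind of mechanism involved, an aggregator might add Laplace noise calibrated to a sensitivity of 1 and a chosen epsilon before releasing a count; the parameter values below are illustrative only:

    # Illustrative only: Laplace noise scaled by sensitivity/epsilon is one
    # common differential-privacy mechanism for releasing aggregate counts.
    import random

    def noised_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
        scale = sensitivity / epsilon
        # The difference of two exponential draws is a Laplace sample.
        noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
        return true_count + noise

    # Smaller epsilon means more noise, and less information released about
    # any single user's participation in the measurement.
    print(noised_count(10_000, epsilon=1.0))
    print(noised_count(10_000, epsilon=0.1))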

Metrics to define significance are being evaluated by a separate task force.

Because measurement and attribution involve viewing many kinds of advertisements and taking a variety of other actions, across a wide range of contexts, it would be inappropriate to rely on users' understanding of, expectations about, or consent to cross-context recognition as a result of ad measurement.

Measurement should not significantly enable inferences about individual people from their participation in the measurement

Related to cross-context recognition, measurement mechanisms should not reasonably be usable to learn or infer information about a particular user, for example, that a user visited a site (or class of sites) or took a particular online or offline action.

Population-level measurement can still be used for inference; this principle only requires that an individual's participation (or non-participation) in the measurement cannot be used to enable an inference about that individual.
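
For intuition (again illustrative, not normative): under epsilon-differential privacy, the probability of observing any particular released value changes by at most a factor of e^epsilon when one individual's contribution is added or removed, which bounds what that released value can reveal about the individual's participation. A small worked check, with arbitrary example numbers:

    # Worked illustration of the e^epsilon bound on inference about one
    # individual's participation; the numbers are arbitrary examples.
    import math

    def laplace_pdf(x: float, mu: float, scale: float) -> float:
        return math.exp(-abs(x - mu) / scale) / (2 * scale)

    epsilon = 0.5
    scale = 1.0 / epsilon          # sensitivity of 1
    released = 10_000.3            # an observed, noised aggregate

    with_user = laplace_pdf(released, mu=10_000, scale=scale)    # user contributed
    without_user = laplace_pdf(released, mu=9_999, scale=scale)  # user did not

    ratio = with_user / without_user
    assert ratio <= math.exp(epsilon) + 1e-9
    print(f"likelihood ratio {ratio:.3f} <= e^epsilon = {math.exp(epsilon):.3f}")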

Accountability

Users should be able to investigate how data about them is used and shared.

Users should be able to learn what measurements they may participate in.

Users should be able to learn what level of risk of re-identification or cross-context data-sharing they may be exposed to.
See also: comprehensibility.

Researchers, regulators and auditors should be able to investigate how a system is used and whether abuse is occurring.

Researchers should be able to learn what measurements are taking place, in order to identify unexpected or potentially abusive behavior and to explain the implications of the system to users (for whom their own individual data may not provide a satisfying explanation).

Most users will not choose to investigate or be able to interpret individual data about measurements. Independent researchers can provide an important accountability function by identifying potentially significant or privacy-harmful outcomes.

Some privacy harms, including harms to small groups or vulnerable people, cannot reasonably be identified in the individual case, but only through some aggregate analysis.

Auditors, with internal access to at least one of the participating systems, should be able to investigate and document whether abuse has occurred (for example, collusion between supposedly non-colluding helper parties, or interference with results). When evidence of abuse is discovered, affected parties must be notified.

When abuse happens, there must be a mechanism to identify the abuse, limit further access and enable consequences.

Acknowledgements

The following people, in alphabetical order of their first name, were instrumental in producing this document: