This repository has been archived by the owner on Nov 5, 2024. It is now read-only.
The existing advertiser and publisher reports support the private measurement of “single events”, a topic we’ve previously discussed in the PATCG (and developed a high-level consensus here). Both report types naturally support this setting if the conversion ID / Ad ID is unique and you make an “aggregate” query over just a single instance.
Given that we see a lot of value in supporting this setting (particularly for the “publisher reports” which reveal impression features), I have two questions:
1. Can you confirm that these queries are acceptable under the PAM privacy model, i.e. that you believe the DP protection is enough to protect these events? Given that they are technically supported in the existing design, I believe the answer is “yes”, but would like confirmation.
2. If the answer to (1) is “yes”, are you open to considering additional privacy mechanisms that optimize for utility in this setting without regressing differential privacy? For example, slide 6 in this presentation shows that for binary questions the Laplace mechanism has a ~15% higher effective noise rate than randomized response at epsilon = ln(3), while providing the same privacy protection. If you are open to considering alternative mechanisms in this setting, I would be happy to engage constructively on that front, since it’s an area we are actively researching.
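To make the randomized-response side of the comparison concrete, here is a minimal sketch of binary randomized response (the function names are my own, and this is illustrative only, not the PAM mechanism): the report is the true bit with probability e^ε / (1 + e^ε), so the effective noise (flip) rate at ε = ln(3) works out to exactly 1 / (1 + 3) = 25%.

```python
import math
import random

def randomized_response(true_bit, epsilon, rng=random.random):
    """epsilon-DP randomized response for a single bit: report the true
    bit with probability e^eps / (1 + e^eps), otherwise flip it."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if rng() < p_truth else 1 - true_bit

def rr_noise_rate(epsilon):
    """Probability that the reported bit is flipped -- the 'effective
    noise rate' for randomized response on a binary question."""
    return 1.0 / (1.0 + math.exp(epsilon))

# At epsilon = ln(3), the flip probability is 1 / (1 + 3) = 25%.
print(rr_noise_rate(math.log(3)))
```

The corresponding figure for the Laplace mechanism depends on how “effective noise rate” is defined in the linked slides, so I haven’t reproduced it here.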