
Differential privacy over a normal vector


We're given a vector $x\in \mathbb{R}^d$ whose coordinates were sampled from a known normal distribution $\mathcal{N}(0, \sigma^2)$.

How should I send this vector while maintaining (local) differential privacy, with sensitivity defined over its $\ell_2$ norm (i.e., two close vectors should not be distinguishable)? Is there a way to take into account the fact that we know the source distribution?

Thank you!

Bihu Duo: Could you clarify how the addition or removal of an individual affects your vector?
The mechanism should make nearby vectors $\epsilon$-indistinguishable. So instead of the usual formulation, where the neighboring dataset is missing one point, the changed dataset (vector) here is any other vector "nearby" (given an $\ell_2$ bound on the distance).
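To make this formulation concrete, here is a minimal sketch (my own illustration, not a prescribed answer) of the classic Gaussian mechanism calibrated to this metric: adding $\mathcal{N}(0, \sigma_{\text{noise}}^2 I)$ noise with $\sigma_{\text{noise}} = \Delta\sqrt{2\ln(1.25/\delta)}/\epsilon$ makes any two vectors within $\ell_2$ distance $\Delta$ $(\epsilon,\delta)$-indistinguishable (for $\epsilon < 1$). The function name and parameters below are hypothetical; it does not yet exploit knowledge of the source distribution.

```python
import math
import numpy as np

def gaussian_mechanism(x, l2_bound, eps, delta, rng=None):
    """Release x with Gaussian noise calibrated so that any two inputs
    within l2 distance `l2_bound` are (eps, delta)-indistinguishable.
    Uses the standard calibration sigma = l2_bound * sqrt(2 ln(1.25/delta)) / eps,
    valid for eps < 1. Returns the noisy vector and the noise scale used."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = l2_bound * math.sqrt(2.0 * math.log(1.25 / delta)) / eps
    return x + rng.normal(0.0, sigma, size=x.shape), sigma

# Example: a d = 10 vector with coordinates drawn from N(0, 1).
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=10)
y, sigma = gaussian_mechanism(x, l2_bound=0.5, eps=0.5, delta=1e-5, rng=rng)
```

Note that the noise scale depends only on the $\ell_2$ bound, $\epsilon$, and $\delta$, not on the data; using the known prior $\mathcal{N}(0, \sigma^2)$ to reduce error (e.g., by clipping or by a posterior-mean estimate on the receiving side) is exactly the open part of the question.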

