
Computing the differential privacy level of an arbitrary randomised algorithm


I have recently started learning differential privacy for my BTech project. I understand that it adds noise to the input stream based on a privacy level (say $\epsilon$) and a query function (say $f$) in order to protect the input dataset, and that the parameters of the noise distribution are computed from these two quantities alone.
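
For concreteness, I have the standard Laplace mechanism in mind as an example of this dependence (here $\Delta f$ denotes the $\ell_1$-sensitivity of $f$ over neighbouring datasets $D \sim D'$):

$$M(D) = f(D) + \mathrm{Lap}\!\left(\frac{\Delta f}{\epsilon}\right), \qquad \Delta f = \max_{D \sim D'} \lVert f(D) - f(D') \rVert_1,$$

so the scale of the noise distribution is determined by both $\epsilon$ and $f$.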

Now suppose we have a randomized algorithm that adds noise to the input dataset based on its own privacy level (say $\pi$) and does not consider any query function in its computation; it simply produces noise values according to its own privacy formulation.

Now, if I choose a query function (say the same $f$) and try to model the noise produced by this randomized algorithm in order to determine its differential privacy level $\epsilon$, is it possible to establish a relationship between $\epsilon$ and $\pi$?
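
To make the question concrete: if, for example, the algorithm's noise turned out to be Laplace-distributed with some scale $b(\pi)$ depending only on $\pi$ (this is just an assumption for illustration), I would expect a relationship of the form

$$\epsilon = \frac{\Delta f}{b(\pi)},$$

but I do not know how to argue this in general, or whether such a correspondence exists for an arbitrary noise distribution.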

Sumana bagchi: Privacy level helps to determine the distribution parameters of the noise.