Score:1

l-Diversity logarithm

tl flag

I wanted to make a small example for anonymization evaluation using l-Diversity. For that I'm using the following formula for Entropy l-Diversity ($E$ is an equivalence class, $S$ is the set of all possible values of the sensitive attribute, and $s$ a specific value):

$$ \operatorname{Entropy}(E) = - \sum_{s \in S} p(E,s)\cdot \log(p(E,s)) $$

In the paper the authors never define which logarithm is used. It could be base $2$, $e$ or $10$, but I have no idea which is actually meant. Can someone help me?
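
For reference, here is a minimal sketch of how I would compute it (the sample equivalence class and the choice of base are just placeholders):

```python
import math
from collections import Counter

def entropy(sensitive_values, base=math.e):
    """Entropy(E) = -sum over s of p(E, s) * log(p(E, s)),
    where p(E, s) is the fraction of records in E whose sensitive value is s."""
    counts = Counter(sensitive_values)
    n = len(sensitive_values)
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())

# Hypothetical equivalence class (sensitive attribute: disease)
E = ["flu", "flu", "cancer", "hepatitis"]
print(entropy(E))           # base e -> ~1.04 nats
print(entropy(E, base=2))   # base 2 -> 1.5 bits
```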

Score:1
ru flag

Entropy and other measures of information can be defined to any base, and so should always be quoted with units (bits/shannons for base 2, nats for base e, and bans/hartleys for base 10). The most common base is base 2, but this is by no means universal.
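
Switching between bases only rescales the score by a constant factor ($H_b = H_e / \ln b$), so as long as you state the unit, any base gives an equivalent result. A quick sketch with an arbitrary example distribution:

```python
import math

p = [0.5, 0.25, 0.25]                          # arbitrary example distribution
h_nats = -sum(x * math.log(x) for x in p)      # base e -> nats
h_bits = -sum(x * math.log2(x) for x in p)     # base 2 -> bits
assert abs(h_bits - h_nats / math.log(2)) < 1e-12
```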

Titanlord avatar
tl flag
The problem is that the result should be a standardised score. I saw that there is a single proof in the appendix where they evaluate some values; that points to the natural logarithm, i.e. base $e$.
Paul Uszak avatar
cn flag
@Titanlord Yet in cryptography, it's almost universally base 2. That leads to bits.
