How to quantify privacy, when using homomorphic encryption?

How can you measure how secure or private the encrypted variables are relative to the original (plaintext) variables?

I want to compare homomorphic encryption with differential privacy in combination with machine learning models, perhaps using a measure such as the Kullback–Leibler divergence. However, that would require the distribution of the encrypted variables.

I'm using the TenSEAL Python package and a random forest.
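To make the kind of comparison I have in mind concrete, here is a minimal sketch of measuring KL divergence between an original feature and a perturbed version of it. Note this only works directly for the differential-privacy side: I use a hypothetical Laplace-mechanism perturbation with an assumed sensitivity of 1 and a synthetic feature, since for homomorphically encrypted data (e.g. TenSEAL/CKKS ciphertexts) the "encrypted variables" are polynomial objects whose distribution is not directly comparable to the plaintext this way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical original feature values (synthetic, for illustration only).
x = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Differentially private version via the Laplace mechanism.
# Sensitivity is assumed to be 1 here purely for the sketch.
epsilon = 1.0
x_dp = x + rng.laplace(loc=0.0, scale=1.0 / epsilon, size=x.shape)

# Estimate both distributions on a shared set of histogram bins.
bins = np.histogram_bin_edges(np.concatenate([x, x_dp]), bins=50)
p, _ = np.histogram(x, bins=bins)
q, _ = np.histogram(x_dp, bins=bins)

# Smooth to avoid zero-count bins, then normalize to probability vectors.
p = (p + 1e-12) / (p + 1e-12).sum()
q = (q + 1e-12) / (q + 1e-12).sum()

# KL(p || q) in nats: larger values mean the DP output's distribution
# has drifted further from the original feature's distribution.
kl = float(np.sum(p * np.log(p / q)))
print(f"KL divergence between original and DP feature: {kl:.4f}")
```

Sweeping `epsilon` and plotting the resulting KL values would give one curve of "privacy vs. distortion", which could then be set against the model-accuracy cost of each setting.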

