Citation
Bindschaedler, V.; Rane, S.; Brito, A.; Rao, V.; Uzun, E. Achieving Differential Privacy in Secure Multiparty Data Aggregation Protocols on Star Networks. In Proceedings of the ACM Conference on Data and Application Security and Privacy (CODASPY).
Abstract
We consider the problem of privacy-preserving data aggregation in a star network topology, i.e., several untrusting participants connected to a single aggregator. We require that the participants do not discover each other's data, and that the service provider remains oblivious to each participant's individual contribution. Furthermore, the final result is to be published in a differentially private manner, i.e., the result should not reveal the contribution of any single participant to a (possibly external) adversary who knows the contributions of all other participants. In other words, we require a secure multiparty computation protocol that also incorporates a differentially private mechanism. Previous solutions have resorted to caveats such as postulating a trusted dealer to distribute keys to the participants, introducing additional entities to withhold the decryption key from the aggregator, or relaxing the star topology by allowing pairwise communication among the participants. In this paper, we show how to obtain a noisy (differentially private) aggregation result using Shamir secret sharing and additively homomorphic encryption without these mitigating assumptions. More importantly, while we assume semi-honest participants, we allow the aggregator to be stronger than semi-honest, specifically in the sense that he can try to reduce the noise in the differentially private result. To respect the differential privacy requirement, collusions of mutually untrusting entities need to be analyzed differently from traditional secure multiparty computation: it is not sufficient that such collusions do not reveal the data of honest participants; we must also ensure that the colluding entities cannot undermine differential privacy by reducing the amount of noise in the final result. Our protocols avoid this by requiring that no entity, neither the aggregator nor any participant, knows how much noise a participant contributes to the final result. We also ensure that if a cheating aggregator tries to influence the noise term in the differentially private output, he can be detected with overwhelming probability.
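To give a concrete feel for the distributed-noise idea the abstract describes, the sketch below shows one standard way such a guarantee can be arranged: each participant adds a small Gamma-distributed noise term to its own input before masking it, so that the sum of all per-participant noise terms is Laplace-distributed and no single entity knows the total noise. This is only a minimal illustrative sketch, not the paper's protocol; the paper uses Shamir secret sharing and additively homomorphic encryption on a star topology, whereas the constants (PRIME, SCALE, EPSILON, SENSITIVITY) and the simple additive masking here are assumptions made for illustration.

```python
# Illustrative sketch (NOT the paper's protocol): distributed Laplace noise
# plus additive masking, so only a noisy sum is ever reconstructed.
# All parameters below are assumed for the example.
import secrets
import numpy as np

PRIME = 2**61 - 1        # field modulus for additive shares (assumed)
SCALE = 10**6            # fixed-point scaling for real-valued noise (assumed)
N_PARTICIPANTS = 5
EPSILON = 1.0            # differential-privacy parameter (assumed)
SENSITIVITY = 1.0        # assumed bound on each participant's contribution


def encode(x: float) -> int:
    """Fixed-point encode a real value into the field."""
    return int(round(x * SCALE)) % PRIME


def noisy_shares(value: float, n_shares: int, rng: np.random.Generator):
    """Participant side: add a partial noise term, then split into additive shares.

    The sum over all participants of Gamma(1/n, b) - Gamma(1/n, b) terms is
    Laplace(0, b)-distributed, so the reconstructed total is protected by the
    Laplace mechanism even though no single party knows the total noise.
    """
    b = SENSITIVITY / EPSILON
    partial_noise = (rng.gamma(1.0 / N_PARTICIPANTS, b)
                     - rng.gamma(1.0 / N_PARTICIPANTS, b))
    secret = encode(value + partial_noise)
    shares = [secrets.randbelow(PRIME) for _ in range(n_shares - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares


def aggregate(all_shares) -> float:
    """Aggregator side: only the noisy total is recovered, never an individual value.

    (In a real deployment the shares would be distributed across parties; they
    are collected in one place here only to demonstrate reconstruction.)
    """
    total = sum(sum(s) for s in all_shares) % PRIME
    if total > PRIME // 2:   # map back to a signed value
        total -= PRIME
    return total / SCALE


rng = np.random.default_rng()
values = [3.0, 1.0, 4.0, 1.0, 5.0]                      # private inputs
shares = [noisy_shares(v, N_PARTICIPANTS, rng) for v in values]
print("noisy sum ~", aggregate(shares))                 # near 14, perturbed by Laplace noise
```

The paper's protocols additionally address the threats the abstract highlights, namely collusions that try to strip out noise and an aggregator that tampers with the noise term, which this toy sketch does not attempt to model.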