
Private and Communication-Efficient Algorithms for Entropy Estimation

Bravo-Hermsdorff, G; Busa-Fekete, R; Ghavamzadeh, M; Medina, AM; Syed, U; (2022) Private and Communication-Efficient Algorithms for Entropy Estimation. In: Koyejo, S and Mohamed, S and Agarwal, A and Belgrave, D and Cho, K and Oh, A, (eds.) Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS: New Orleans, LA, USA. Green open access

Text: 11037_private_and_communication_effi.pdf - Published Version (629kB)

Abstract

Modern statistical estimation is often performed in a distributed setting where each sample belongs to a single user who shares their data with a central server. Users are typically concerned with preserving the privacy of their samples, and also with minimizing the amount of data they must transmit to the server. We give improved private and communication-efficient algorithms for estimating several popular measures of the entropy of a distribution. All of our algorithms have constant communication cost and satisfy local differential privacy. For a joint distribution over many variables whose conditional independence is given by a tree, we describe algorithms for estimating Shannon entropy that require a number of samples that is linear in the number of variables, compared to the quadratic sample complexity of prior work. We also describe an algorithm for estimating Gini entropy whose sample complexity has no dependence on the support size of the distribution and can be implemented using a single round of concurrent communication between the users and the server. In contrast, the previously best-known algorithm has high communication cost and requires the server to facilitate interaction between the users. Finally, we describe an algorithm for estimating collision entropy that matches the space and sample complexity of the best known algorithm but generalizes it to the private and communication-efficient setting.
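For reference, the three entropy measures named in the abstract have standard definitions: Shannon entropy H(p) = -Σ p_i log p_i, collision entropy H_2(p) = -log Σ p_i², and Gini entropy 1 - Σ p_i². The sketch below is only an illustration of these quantities: it computes naive plug-in estimates from raw, centralized samples, whereas the paper's algorithms estimate them under local differential privacy with constant per-user communication. All names and the toy data are assumptions for the example.

```python
import numpy as np

def plug_in_entropies(samples, support_size):
    """Naive plug-in estimates of the entropy measures discussed in the
    abstract, from i.i.d. samples of a discrete distribution.
    Illustrative only: this is not the paper's private,
    communication-efficient method."""
    counts = np.bincount(samples, minlength=support_size)
    p = counts / counts.sum()            # empirical distribution
    nz = p[p > 0]                        # avoid log(0)
    shannon = -np.sum(nz * np.log(nz))   # H(p)   = -sum_i p_i log p_i
    collision = -np.log(np.sum(p ** 2))  # H_2(p) = -log sum_i p_i^2
    gini = 1.0 - np.sum(p ** 2)          # G(p)   = 1 - sum_i p_i^2
    return shannon, collision, gini

# Toy example: 10,000 samples from a random distribution over 50 symbols.
rng = np.random.default_rng(0)
dist = rng.dirichlet(np.ones(50))
samples = rng.choice(50, size=10_000, p=dist)
print(plug_in_entropies(samples, support_size=50))
```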

Type: Proceedings paper
Title: Private and Communication-Efficient Algorithms for Entropy Estimation
Event: 36th Conference on Neural Information Processing Systems (NeurIPS 2022)
ISBN-13: 9781713871088
Open access status: An open access version is available from UCL Discovery
Publisher version: https://proceedings.neurips.cc/paper_files/paper/2...
Language: English
Additional information: This version is the version of record. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Shannon entropy, collision entropy, Gini entropy, differentially private algorithms, communication-efficient algorithms
URI: https://discovery-pp.ucl.ac.uk/id/eprint/10186003
