
# Observations on the bias of nonnegative mechanisms for differential privacy

Corresponding author: Oliver Mason

This work is supported by SFI grant 13/RC/2094.

We study two methods for differentially private analysis of bounded data and extend these to nonnegative queries. We first recall that, for the Laplace mechanism, both boundary inflated truncation (BIT) applied to nonnegative queries and truncation lead to strictly positive bias. We then consider a generalisation of BIT using translated ramp functions and explicitly characterise the optimal function in this class for worst-case bias. We show that applying any square-integrable post-processing function to a Laplace mechanism leads to a strictly positive maximal absolute bias. A corresponding result is shown for a generalisation of truncation, which we refer to as restriction. Finally, we briefly consider an alternative approach based on multiplicative mechanisms for positive data and show that, without additional restrictions, these mechanisms can lead to infinite bias.
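To make the bias claims concrete, the following is a minimal simulation sketch (not taken from the paper; the function names and parameter values are illustrative assumptions). It compares the empirical bias of the two nonnegativity-enforcing post-processing schemes the abstract mentions: BIT, which clamps negative outputs of the Laplace mechanism to zero, and truncation, which resamples until the output is nonnegative. Under either scheme the empirical bias comes out strictly positive, consistent with the stated results.

```python
import math
import random


def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = rng.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def bit_mechanism(true_value: float, scale: float, rng: random.Random) -> float:
    """Boundary inflated truncation: clamp negative outputs to the boundary 0."""
    return max(0.0, true_value + laplace_noise(scale, rng))


def truncated_mechanism(true_value: float, scale: float, rng: random.Random) -> float:
    """Truncation: resample the noisy output until it is nonnegative."""
    while True:
        out = true_value + laplace_noise(scale, rng)
        if out >= 0.0:
            return out


rng = random.Random(0)
true_value, scale, n = 1.0, 1.0, 200_000  # illustrative query value and noise scale

bit_bias = sum(bit_mechanism(true_value, scale, rng) for _ in range(n)) / n - true_value
trunc_bias = sum(truncated_mechanism(true_value, scale, rng) for _ in range(n)) / n - true_value

# Both empirical biases are strictly positive, as the paper's results predict.
print(f"BIT bias ≈ {bit_bias:.3f}, truncation bias ≈ {trunc_bias:.3f}")
```

For these parameters the BIT bias can also be computed in closed form as $\tfrac{1}{2}e^{-1} \approx 0.184$, which the simulation reproduces; the truncation bias is larger still, since all of the clipped probability mass is redistributed over $[0, \infty)$.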

Mathematics Subject Classification: Primary: 68P27; Secondary: 62E99.
