A note on bounded distance-based information loss metrics for statistical disclosure control of numeric microdata

arXiv (Cornell University), 2023

Abstract
In the field of statistical disclosure control, the tradeoff between data confidentiality and data utility is measured by comparing disclosure risk and information loss metrics. Distance-based metrics such as the mean absolute error (MAE), mean squared error (MSE), mean variation (IL1), and its scaled alternative (IL1s) are popular information loss measures for numerical microdata. However, the fact that these measures are unbounded makes it difficult to compare them against disclosure risk measures, which are usually bounded between 0 and 1. In this note, we propose rank-based versions of the MAE and MSE metrics that are bounded in the same range as the disclosure risk metrics. We empirically compare the proposed bounded metrics against the distance-based metrics in a series of experiments where the metrics are evaluated over multiple masked datasets, generated by the application of increasing amounts of perturbation (e.g., by adding increasing amounts of noise). Our results show that the proposed bounded metrics produce rankings similar to those of the traditional ones (as measured by Spearman correlation), suggesting that they are viable additions to the toolbox of distance-based information loss metrics currently in use in the SDC literature.
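The abstract does not give the exact definition of the rank-based metrics, so the following is only a minimal sketch of the general idea: replace each column's values by their ranks in the original and masked data, then normalize the mean absolute rank displacement so the result lies in [0, 1]. The function name `bounded_rank_mae` and the normalization by `n - 1` are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def bounded_rank_mae(original, masked):
    """Sketch of a rank-based, bounded MAE-style information loss metric.

    Assumed construction (not the paper's exact definition): compare the
    rank of each record in the original vs. masked column and normalize
    the mean absolute rank difference by the largest possible per-record
    displacement (n - 1), so the result lies in [0, 1]:
    0 = ranks unchanged, larger = more rank displacement.
    Ties are broken arbitrarily in this simple ordinal-rank version.
    """
    original = np.asarray(original, dtype=float)
    masked = np.asarray(masked, dtype=float)
    n = len(original)
    # Ordinal ranks 0..n-1 via the double-argsort trick.
    r_orig = np.argsort(np.argsort(original))
    r_mask = np.argsort(np.argsort(masked))
    # Mean absolute rank displacement, scaled into [0, 1].
    return np.abs(r_orig - r_mask).mean() / (n - 1)
```

With no perturbation the metric is 0; swapping the ranks of two out of four records gives 2/(4*3) = 1/6, and the value stays in [0, 1] as perturbation grows, which is what makes it directly comparable to disclosure risk measures bounded in the same range.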
Keywords
statistical disclosure control,information loss metrics,distance-based