Informing Users about Data Imputation: Exploring the Design Space for Dealing With Non-Responses

Ananya Bhattacharjee, Haochen Song, Xuening Wu, Justice Tomlinson, Mohi Reza, Akmar Ehsan Chowdhury, Nina Deliu, Thomas W. Price, Joseph Jay Williams

Proceedings of the AAAI Conference on Human Computation and Crowdsourcing (2023)

Abstract
Machine learning algorithms often require quantitative ratings from users to effectively predict helpful content. When these ratings are unavailable, systems make implicit assumptions or imputations to fill in the missing information; however, users are generally kept unaware of these processes. In our work, we explore ways of informing users about system imputations, and experiment with presenting imputed ratings along with the explanations users need to correct them. We investigate these approaches through the deployment of a text-messaging probe to 26 participants to help them manage psychological well-being. We report quantitative results on users' reactions to correct versus incorrect imputations and the potential risk of biasing their ratings. Using semi-structured interviews with participants, we characterize the potential trade-offs regarding user autonomy, and draw insights about alternative ways of involving users in the imputation process. Our findings provide useful directions for future research on communicating system imputations and interpreting user non-responses.
Keywords
data imputation, non-responses
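The abstract does not specify which imputation scheme the deployed system used. As a point of reference only, the sketch below shows one common baseline, per-user mean imputation over a ratings table, with an explicit flag marking imputed values so a system could surface them to users as the paper discusses. The column names and pandas-based setup are illustrative assumptions, not details from the paper.

```python
import pandas as pd

# Hypothetical ratings log: one row per message sent to a participant.
# A missing value (NaN) marks a non-response; column names are illustrative.
ratings = pd.DataFrame({
    "participant_id": [1, 1, 2, 2, 3],
    "message_id":     [10, 11, 10, 11, 10],
    "rating":         [4.0, None, 5.0, 2.0, None],  # e.g., 1-5 helpfulness scale
})

# Flag non-responses so the system can later tell users which values were imputed.
ratings["imputed"] = ratings["rating"].isna()

# Simple per-participant mean imputation: fill each non-response with that
# participant's average observed rating, falling back to the global mean
# when a participant has no observed ratings at all.
global_mean = ratings["rating"].mean()
ratings["rating"] = (
    ratings.groupby("participant_id")["rating"]
    .transform(lambda r: r.fillna(r.mean()))
    .fillna(global_mean)
)

print(ratings)
```

Keeping the `imputed` flag alongside the filled-in values is one way to support the paper's theme: the system retains enough information to disclose imputations to users and to let them correct the imputed ratings.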