Just Resource Allocation? How Algorithmic Predictions and Human Notions of Justice Interact

Economics and Computation (2022)

Abstract
We examine justice in data-aided decisions in the context of a scarce societal resource allocation problem. Non-experts (recruited on Amazon Mechanical Turk) have to determine which homeless households to serve with limited housing assistance. We empirically elicit decision-maker preferences for whether to prioritize more vulnerable households or households who would best take advantage of more intensive interventions. We present three main findings. (1) When vulnerability or outcomes are quantitatively conceptualized and presented, humans (at a single point in time) are remarkably consistent in making either vulnerability- or outcome-oriented decisions. (2) Prior exposure to quantitative outcome predictions has a significant effect and changes the preferences of human decision-makers from vulnerability-oriented to outcome-oriented about one-third of the time. (3) Presenting algorithmically-derived risk predictions in addition to household descriptions reinforces decision-maker preferences. Among the vulnerability-oriented, presenting the risk predictions leads to a significant increase in allocations to the more vulnerable household, whereas among the outcome-oriented it leads to a significant decrease in allocations to the more vulnerable household. These findings emphasize the importance of explicitly aligning data-driven decision aids with system-wide allocation goals.