Recommendation fairness and where to find it: An empirical study on fairness of user recommender systems.

2023 IEEE International Conference on Big Data (BigData)

Abstract
Recommender systems play a crucial role in how users consume information and establish new social relations. However, different factors (such as the data collection process, the designed recommendation model, or even the interpretation of findings) could make recommenders unintentionally prone to biases, favouring certain user groups or items and thus producing unfair outcomes. Recommenders also face fairness criticism for inducing filter bubbles, echo chambers, and, more generally, facilitating opinion manipulation. In this work, we study the impact of user recommender systems on fairness. To this end, we carry out a user recommendation task on a politically polarized Twitter data collection. Then, we evaluate how the different politically aligned user groups experience recommendation quality. Finally, we explore causal models to identify data- and model-related features that could affect the fairness of recommender outcomes. Our study shows that political alignment is associated with the unfairness of recommenders, affecting not only the relevance of recommendations but also their diversity and the resulting interaction patterns.
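The group-level evaluation described above can be sketched as follows: compute a recommendation quality metric (here precision@k) per user, aggregate by political group, and report the gap between groups as an unfairness signal. This is a minimal illustrative sketch; the data, function names, and choice of metric are assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch: compare average recommendation quality
# (precision@k) between politically aligned user groups.
# All data below is toy data, not from the paper.

def precision_at_k(recommended, relevant, k):
    """Fraction of the top-k recommended users that are relevant."""
    top_k = recommended[:k]
    return sum(1 for u in top_k if u in relevant) / k

def group_quality_gap(recs, ground_truth, groups, k=3):
    """Mean precision@k per group, and the absolute gap across groups."""
    per_group = {}
    for user, recommended in recs.items():
        g = groups[user]
        per_group.setdefault(g, []).append(
            precision_at_k(recommended, ground_truth[user], k))
    means = {g: sum(v) / len(v) for g, v in per_group.items()}
    gap = max(means.values()) - min(means.values())
    return means, gap

# Toy example: two users per (assumed) political group.
recs = {"u1": ["a", "b", "c"], "u2": ["d", "e", "f"],
        "u3": ["g", "h", "i"], "u4": ["j", "k", "l"]}
truth = {"u1": {"a", "b"}, "u2": {"d"},
         "u3": {"x"}, "u4": {"j"}}
groups = {"u1": "left", "u2": "left", "u3": "right", "u4": "right"}

means, gap = group_quality_gap(recs, truth, groups, k=3)
print(means, gap)  # the gap quantifies quality disparity between groups
```

A large gap would suggest one group systematically receives lower-quality recommendations, which is the kind of disparity the paper's causal analysis then tries to trace back to data- and model-related features.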
Keywords
Recommender Systems, Twitter, User Groups, Causal Model, Echo Chambers, Relevant Recommendations, Recommendation Model, Filter Bubbles, Political Alignment, Interactive, General Linear Model, Akaike Information Criterion, User Experience, Dynamic Network, Matrix Factorization, Human-computer Interaction, Kullback-Leibler, Causal Analysis, Clustering Coefficient, User Preferences, Correlation In Group, Sensitive Attributes, Half Of Users, Interaction Ratio, Affordable Care Act, Collaborative Filtering, Anti-discrimination, High Calibration, Notions Of Fairness, Sensitive Features