Fundamental Properties of Causal Entropy and Information Gain
CoRR (2024)
Abstract
Recent developments enable the quantification of causal control given a
structural causal model (SCM). This has been accomplished by introducing
quantities that encode changes in the entropy of one variable when intervening
on another. These measures, named causal entropy and causal information gain,
aim to address limitations of existing information-theoretic approaches for
machine learning tasks where causality plays a crucial role. However, they have
not yet been studied rigorously from a mathematical standpoint. Our research
contributes to the formal understanding of causal entropy and causal
information gain by establishing and analyzing fundamental properties of these
concepts, including bounds and chain rules. Furthermore, we elucidate the
relationship between causal entropy and stochastic interventions. We also
propose definitions for causal conditional entropy and causal conditional
information gain. Overall, this exploration paves the way for enhancing causal
machine learning tasks through the study of recently proposed
information-theoretic quantities grounded in considerations about causality.
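To make the abstract's central quantities concrete, here is a minimal sketch on a toy SCM. It assumes (this is an illustration, not the paper's formal definitions) that the causal entropy of Y with respect to X is the expected entropy of Y under interventions do(X = x), averaged over a chosen intervention distribution, and that causal information gain is the resulting reduction relative to the observational entropy of Y. The two-variable SCM, the noise level `eps`, and the uniform intervention distribution are all made up for the example.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution given as a dict."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Toy SCM (assumption for illustration): binary X -> Y,
# where Y copies X but is flipped with probability eps.
eps = 0.1                      # mechanism noise level (hypothetical)
p_x = {0: 0.5, 1: 0.5}         # intervention distribution over X (hypothetical)

def p_y_given_do_x(x):
    """Distribution of Y after the atomic intervention do(X = x)."""
    return {x: 1 - eps, 1 - x: eps}

# Observational distribution of Y (here X is exogenous, so it is the mixture
# of the interventional distributions weighted by p_x).
p_y = {y: sum(p_x[x] * p_y_given_do_x(x)[y] for x in p_x) for y in (0, 1)}

# Causal entropy of Y w.r.t. X: expected entropy of Y under do(X = x),
# averaged over the intervention distribution on X.
causal_entropy = sum(p_x[x] * entropy(p_y_given_do_x(x)) for x in p_x)

# Causal information gain: reduction in H(Y) achieved by intervening on X.
causal_info_gain = entropy(p_y) - causal_entropy
```

With `eps = 0.1`, intervening pins Y down almost completely, so the causal entropy is small (about 0.469 bits) while H(Y) = 1 bit, giving a causal information gain of about 0.531 bits; this is the kind of "causal control" the abstract refers to.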