Higher-order Common Information

arXiv (2024)

Abstract
We present a new notion R_ℓ of higher-order common information, which quantifies the information that ℓ ≥ 2 arbitrarily distributed random variables have in common. We provide analytical lower bounds on R_3 and R_4 for jointly Gaussian sources and computable lower bounds on R_ℓ for any ℓ and any sources. We also provide a practical method to estimate these lower bounds on, e.g., real-world time-series data. As an example, we consider EEG data acquired in a setup with competing acoustic stimuli. We demonstrate that R_3 has descriptive properties that are not captured by R_2. Moreover, we observe a linear relationship between the amount of common information R_3 communicated from the acoustic stimuli to the brain and the corresponding cortical activity in terms of neural tracking of the envelopes of the stimuli.
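The abstract does not specify the form of the paper's lower bounds or estimators. For orientation only, the minimal sketch below estimates the classical Wyner common information of a bivariate Gaussian pair from a sample correlation, i.e., the ℓ = 2 (R_2) baseline that the higher-order notion R_ℓ generalizes; it is not the paper's R_3 or R_4 bound, and the function name is illustrative.

```python
import numpy as np

def wyner_ci_bivariate_gaussian(x, y):
    """Classical Wyner common information (in nats) of a bivariate Gaussian
    pair, estimated from the sample correlation of x and y.
    Assumption: this is the well-known l = 2 (R_2-level) quantity
    0.5 * log((1 + |rho|) / (1 - |rho|)), not the paper's higher-order bound."""
    rho = np.corrcoef(x, y)[0, 1]              # sample correlation coefficient
    rho = np.clip(abs(rho), 0.0, 1.0 - 1e-12)  # guard against |rho| -> 1
    return 0.5 * np.log((1.0 + rho) / (1.0 - rho))

# Toy usage: two time series sharing a common Gaussian component.
rng = np.random.default_rng(0)
z = rng.standard_normal(10_000)
x = z + 0.5 * rng.standard_normal(10_000)
y = z + 0.5 * rng.standard_normal(10_000)
print(wyner_ci_bivariate_gaussian(x, y))
```

In practice one would replace the toy series with, e.g., stimulus-envelope and EEG channels, but the estimators for the higher-order bounds R_3 and R_4 are defined in the paper itself, not here.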