Directed information is an information-theoretic measure that quantifies the information flow from the random string $X^n = (X_1, X_2, \dots, X_n)$ to the random string $Y^n = (Y_1, Y_2, \dots, Y_n)$. The term directed information was coined by James Massey and is defined as[1]

$$I(X^n \to Y^n) \triangleq \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1}),$$

where $I(X^i; Y_i \mid Y^{i-1})$ is the conditional mutual information $I(X_1, \dots, X_i; Y_i \mid Y_1, \dots, Y_{i-1})$.
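Because the definition is a finite sum of conditional mutual informations, it can be evaluated exactly whenever the joint distribution of the two strings is available. The following is a minimal plug-in sketch, not from the source: the function name, the dictionary representation of the pmf, and the noiseless toy channel are illustrative assumptions.

```python
from collections import defaultdict
from math import log2

def directed_information(pmf, n):
    """Plug-in evaluation of I(X^n -> Y^n) = sum_i I(X^i; Y_i | Y^{i-1}).

    pmf maps pairs of length-n tuples ((x_1,...,x_n), (y_1,...,y_n)) to
    probabilities; alphabets are assumed finite.
    """
    total = 0.0
    for i in range(1, n + 1):
        # Marginals needed for the i-th term I(X^i; Y_i | Y^{i-1}).
        p_xyy, p_yy, p_xy, p_y = (defaultdict(float) for _ in range(4))
        for (xs, ys), p in pmf.items():
            p_xyy[(xs[:i], ys[:i - 1], ys[i - 1])] += p
            p_yy[(ys[:i - 1], ys[i - 1])] += p
            p_xy[(xs[:i], ys[:i - 1])] += p
            p_y[ys[:i - 1]] += p
        # I(X^i; Y_i | Y^{i-1}) as a sum over the support of
        # p * log [p(x^i,y^i) p(y^{i-1}) / (p(x^i,y^{i-1}) p(y^{i-1},y_i))].
        for (xi, ypast, yi), p in p_xyy.items():
            if p > 0:
                total += p * log2(p * p_y[ypast]
                                  / (p_xy[(xi, ypast)] * p_yy[(ypast, yi)]))
    return total

# Toy check: a noiseless channel Y_i = X_i over two i.i.d. fair bits,
# for which each of the two terms contributes one bit.
pmf = {((a, b), (a, b)): 0.25 for a in (0, 1) for b in (0, 1)}
print(directed_information(pmf, 2))  # -> 2.0
```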
Directed information has applications to problems where causality plays an important role, such as the capacity of channels with feedback,[1][2][3][4] the capacity of discrete memoryless networks,[5] the capacity of networks with in-block memory,[6] gambling with causal side information,[7] compression with causal side information,[8] communication in real-time control settings,[9][10][11] and statistical physics.[12]
The essence of directed information is causal conditioning. The probability of $x^n$ causally conditioned on $y^n$ is defined as[5]

$$P(x^n \| y^n) \triangleq \prod_{i=1}^{n} P(x_i \mid x^{i-1}, y^{i}).$$
This is similar to the chain rule for conventional conditioning, $P(x^n \mid y^n) = \prod_{i=1}^{n} P(x_i \mid x^{i-1}, y^{n})$, except one conditions on "past" and "present" symbols $y^{i}$ rather than all symbols $y^{n}$. To include "past" symbols only, one can introduce a delay by prepending a constant symbol:

$$P(x^n \| (0, y^{n-1})) \triangleq \prod_{i=1}^{n} P(x_i \mid x^{i-1}, y^{i-1}).$$
It is common to abuse notation by writing $P(x^n \| y^{n-1})$ for this delayed expression, although formally all strings should have the same number of symbols.
One may also condition on multiple strings: $P(x^n \| y^n, z^n) \triangleq \prod_{i=1}^{n} P(x_i \mid x^{i-1}, y^{i}, z^{i})$.
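Both products can be evaluated directly from a joint pmf by summing out the remaining symbols. A minimal sketch under the same assumptions as the earlier example (the function name and the `delay` flag are illustrative):

```python
def causal_cond(pmf, xs, ys, n, delay=0):
    """P(x^n || y^n) = prod_i P(x_i | x^{i-1}, y^i) for delay=0, or the
    delayed variant prod_i P(x_i | x^{i-1}, y^{i-1}) for delay=1.

    Assumes (xs, ys) lies in the support of pmf, so no conditioning
    event has probability zero.
    """
    prod = 1.0
    for i in range(1, n + 1):
        k = i - delay  # number of y-symbols conditioned on at step i
        num = sum(p for (a, b), p in pmf.items()
                  if a[:i] == xs[:i] and b[:k] == ys[:k])
        den = sum(p for (a, b), p in pmf.items()
                  if a[:i - 1] == xs[:i - 1] and b[:k] == ys[:k])
        prod *= num / den
    return prod

# For the noiseless toy pmf, P(x^2 || y^2) is 1 when x^2 = y^2.
print(causal_cond(pmf, (0, 1), (0, 1), 2))  # -> 1.0
```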
Causally conditioned entropy
The causally conditioned entropy is defined as:[2]

$$H(X^n \| Y^n) = \mathbf{E}\left[-\log P(X^n \| Y^n)\right] = \sum_{i=1}^{n} H(X_i \mid X^{i-1}, Y^{i}).$$
Similarly, one may causally condition on multiple strings and write

$$H(X^n \| Y^n, Z^n) = \mathbf{E}\left[-\log P(X^n \| Y^n, Z^n)\right].$$
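Under the same plug-in assumptions, the expectation can be evaluated by reusing `causal_cond` from the sketch above (`causal_entropy` is an illustrative name, not from the source):

```python
def causal_entropy(pmf, n, delay=0):
    """H(X^n || Y^n) = E[-log2 P(X^n || Y^n)], or its delayed variant."""
    return -sum(p * log2(causal_cond(pmf, xs, ys, n, delay))
                for (xs, ys), p in pmf.items() if p > 0)
```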
Properties
A decomposition rule for causal conditioning[1] is

$$P(x^n, y^n) = P(x^n \| (0, y^{n-1}))\, P(y^n \| x^n).$$

This rule shows that any product of $P(x^n \| (0, y^{n-1}))$ and $P(y^n \| x^n)$ gives a joint distribution $P(x^n, y^n)$.
Directed information can be written in terms of causal conditioning:[2]

$$I(X^n \to Y^n) = \mathbf{E}\left[\log \frac{P(Y^n \| X^n)}{P(Y^n)}\right] = H(Y^n) - H(Y^n \| X^n).$$
The relation generalizes to three strings: the directed information flowing from $X^n$ to $Y^n$ causally conditioned on $Z^n$ is

$$I(X^n \to Y^n \| Z^n) = H(Y^n \| Z^n) - H(Y^n \| X^n, Z^n).$$
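On the toy pmf above, the unconditional identity can be checked numerically by feeding the strings to `causal_entropy` in swapped order, so that it computes $H(Y^n \| X^n)$ (a sketch under the earlier assumptions):

```python
# H(Y^2) is 2 bits for the noiseless toy pmf, and H(Y^2 || X^2) is 0.
swapped = {(ys, xs): p for (xs, ys), p in pmf.items()}
p_y = defaultdict(float)
for (xs, ys), p in pmf.items():
    p_y[ys] += p
h_y = -sum(p * log2(p) for p in p_y.values() if p > 0)
assert abs(directed_information(pmf, 2)
           - (h_y - causal_entropy(swapped, 2))) < 1e-9
```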
Conservation law of information
This law, established by James Massey and his son Peter Massey,[13] gives intuition by relating directed information and mutual information. The law states that for any $n \geq 1$, the following equality holds:

$$I(X^n; Y^n) = I(X^n \to Y^n) + I((0, Y^{n-1}) \to X^n),$$

where $(0, Y^{n-1}) = (0, Y_1, \dots, Y_{n-1})$ is the delayed string, so that $I((0, Y^{n-1}) \to X^n) = \sum_{i=1}^{n} I(Y^{i-1}; X_i \mid X^{i-1})$.
Estimating and optimizing the directed information is challenging because it is a sum of $n$ terms, where $n$ may be large. In many cases, one is interested in optimizing the limiting average $\lim_{n \to \infty} \frac{1}{n} I(X^n \to Y^n)$, termed a multi-letter expression.
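The law can be checked numerically with the plug-in functions above. In the sketch below (illustrative names; the constant symbol is taken to be 0), the delayed term is computed by prepending the constant, dropping the last $Y$-symbol, and reusing `directed_information` with the two roles swapped:

```python
def mutual_information(pmf):
    """I(X^n; Y^n) between the two whole strings."""
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (xs, ys), p in pmf.items():
        p_x[xs] += p
        p_y[ys] += p
    return sum(p * log2(p / (p_x[xs] * p_y[ys]))
               for (xs, ys), p in pmf.items() if p > 0)

def delayed_di(pmf, n):
    """I((0, Y^{n-1}) -> X^n), the reverse term in the conservation law."""
    shifted = defaultdict(float)
    for (xs, ys), p in pmf.items():
        shifted[((0,) + ys[:-1], xs)] += p
    return directed_information(dict(shifted), n)

# On the noiseless toy pmf: 2.0 = 2.0 + 0.0.
assert abs(mutual_information(pmf)
           - (directed_information(pmf, 2) + delayed_di(pmf, 2))) < 1e-9
```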
Estimation
Estimating directed information from samples is a hard problem since the directed information is a functional not of the samples themselves but of the joint distribution, which may be unknown. There are several algorithms, based on context tree weighting,[15] on empirical parametric distributions,[16] and on long short-term memory networks.[17]
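For contrast with the cited methods, the sketch below is a naive plug-in estimator, not one of the referenced algorithms: it fits an empirical memory-one model to a single pair of sample paths and evaluates the corresponding per-symbol term, assuming stationarity and that memory one suffices (assumptions the context-tree and neural estimators are designed to avoid):

```python
from collections import Counter
from math import log2

def naive_di_rate(x, y):
    """Plug-in estimate of I(X_i; Y_i | Y_{i-1}) from integer sequences."""
    rows = list(zip(x[1:], y[1:], y[:-1]))
    n = len(rows)
    full = Counter(rows)
    xz = Counter((a, c) for a, _, c in rows)   # (x_i, y_{i-1})
    yz = Counter((b, c) for _, b, c in rows)   # (y_i, y_{i-1})
    z = Counter(c for *_, c in rows)           # y_{i-1}
    return sum(k / n * log2((k / n) * (z[c] / n)
                            / ((xz[(a, c)] / n) * (yz[(b, c)] / n)))
               for (a, b, c), k in full.items())
```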
Optimization
Maximizing directed information is a fundamental problem in information theory. For example, given the channel distributions $\{P(y_i \mid x^{i}, y^{i-1})\}_{i=1}^{n}$, the objective might be to maximize $I(X^n \to Y^n)$ over the channel input distributions $\{P(x_i \mid x^{i-1}, y^{i-1})\}_{i=1}^{n}$.
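For intuition only, the $n = 1$ case reduces to maximizing $I(X; Y)$ over $P(x)$ for a fixed channel, which a coarse grid search already illustrates; this toy is not the extended Blahut–Arimoto algorithm from the literature, and all names here are assumptions:

```python
import numpy as np

def mi(px, channel):
    """I(X; Y) for an input pmf px and a channel matrix P(y|x)."""
    pxy = px[:, None] * channel
    py = pxy.sum(axis=0)
    mask = pxy > 0
    return float((pxy[mask]
                  * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask])).sum())

bsc = np.array([[0.9, 0.1], [0.1, 0.9]])  # binary symmetric channel, p = 0.1
best = max((mi(np.array([a, 1 - a]), bsc), a)
           for a in np.linspace(0.01, 0.99, 99))
print(best)  # ~ (0.531, 0.5): capacity 1 - H_b(0.1), at the uniform input
```

For $n > 1$ with feedback, the optimization is over causally conditioned input distributions and no longer separates across time steps, which is what makes the general problem hard.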
Marko's theory of bidirectional communication
Massey's directed information was motivated by Marko's early work (1966) on developing a theory of bidirectional communication.[27][28] Marko's definition of directed transinformation differs slightly from Massey's in that, at time $n$, one conditions on past symbols $x^{n-1}, y^{n-1}$ only and one takes limits:

$$T_{12} \triangleq \lim_{n \to \infty} \mathbf{E}\left[\log \frac{P(y_n \mid y^{n-1}, x^{n-1})}{P(y_n \mid y^{n-1})}\right] \quad \text{and} \quad T_{21} \triangleq \lim_{n \to \infty} \mathbf{E}\left[\log \frac{P(x_n \mid x^{n-1}, y^{n-1})}{P(x_n \mid x^{n-1})}\right].$$
Marko defined several other quantities, including:

Total information: $H_1 \triangleq \lim_{n \to \infty} \mathbf{E}\left[-\log P(x_n \mid x^{n-1})\right]$ and $H_2 \triangleq \lim_{n \to \infty} \mathbf{E}\left[-\log P(y_n \mid y^{n-1})\right]$

Free information: $F_1 \triangleq \lim_{n \to \infty} \mathbf{E}\left[-\log P(x_n \mid x^{n-1}, y^{n-1})\right]$ and $F_2 \triangleq \lim_{n \to \infty} \mathbf{E}\left[-\log P(y_n \mid y^{n-1}, x^{n-1})\right]$

Coincidence: $K \triangleq \lim_{n \to \infty} \mathbf{E}\left[\log \frac{P(x_n, y_n \mid x^{n-1}, y^{n-1})}{P(x_n \mid x^{n-1})\, P(y_n \mid y^{n-1})}\right]$
The total information is usually called an entropy rate. Marko showed the following relations for the problems he was interested in:

$$K = T_{12} + T_{21}$$

and

$$H_1 = F_1 + T_{21}, \qquad H_2 = F_2 + T_{12}.$$
He also defined quantities he called residual entropies:

$$R_1 = H_1 - K = F_1 - T_{12}$$

$$R_2 = H_2 - K = F_2 - T_{21}$$

and developed the conservation law $F_1 + F_2 + K = R_1 + R_2 + 2K = H_1 + H_2$ and several bounds.
Relation to transfer entropy
Directed information is related to transfer entropy, which is a truncated version of Marko's directed transinformation $T_{12}$.
The transfer entropy at time $i$ and with memory $d$ is

$$T = I(X_{i-d}^{i-1}; Y_i \mid Y_{i-d}^{i-1}),$$

where one does not include the present symbol $X_i$ or the past symbols $X^{i-d-1}, Y^{i-d-1}$ before time $i-d$.
Transfer entropy usually assumes stationarity, i.e., $T$ does not depend on the time $i$.
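A plug-in sketch of this quantity under the stationarity assumption, in the same style as the naive estimator above (names and the discrete-alphabet assumption are illustrative):

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y, d):
    """Plug-in estimate of T = I(X_{i-d}^{i-1}; Y_i | Y_{i-d}^{i-1})."""
    rows = [(tuple(x[i - d:i]), y[i], tuple(y[i - d:i]))
            for i in range(d, len(y))]
    n = len(rows)
    full = Counter(rows)
    xz = Counter((a, c) for a, _, c in rows)  # (x-block, y-block)
    yz = Counter((b, c) for _, b, c in rows)  # (y_i, y-block)
    z = Counter(c for *_, c in rows)          # y-block
    return sum(k / n * log2((k / n) * (z[c] / n)
                            / ((xz[(a, c)] / n) * (yz[(b, c)] / n)))
               for (a, b, c), k in full.items())
```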
References
Massey, James (1990). "Causality, Feedback And Directed Information".
Permuter, Haim Henry; Weissman, Tsachy; Goldsmith, Andrea J. (February 2009). "Finite State Channels With Time-Invariant Deterministic Feedback". IEEE Transactions on Information Theory 55 (2): 644–662. doi:10.1109/TIT.2008.2009849.
Permuter, Haim H.; Kim, Young-Han; Weissman, Tsachy (June 2011). "Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing". IEEE Transactions on Information Theory 57 (6): 3248–3259. doi:10.1109/TIT.2011.2136270.
Simeone, Osvaldo; Permuter, Haim Henri (June 2013). "Source Coding When the Side Information May Be Delayed". IEEE Transactions on Information Theory 59 (6): 3607–3618. doi:10.1109/TIT.2013.2248192.
Charalambous, Charalambos D.; Stavrou, Photios A. (August 2016). "Directed Information on Abstract Spaces: Properties and Variational Equalities". IEEE Transactions on Information Theory 62 (11): 6019–6052. doi:10.1109/TIT.2016.2604846.
Vinkler, Dror A.; Permuter, Haim H.; Merhav, Neri (20 April 2016). "Analogy between gambling and measurement-based work extraction". Journal of Statistical Mechanics: Theory and Experiment 2016 (4). doi:10.1088/1742-5468/2016/04/043403.
Massey, J. L.; Massey, P. C. (September 2005). "Conservation of mutual and directed information". Proceedings. International Symposium on Information Theory, 2005 (ISIT 2005). pp. 157–158. doi:10.1109/ISIT.2005.1523313. ISBN 0-7803-9151-9.
Amblard, Pierre-Olivier; Michel, Olivier (28 December 2012). "The Relation between Granger Causality and Directed Information Theory: A Review". Entropy 15 (1): 113–143. doi:10.3390/e15010113.
Jiao, Jiantao; Permuter, Haim H.; Zhao, Lei; Kim, Young-Han; Weissman, Tsachy (October 2013). "Universal Estimation of Directed Information". IEEE Transactions on Information Theory 59 (10): 6220–6242. doi:10.1109/TIT.2013.2267934.
Quinn, Christopher J.; Kiyavash, Negar; Coleman, Todd P. (December 2015). "Directed Information Graphs". IEEE Transactions on Information Theory 61 (12): 6887–6909. doi:10.1109/TIT.2015.2478440.
Aharoni, Ziv; Tsur, Dor; Goldfeld, Ziv; Permuter, Haim Henry (June 2020). "Capacity of Continuous Channels with Memory via Directed Information Neural Estimator". 2020 IEEE International Symposium on Information Theory (ISIT). pp. 2014–2019. doi:10.1109/ISIT44484.2020.9174109. ISBN 978-1-7281-6432-8.
Naiss, Iddo; Permuter, Haim H. (January 2013). "Extension of the Blahut–Arimoto Algorithm for Maximizing Directed Information". IEEE Transactions on Information Theory 59 (1): 204–222. doi:10.1109/TIT.2012.2214202.
Permuter, Haim; Cuff, Paul; Van Roy, Benjamin; Weissman, Tsachy (July 2008). "Capacity of the Trapdoor Channel With Feedback". IEEE Transactions on Information Theory 54 (7): 3150–3165. doi:10.1109/TIT.2008.924681.
Sabag, Oron; Permuter, Haim H.; Kashyap, Navin (January 2016). "The Feedback Capacity of the Binary Erasure Channel With a No-Consecutive-Ones Input Constraint". IEEE Transactions on Information Theory 62 (1): 8–22. doi:10.1109/TIT.2015.2495239.
Peled, Ori; Sabag, Oron; Permuter, Haim H. (July 2019). "Feedback Capacity and Coding for the (0,k)-RLL Input-Constrained BEC". IEEE Transactions on Information Theory 65 (7): 4097–4114. doi:10.1109/TIT.2019.2903252.
Shemuel, Eli; Sabag, Oron; Permuter, Haim H. (March 2024). "Finite-State Channels With Feedback and State Known at the Encoder". IEEE Transactions on Information Theory 70 (3): 1610–1628. doi:10.1109/TIT.2023.3336939.
Aharoni, Ziv; Sabag, Oron; Permuter, Haim Henri (18 August 2020). "Reinforcement Learning Evaluation and Solution for the Feedback Capacity of the Ising Channel with Large Alphabet". arXiv:2008.07983 [cs.IT].
Sabag, Oron; Permuter, Haim Henry; Pfister, Henry (March 2017). "A Single-Letter Upper Bound on the Feedback Capacity of Unifilar Finite-State Channels". IEEE Transactions on Information Theory 63 (3): 1392–1409. doi:10.1109/TIT.2016.2636851.
Sabag, Oron; Huleihel, Bashar; Permuter, Haim Henry (2020). "Graph-Based Encoders and their Performance for Finite-State Channels with Feedback". IEEE Transactions on Communications 68 (4): 2106–2117. doi:10.1109/TCOMM.2020.2965454.
Marko, H. (December 1973). "The Bidirectional Communication Theory – A Generalization of Information Theory". IEEE Transactions on Communications 21 (12): 1345–1351. doi:10.1109/TCOM.1973.1091610.
Original source: https://en.wikipedia.org/wiki/Directed_information