Abstract
Information processing theory seeks to quantify how well signals encode information and how well systems, by acting on signals, process information. We use information-theoretic distance measures, the Kullback–Leibler distance in particular, to quantify how well signals represent information. The ratio of distances calculated between two informationally different signals at a system's output and input quantifies the system's information processing properties. Using this approach, we derive the fundamental processing capabilities of simple system architectures; these results hold universally, regardless of the systems involved or the kinds of signals they process and produce. Examples from array signal processing and neural signal analysis illustrate how to apply the theory.
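The central quantity in the abstract, the ratio of output-side to input-side Kullback–Leibler distances, lends itself to a short numerical illustration. The sketch below is not from the paper: the discrete distributions, the uniform-mixing "system", and the name `gamma` are illustrative assumptions. It computes the KL distance between two input distributions, passes both through a simple information-discarding map, and reports the output-to-input distance ratio.

```python
import numpy as np

def kl_distance(p, q, eps=1e-12):
    """Discrete Kullback-Leibler distance D(p || q), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Two "informationally different" input distributions (hypothetical values).
p_in = np.array([0.70, 0.20, 0.10])
q_in = np.array([0.40, 0.35, 0.25])

def system(dist, noise=0.3):
    """A toy system: mix the input with a uniform distribution,
    a stand-in for processing that discards some information."""
    uniform = np.full_like(dist, 1.0 / len(dist))
    return (1.0 - noise) * dist + noise * uniform

p_out = system(p_in)
q_out = system(q_in)

d_in = kl_distance(p_in, q_in)     # distance between the two inputs
d_out = kl_distance(p_out, q_out)  # distance between the two outputs

# Output-to-input distance ratio: values below 1 indicate the system
# has reduced the distinguishability of the two signals.
gamma = d_out / d_in
print(f"d_in = {d_in:.4f}, d_out = {d_out:.4f}, ratio = {gamma:.4f}")
```

Because the toy system is a stochastic map, the data processing inequality guarantees the ratio cannot exceed 1 here; a ratio near 1 would indicate nearly lossless processing.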
| Original language | English |
| --- | --- |
| Pages (from-to) | 1326-1344 |
| Number of pages | 19 |
| Journal | Signal Processing |
| Volume | 87 |
| Issue number | 6 |
| Early online date | 11 Dec 2006 |
| DOIs | |
| Publication status | Published - Jun 2007 |
Keywords
- information theory
- Kullback–Leibler distance
- information processing