Information processing occurs when a system preserves those aspects of its input that relate to what the input represents while discarding other aspects. To describe a system's information-processing capability, input and output must be compared in a way that is invariant to how the signals represent information. The Kullback-Leibler distance, an information-theoretic measure consistent with the data processing theorem, is computed on the input and on the output separately; comparing the two yields the information transfer ratio. We consider the special case in which the input drives several parallel systems and show that this configuration can represent the input information without loss. We also derive bounds on the asymptotic rate at which the loss decreases as more parallel systems are added and show that this rate depends on the input distribution.
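The quantities in the abstract can be sketched numerically. The snippet below is an illustrative assumption, not the paper's model: it uses a binary symmetric channel with a made-up crossover probability `eps` and two arbitrary input distributions, computes the Kullback-Leibler distance at the input and at the joint output of `n` parallel copies of the channel, and takes their ratio (the information transfer ratio, which the data processing theorem bounds by 1).

```python
import itertools
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p || q) in nats."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two input distributions: the same signal under two conditions
# (both distributions are illustrative choices).
p_in = np.array([0.9, 0.1])
q_in = np.array([0.1, 0.9])

# Hypothetical noisy system: binary symmetric channel, crossover eps.
eps = 0.2
T = np.array([[1 - eps, eps],
              [eps, 1 - eps]])  # T[x, y] = P(output y | input x)

def joint_output(p, n):
    """Output distribution of n parallel copies of the channel,
    all driven by the same input symbol, with independent noise."""
    out = []
    for y in itertools.product([0, 1], repeat=n):
        out.append(sum(p[x] * np.prod([T[x, yi] for yi in y])
                       for x in (0, 1)))
    return np.array(out)

d_in = kl(p_in, q_in)
for n in (1, 2, 4, 8):
    # Information transfer ratio: output KL over input KL, <= 1
    # by the data processing theorem.
    gamma = kl(joint_output(p_in, n), joint_output(q_in, n)) / d_in
    print(n, round(gamma, 3))
```

Running the loop shows the ratio climbing toward 1 as parallel systems are added, which is the qualitative behavior the abstract describes; the rate of that climb is what the paper's bounds characterize.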
Title of host publication: Proceedings of the 2002 IEEE International Conference on Acoustics, Speech, and Signal Processing
Number of pages: 4
Publication status: Published - 13 May 2002
- Markov processes
- optimised production technology
- Information processing
- asymptotic rates