Many systems in nature can be described using discrete input-output maps. Without knowing details about a map, there may seem to be no a priori reason to expect that a randomly chosen input would be more likely to generate one output over another. Here, by extending fundamental results from algorithmic information theory, we show instead that for many real-world maps, the a priori probability P(x) that randomly sampled inputs generate a particular output x decays exponentially with the approximate Kolmogorov complexity K̃(x) of that output. These input-output maps are biased towards simplicity. We derive an upper bound P(x) ≤ 2^(−aK̃(x)−b), which is tight for most inputs. The constants a and b, as well as many properties of P(x), can be predicted with minimal knowledge of the map. We explore this strong bias towards simple outputs in systems ranging from the folding of RNA secondary structures to systems of coupled ordinary differential equations to a stochastic financial trading model.
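The bias described above can be illustrated numerically. The sketch below is a toy example, not one of the systems studied in the paper: it uses elementary cellular automata as a small input-output map (a randomly sampled 8-bit rule is the input; the flattened space-time pattern is the output x) and zlib-compressed length as a crude stand-in for the approximate complexity K̃(x). The function names and parameters are illustrative assumptions.

```python
import random
import statistics
import zlib
from collections import Counter

def eca_output(rule, width=32, steps=16):
    """Flattened space-time pattern of elementary CA `rule`,
    evolved from a single-1 initial row with periodic boundaries."""
    table = [(rule >> i) & 1 for i in range(8)]  # output bit per 3-cell neighbourhood
    row = [0] * width
    row[width // 2] = 1
    cells = []
    for _ in range(steps):
        cells.extend(row)
        row = [table[(row[i - 1] << 2) | (row[i] << 1) | row[(i + 1) % width]]
               for i in range(width)]
    return "".join(map(str, cells))

def approx_K(s):
    """Crude complexity proxy: zlib-compressed length in bits.
    (A stand-in for K̃(x); real Kolmogorov complexity is uncomputable.)"""
    return 8 * len(zlib.compress(s.encode(), 9))

random.seed(0)
n_samples = 5000
counts = Counter(eca_output(random.randrange(256)) for _ in range(n_samples))

# Simplicity bias: the highest-probability output should be much more
# compressible than a typical output of the map.
most_common_out, most_common_n = counts.most_common(1)[0]
median_K = statistics.median(approx_K(x) for x in counts)
print("P(most common output) =", most_common_n / n_samples)
print("approx_K(most common) =", approx_K(most_common_out),
      "| median approx_K over outputs =", median_K)
```

Running this, the most frequently produced pattern (many rules annihilate the single 1, yielding a near-all-zero grid) compresses far better than the median output, consistent with high-probability outputs being simple. Compression-based proxies systematically overestimate K̃ for short strings, so only the qualitative trend, not the constants a and b, should be read off such an experiment.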
Funding Information:
We thank B. Frot for providing data for the L-systems and S.E. Ahnert, B. Frot, P. Gács, I.G. Johnston and H. Zenil for helpful discussions. We thank the EPSRC for funding K.D. through EPSRC/EP/G03706X/1 and the Clarendon Fund for funding C.Q.C.
© 2018 The Author(s).