
Friday, June 4, 2010

Kolmogorov Information and Evolution

The reappearing criticism of Stephen Meyer's book, Signature in the Cell, has led me to delve into the world of information theory again. Evolutionists claim emphatically that information can, indeed, be created out of nothing whatsoever. This claim is used to bolster the idea that information can just appear and then increase itself in size and complexity. Beyond that, there is now a claim that evolution has been shown to be entropic, a natural outcome of entropy.

Now this in no way corresponds with the physics I know and love. So what is it that the evolutionists are talking about? First, let's examine information theory as they are using the term. There is a concept from computing theory that some evolutionists use, called Kolmogorov algorithmic information theory. It is in no way related to Shannon's communication information equations, and in fact could be considered orthogonal to them. Shannon's information concept started from the premise that information contained meaning, and that due to entropic causes - noise, loss of signal strength, etc. - other means must be taken to ensure the delivery of the meaning content of the information.

None of this is true in Kolmogorov information theory. For Kolmogorov complexity, there is no concept of meaning in the character string being analyzed. The purpose of the analysis is to define a minimum description length:
"More formally, the Algorithmic "Kolmogorov" Complexity (AC) of a string x is defined as the length of the shortest program that computes or outputs x, where the program is run on some fixed reference universal computer."
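To make this definition concrete: K(x) itself is uncomputable, but any off-the-shelf compressor yields an upper bound on it, since "decompress this archive" is itself a program that outputs x. The sketch below is my own illustration (the choice of strings and of Python's zlib are assumptions for demonstration, not part of the definition); it shows that a regular string admits a far shorter description than a random one:

```python
# Sketch: a compressor gives an UPPER BOUND on Kolmogorov complexity,
# because "decompress this archive" is a program that outputs the string.
# The example strings and the use of zlib are illustrative assumptions.
import random
import zlib

structured = b"AB" * 500  # highly regular: a short description exists

random.seed(0)
random_bytes = bytes(random.getrandbits(8) for _ in range(1000))  # no obvious pattern

for label, s in [("structured", structured), ("random", random_bytes)]:
    bound = len(zlib.compress(s, 9))
    print(f"{label}: raw {len(s)} bytes, compressed upper bound {bound} bytes")
```

The regular string compresses to a few dozen bytes, while the pseudo-random bytes barely compress at all; that gap is the intuition behind "shortest program that outputs x".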
A related concept is the Solomonoff probability, which computes the likelihood of a given string being output by a randomly chosen program:
"A closely related notion is the probability that a universal computer outputs some string when fed with a program chosen at random."
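A toy sketch may clarify the weighting behind this probability: each program of length n drawn at random has probability 2^(-n), and a string's algorithmic probability is the sum over all programs that output it. The two-instruction "machine" below is a stand-in of my own devising, not a real universal computer, and it is not prefix-free, so the numbers are purely illustrative:

```python
# Toy sketch of algorithmic probability: P(x) = sum of 2^(-len(p))
# over all programs p whose output is x. The "machine" here is an
# invented two-instruction toy, NOT a universal computer, and it is
# not prefix-free, so the probabilities do not sum to 1.
from collections import defaultdict

def run(program):
    """Toy machine: '0'+x outputs x literally; '1'+x outputs x twice."""
    op, data = program[0], program[1:]
    if op == "0":
        return data
    return data + data

def algorithmic_probability(max_len):
    """Enumerate every binary program up to max_len bits and
    accumulate 2^(-length) onto whatever string it outputs."""
    prob = defaultdict(float)
    for n in range(1, max_len + 1):
        for i in range(2 ** n):
            p = format(i, "0{}b".format(n))
            prob[run(p)] += 2.0 ** (-n)
    return prob

probs = algorithmic_probability(8)
# The regular string "1111" is reachable both literally ("01111") and by
# doubling ("111"), so it collects more mass than the irregular "0111".
print(probs["1111"], probs["0111"])
```

Even in this toy, strings with structure pick up probability from several short programs, which is the core of the Solomonoff idea.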
Note that both of these concepts carry preconceptions: first, that a universal computer exists; second, that it is compatible with some code; third, that such compatible code exists; fourth, that a mechanism exists for properly feeding the code into the machine; fifth, that a mechanism exists for outputting the product of the computation; and finally, that no "meaning" in the classical sense of information is involved.

The most important preconception to note is that information in this mathematical sense bears no relation to information in the classical sense: mathematical information has no meaning attached to it, while classical information is entirely about meaning content, independent of the carrier or the modulation type. So these two concepts are totally divorced from each other.
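A small illustration of the first half of that claim: the mathematical measures of information quantify only symbol statistics, never meaning. The example strings below are my own; the point is simply that such a measure cannot distinguish a sentence from its scrambled anagram:

```python
# Sketch: an empirical (frequency-based) entropy measure depends only on
# symbol statistics, not meaning. The strings are an invented example;
# one is an English sentence, the other its sorted anagram.
from collections import Counter
from math import log2

def empirical_entropy(s):
    """Bits per symbol from the string's own letter frequencies.
    Counts are summed in sorted order so equal multisets of counts
    give bit-for-bit identical results."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in sorted(counts.values()))

meaningful = "the cat sat on the mat"
scrambled = "".join(sorted(meaningful))  # same letters, meaning destroyed

print(empirical_entropy(meaningful) == empirical_entropy(scrambled))  # True
```

The measure is identical for both strings; whatever "meaning" the first one carries is invisible to it.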

Yet their conflation is used by evolutionists.

In addition, evolutionist preconceptions include: first, that the computer pre-exists and is functionally ready to accept and run code; second, that this pre-existing computer is biological; third, that the code is biological; fourth, that both the input and output delivery mechanisms for the code are biological; and finally, that the output contains some sort of instruction (meaning) for the host organism, in the form of a physical change that is required for access by Natural Selection.

The evolutionists deny any meaning at this point; for them, the information has no meaning, and the change has no meaning; it just is. This interpretation is a bastardization of the meaning of the word "meaning". If the output of the universal computing machine produces a useful product that, in turn, produces a change in the organism, then the output has meaning. What the output means to the organism is: change yourself according to this data. It is an instruction, and instructions are sentences with meaning.

What we see here is the mechanism by which evolutionists claim that information in the classical sense is created accidentally by information in the strictly mathematical sense. So meaning is created out of meaninglessness, or randomness. The trick is one that they play on themselves first, and then on the casual observer: information is created from non-information. Of course it is false, a Black and White Fallacy, a conflation of terms with mutually exclusive definitions.

The claim that information is created from non-information or randomness is in no way supported or substantiated by the Kolmogorov / Solomonoff theories; it is a fallacy and an extrapolation supported neither by the mathematics nor by any empirical study.

Finally, Meyer's book apparently uses "information" as having meaning content, an idea met with sneers and derision from the Atheist camp, specifically Shallit and PZ. However, they provide no more evidence than the reference to Kolmogorov, which they either have not read and understood, or think that no one else has.