There are a number of hard-to-classify subunits within evolutionary theory (and outside, just waiting to get in). This article will address some of those subjects which are only loosely related, but which cannot be ignored. Included will be Information in Life; Conditions for Life; RNA World; Metabolism First; Emergence and Self-Assembly; Complexity; Inference to Best Explanation; and some conclusions. And of course there are more subunits than those, and they will be addressed in future articles.
Information In Life
Modern biology has shown that all life, from humans to the simplest prokaryote, are all based on information systems which are managed by animated communication groupings of molecular systems involved in coded signaling. These coded signaling systems are active and necessary to the functioning and regulation of life’s internal systems. Further many of these systems are common to all life.  Types of signaling include endocrine signaling, paracrine signaling, and autocrine signaling, each of which are performed by molecules. Some molecules perform dual functions, one of which is signaling.  Cells also use molecules for cell to cell communication.  Some communication paths require that ports be opened through the membrane, and signals are produced for performing that function. Other communication paths require a signal modem lodged in the membrane which transmits the external molecular signal, recoded, into the cell where it is received and then transmitted by another molecule.  Animations of molecular signaling are available. 
Information in the form of coded, meaningful instructions is found everywhere in living systems.
”The existence of a genome and the genetic code divides living organisms from nonliving matter. There is nothing in the physico-chemical world that remotely resembles reactions being determined by a sequence and codes between sequences”
”The fact that all phenomena of life are based upon information and communication is indeed the principle characteristic of living matter. Without the perpetual exchange of information at all levels of organization, no functional order in the living organism could be sustained. The processes of life would implode into a jumble of chaos if they were not perpetually stabilized by information and communication.”
To that I would add, self-animation, which will be discussed later, in a future installment.
What exactly is meant by “information”? There are several levels of information:
1. Metadata (Statistics); concepts about the information. (Number of bits; bandwidth required; etc.).
2. Grammar; symbols and syntax rules 
3. Bit data, as analyzed by Shannon’s theory of entropy and bandwidth. Shannon’s theory does not include any meaning involved with the transfer of bits through communication channels.
”One must remember that the word entropy is the name of a mathematical function, nothing more; One must not ascribe meaning to the function that is not in mathematics” Yockey 
4. Semantic data, which contains specific, meaningful concepts available for transfer on any suitable medium. Semantic data cannot be compressed (non-reductive to algorithms) 
5. Pragmatics: Useful data, which contains concepts for producing something. 
DNA satisfies all of the top requirements for information content (only metadata is not retrieved or stored, but it is available to outside observers under laboratory conditions). It is specifically semantic and its information content is useful for the maintenance of the functions of life. RNA, RNA polymerase and the enzymes and proteins also satisfy the requirements for fully formed information content, bearing semantic information, although limited in proteins, and utility in living organisms.
In addition there is the issue of “complexity”, a term which is misused constantly in some quarters. Complexity is defined by irreducibility in the following sense: a perfectly repetitive sequence, however long, is simple because it is reducible to a simple algorithm which contains its essence and the ability to regenerate the sequence. A purely random sequence cannot be reduced to any algorithm, and thus it is complex. So semantic information bearing sequences are also complex because they are not reducible to simple algorithms. DNA is not reducible; the ENCODE project has shown that well over 90% of the genome is used, and the rest has not been adequately tested in order to know if it is also used. Also, the embedded and superimposed coding removes any hope of reducibility for DNA to algorithmic condensation, since these features add to put the information content well over 100%..
The information contained in DNA is specific, useful and efficient; Yockey gives some other characteristics:
”Characteristics of the Genetic Code:
The genetic code is instantaneous: (“instantaneously decodable at each stage”).
The genetic code is optimal: the genetic code has the property of being optimal which means that the genetic code employs the most economical use of its nucleotides.”
Further the genetic code is overwritten with embedded or superimposed codes. The second DNA strand is read backwards to obtain a separate reading (a feature found during the ENCODE program).  And there are redundancy features as well as error-checking and repair.
However, the actual functionality contained in (i.e., expression of) the DNA is partially controlled by the “epigenome” which is external to the DNA/RNA/RNA polymerase complex. The epigenome is comprised of a large variety of proteins which are used to modify the function of DNA by attaching themselves as markers to mediate ON/OFF switching for certain portions of the DNA molecule.  So the actual genomic information content is greater than 100%, and the actual extent is currently unknown.
The genome and biological systems are somewhat analogic to modern digital computers. The following elements are parallels, according to J. Seaman :
1. DNA <-> Hard Drive
DNA is analogous to a hard drive because it serves as the canonical, non-volatile copy that is copied but not frequently edited.
2. RNA <-> RAM
RNA is analogous to the RAM in a computer because it acts as the active, working copy of the information that is edited, used, and discarded.
3. Tandem Repeats <-> Data Blocks
This explains the anomalously high mutation rates. Cells are purposefully storing inherited information in the DNA Strand.
4. Polymerase + Ribosomes <-> Processors
The cell is a multi-processor system, with multiple parallel events occurring and being communicated through epigenetic modification and RNA. Variety in protein/RNA complexes are processors specialized for different tasks.
5. Cytoplasm Phenotype <-> Output
Most of the computation that goes into the decision process is never obvious to the user.
6. Nucleus <-> Motherboard
Computational center for the cell with hard drives integrated as closely as possible.
7. Nucleolus <-> CPU
Central area where most of the processors and memory is congregated for speed reasons.
The cell is actually more complex than digital computers in the sense that it uses multiple (unrelated) codings, each of which is used specifically for a dedicated system control purpose, using dedicated communication processing systems. This is especially true in the macro-organism, where functions of necessary organs are controlled by other messaging organs. These control organs are specifically dedicated to the information feedback loops formed with a target organ. This controller-dominated system is necessary for stability in the operation of the target organ, or adjustment for necessary increased/decreased target organ activity or even special situations requiring target organ adjustment for changes in environment. Communication information feedback systems also exist in intra-cellular process regulation, and inter-cellular process regulation systems, where they exist in large numbers.
There are now entire professional disciplines and even professional journals dedicated specifically to signaling and systems control in biological systems. 
It’s appropriate at this point to note that no such signaling function is found in minerals, interacting only with other minerals. Such signaling and processing is not seen to arise without intelligent cause. And as will be discussed, these functions cannot be expected to be created entropically, anentropically or negentropically.
It has been demonstrated that information, even in entropically neutral open systems such as living systems, always degrades, and that mutation – even “beneficial” mutations – are not selectable to the extent that information entropy is overcome, much less reversed. 
“In terms of numerical scores within a simulation experiment, just a few extremely beneficial mutations can more than compensate for large numbers of low-impact deleterious mutations. But this leads to increasing “fitness” only in a narrow and artificial sense. In the broader sense, the whole genome is still degenerating, because, while a few nucleotides are being improved, large numbers are being degraded. This type of trade-off is not sustainable, as it results in a shrinking functional genome size. More and more nucleotide sites are losing their specificity, and hence their functionality. Taken to the extreme, this would eventually yield a biological absurdity – a functional genome consisting of a handful of high-impact nucleotide sites that somehow code for all of the organism’s functionality.
Sanford, et. al. 
There have been many calculations done attempting to assess the probability of assembling a protein from a primordial soup (despite the unlikelihood of the existence of such soup, as will be discussed below). The calculations produced by Douglas Axe and related by Stephen Meyer have aroused the ire of the evolutionary community – but without substantive rebuttal it appears. 
Axe has produced two notable assertions. First he calculated the probability of assembling a 150 amino acid protein as 1 chance in 10^164. Meyer took this rationally impossibility a step further, adding in the necessary minimum number of proteins (250), and arrived at nearly the same calculation as Fred Hoyle’s 1 in 10^40,000: Meyer gets 1 in 10^41,000. In terms of probability this is P = 0.0 followed by 39,999 zeros before getting to a 1.
Second, Axe has claimed that converting two almost identical proteins, which have the same amino acids and nearly – but not quite exactly – the same folds, cannot be morphed one into the other in order to use one to do the job of the other. The issue here is the evolutionist claim that there is no need to posit new proteins and new folds at the time of the Cambrian explosion – the old proteins are sufficient even for new organisms. This evolutionist claim is made in order to counter the claim by Axe and Meyer that new organisms that differ greatly would need new proteins, so the source of all these new proteins appearing suddenly and simultaneously in the Cambrian explosion must be accounted for. This, of course, if true, presents a serious threat to evolution, and has rankled evolutionists. Axe, however, has experimentally shown that it (appears) impossible to convert the very similar protein molecules one to the other. Proving a negative is difficult, especially when no one is interested in trying to verify it. On the other hand, apparently no one has produced data which refutes Axe’s claim, either.
For now, there is no reason to claim that random assembly of proteins or higher complexity molecules happened or is the certain precursor to a replicable biont, since the improbability seems to negate any possibility.
The philosophical dichotomy of emergence is demonstrated in the following two statements, taken from Wikipedia:
“’The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. At each level of complexity entirely new properties appear. Psychology is not applied biology, nor is biology applied chemistry. We can now see that the whole becomes not merely more, but very different from the sum of its parts.’ (Anderson 1972)
The plausibility of strong emergence is questioned by some as contravening our usual understanding of physics. Mark A. Bedau observes:
‘Although strong emergence is logically possible, it is uncomfortably like magic. How does an irreducible but supervenient downward causal power arise, since by definition it cannot be due to the aggregation of the micro-level potentialities? Such causal powers would be quite unlike anything within our scientific ken. This not only indicates how they will discomfort reasonable forms of materialism. Their mysteriousness will only heighten the traditional worry that emergence entails illegitimately getting something from nothing.’ "(Bedau 1997)
Indeed: something from nothing. That requirement appears absolutely necessary in the theories of evolutionary development of life through the ages. Theories of physical emergence must apply to cause and effect, if taken in a deterministic, naturalistic universe. They rely on the presumed necessity of the effect of a primary cause becoming a cause itself, for a specific predicted outcome, but also secondarily generating either a deterministic outcome which is not predicted, or producing an epiphenomenal outcome which is always present for certain sequences but unexplained. These secondary outcomes are not in the normal chain of cause and effect, and are said to “emerge” from the causal chain to produce something which is unexpected (new and different).
Assume that cause, Fn, produces effect, Fn+1, and Fn+1 in turn produces effect, F(n+2), but also produces another, emergent, effect:
Type I Emergence:
Fn > Fn+1 > Fn+2
E1 (emergent secondary, deterministic consequence of cause Fn+1).
Type II Emergence:
Fn > Fn+1 > Fn+2
E1 (emergent epiphenomenon of cause Fn+1)
Type III Emergence:
Fn > Fn+1 > Fn+2
E1 (emergent epiphenomenon of Fn+1 combined with Gn+1)
But in the first two of these, if they are always produced by Fn+1, are deterministic and therefore have deterministic causal properties which produced their existence, even if those properties are unknown. But can determinism cause the “emergence” of information?
In the third emergence model, there is a symbiotic emergence between two coincident and new effects, Fn+1, and Gn+1. There could be even more inputs to this model, all at the same point. But is there any reason to believe that this effect of multiple causes exists, based on the necessary and sufficient principle of cause/effect which underlies science and logic? There is one proposed example of Type III emergence, the extra-genomic evolution of obesity in human children, discussed below.
Underlying all emergence theory is the proposition that the emergent property, E, comes from the exact same initial conditions and forcing factors which would normally be expected to produce only effects which are predictable under the known rules of physics. Emergence entails the creation of something which is not reducible to its components, i.e., something more than the sum of its parts. So to remain within deterministic Materialist concepts of reality, emergence requires the existence of extra unknown initial conditions in the environment of the cause, and/or unknown forcing in addition to the known cause.
The emergent property of heritable obesity in children lies in the proposition that extra energy transmission and storage in the inactive and obese mother is transmitted to the fetus as extra energy (glucose in this case) is transmitted to the pancreatic beta-function. This is realized as a tipping point for inherited traits (“anatomic, physiologic, metabolic, and behavioral tipping points”) if the child is inactive, and the tipping point modifications are “actualized”.  This is not a positive development, however, and should not be considered a beneficial mutation away from optimum or norm, but rather than an environmentally selectable trait, it would be deselected in natural environments. It is not demonstrated that strength, speed, intellect, cunning or proliferation of offspring which are stronger contenders in life actually result from such emergent non-genomic heritable modifications. Nor is this a modification which produces new organs or systems beneficial to the offspring. It is a non-beneficial modification which would not benefit being born into an environment of starvation, because it requires prior obesity and chronic inactivity of the mother, which would not occur in a starvation environment. This is de-evolution, not positive evolution.
Other popular examples of emergence include ‘chaos theory’ and the fractal images derived from repetition of simple algorithmic calculations which substitute the outcome of a calculation for a variable in a following calculation, producing an expansion of serial data which leads to patterns.
”Again, the fact that strings are algorithmically incompressible means (by definition, in fact) that they cannot arise as an ‘emergent’ property of some relatively simple algorithmic process, in the same way that beautiful fractal pictures can arise from quite simple equations.”
Stuart Kauffman has claimed that adding energy leads to organization:
”But if the heating is more vigorous so that the temperature difference through the fluid from top to bottom becomes great, then there a reformed rising columns of hot water surrounded by descending columns of cooler water. Together they form convective cells. … Incidentally, this is the first example we have mentioned in which energy flux through an initially homogenous system sets up a spatially ordered pattern. This idea has received enormous attention, with implications in pattern formation in embryos.”
He goes further, examining embryos and eggs for any mathematical content in their physical composition. He finds standing waves and eigen vectors and possible adaptation to “smooth fitness landscapes” as well as Turing class morphological models. But what all this resolves to is just this:
”Simultaneously, we suspect that the morphologies we see are expressions of a modest number of fundamental mechanisms, each yielding a well-defined family of forms. To suspect this is inevitably to confront two basic unanswered questions: What are those fundamental mechanisms? To what extent can and has selection modified the forms which we see from those naturally generated by the underlying mechanisms?”
Alternatively one might ask why Kauffman thinks that mathematical models will lead to mechanisms – at all. One begins to wonder if the mathematics is not the point of his investigations, rather than the mechanism. He wrote 642 pages before he asked the relevant questions on the very last page before the epilogue.
Emergence of Complexity
It’s entirely possible that the “core problem” is not even being addressed at all:
”For the core problem is NOT that of producing the kind of order that is to be seen in a crystal, honeycomb, or even a Belousov-Zhabotinski reaction. It is that of producing the qualitatively different language-type of structures formed by the complex ordering of the amino acids that form a protein. Paul Davies puts the difference very clearly: “life is actually <b>not</b> an example of <b>self-organization</b>. Life is in fact <b>specified, </b>” i.e. genetically directed, organization. Living things are instructed by the genetic software encoded in their DNA (or RNA). Convection cells form spontaneously by slef organization. There is no gene for a convection cell. The source of order is not encoded in the software, it can instead be traced to the boundary conditions of the fluid… In other words, a convection cell’s order is imposed externally, from the system’s environment. By contrast, the order of a living cell derives from <b>internal</b> control… the theory of self-organization as yet gives no clue how the transition is to be made between spontaneous or self-induced organization – which in even the most elaborate non-biological examples still involves relatively simple structures – and the highly complex, information based, genetic organization of living things.
[Emphasis in original)
The concept of negative entropy was introduced, not just to compensate for energy loss, but also to attempt to account for the emergence of an increase in information. If entropy = loss, then negative entropy would be a gain… right?
”[Erwin] Schrödinger (1987, 1992) [‘What Is Life?’ ] used negative entropy to explain the appearance of what he thought was order during evolution. Eigen (1992) thought that information received was negative entropy. It is most unfortunate that these distinguished scientists have misled their readers. Perhaps they believe that the minus sign in Equation 5 means that Shannon entropy is negative entropy. Because all probabilities must range from zero to one the logarithm is negative and that means that H is zero or positive. That is elementary mathematics. This is a serious mathematical objection to the <b>ad hoc</b> notion of negentropy in addition to the fact that a means of measurement has not been proposed”
[Emphasis in original]
There is no such thing as negative entropy; it is a mathematical impossibility due to the nature of the precisely defined and contained mathematical function of entropy. Therefore, it cannot account for the increase in genetic information from zero in minerals, to gigabits in living organisms.
Hox genes, a subset of homeobox genes, are said to prove evolution. So what about the magic and mystery of Hox genes? The magic is what they do; the mystery is what else can they do? Hox genes are purely trigger genes, with no other function. They can release a cascade of triggered responses which lead ultimately to the production of specific morphological features in an organism. It has been shown that by manipulating the proper Hox gene, the morphology of flies and other animals can be changed – not for the better, but creating abnormalities in their physical structures such as extra wings or legs for eyebrows. Is it possible that one mutation in a HOX gene or its cascading pathway could create something new and useful?
”Also Goldschmidt’s name would eventually appear, in late 20th century literature, when developmental genetics succeeded in the identification of major changes in a body structure, such as developing a leg where an antenna would normally be formed, as a consequence of mutational change affecting the spatial expression of a Hox gene. Was this discovery actually the vindication of Goldschmidt’s hopeful monsters? If so, evolutionary biology’s fondness for the results would equate to this discipline’s self-exclusion from the established tradition in evolutionary biology. No harm, however. Fruit-flies with antennae replaced by legs are desperately hopeless monsters, as are their well-known four winged companions, the phenotypic expression of a mutation affecting the expression of another Hox gene. These monsters can only survive in the protected environment of the lab. Their fitness, a population geneticist would say is actually zero. Too much was expected from a single mutation.”
Hox genes actually provide an interesting problem for the deterministic, Philosophical Materialistic nature of modern theories of evolution. Hox genes carry information, unlike other genes and proteins and enzymes. And the nature of their business, the reason for their existence, is to provide coded information which cascades through a chain of triggers for other molecules which ultimately results in the proper assembly of a body part in the correct location on the body. How did this arrangement come about? On one hand, how could it form after the body part fabrication was already present? But on the other hand, how could it form if there were no body parts yet? And how did it form as an information bearing class of molecules which exists in many organisms to perform similar but different functions, functions which are appropriate to each unique species? Further still, how could a number of them have formed simultaneously to form entire, integrated body morphologies from abstract information working on subordinate genetics? The probability of this happening is staggeringly infinitesimal. Nature.com ; NIH 
It is notable that Hox genes were not thought to exist in prokaryotes (simpler single cell life); they were thought to exist only in eukaryotic animals and are used in determining body orientation, front to back; body symmetry, left to right; body part placement; and enabling for development of proper body parts for the specific individual. They are a subset of the homeobox domain of developmental trigger genes.
Homeobox genes apparently have been found in prokaryotes, too. If this pushes even further back, then the question of <i>how they developed</i> also pushes back, possibly to first life.  This sort of complexity in first life is impossible to hypothesize from the physical nature of minerals and the laws of physics.
Interestingly, there is the Lego approach to emergence:
“As systems become more complex, protocols facilitate the layering of additional protocols, particularly involving feedback and signaling. Suppose we want to make a Lego structure incrementally more useful and versatile by “evolving” it to be (i) mobile, then(ii) motorized, then (iii) able to avoid collisions in a maze of obstacles. The first increment is easy to achieve, with Lego protocol–compatible axles and wheels. Motorizing toys involves a second increment in complexity, requiring protocols for motor and battery interconnection as well as a separate protocol for gears. All can be integrated into a motorizedprotocol suite to make modular subassemblies of batteries, motors, gears, axles, and wheels. These are available, inexpensive additions. The third increment increases cost and complexity by orders of magnitude, requiring layers of protocols and modules for sensing, actuation, and feedback controls plus subsidiary but essential ones for communications and computing. All are available, but it is here that we begin to see the true complexity of advanced technologies. Unfortunately, we also start to lose the easily described, intuitive story of the basic protocols. Minimal descriptions of advanced Lego features enabling sensing and feedback control literally fill books, but the protocols also
facilitate the building of elaborate, robust toys, precisely because this complexity is largely hidden from users. This is consistent with the claim that biological complexity too is dominated not by minimal function, but by the protocols and regulatory feedback loops that provide robustness and evolvability.
This added complexity also creates new and often extreme fragilities. Removing a toy’s control system might cause reversion to mere mobility, but a small change in an otherwise intact control system could cause wild, catastrophic behavior.”
The addition of complex systems is hardly predictable by hypothesizing from simpler forms. The above Lego narrative is not intended to dissuade any credibility in evolution; it is intended to show the vast increases in complexity as a comparison between biological development and technological development. What is omitted is the source of complexity in technology: intelligence which is intent on producing the designs necessary to produce working systems of vast complexity, and providing the means for increases in technical complexity. The vast complexity of comparable systems in biology is presupposed to have occurred by deterministic forces only. The comparison itself, by ignoring the differences in developmental mechanisms, is either a false comparison, or is unwittingly an argument for intelligent input to the development of life.
“The poem and its information content is independent of the type of substance used to write the poem. The poem could be written in chalk, ink, paint, ice or any of a wide variety of materials, but the message of the poem is not dependent upon the materials comprising the writing. Similarly, the information in the DNA molecule is independent of the bases of sugars and phosphates which comprise the molecule. If the information is independent from these chemicals, the information did not arise from the chemicals; just as a poem written on a blackboard did not arise from the chalk.”
Complexity For Autodidacts
It seems easy for the properties of living organisms to be reductively trivialized when discussing evolution and its necessary first living organism. In fact, there is only one reason for First Life investigators to try to create reductive types of replicating molecules in their labs: actual living things are far too complicated to even attempt to produce. There is no evidence that life ever existed without DNA, RNA, and all the necessary polymerases and enzymes that must be present in modern cells of all types.
In order to help visualize the actual complexity of living cells there are animations available which show how the complex molecules interact in various capacities to perform various functions. One textbook, The Cell, by Geoffrey M. Cooper and Robert E. Hausman, has an on-line site which contains animations of many different molecular functions. Just one animation can show the complexity involved in causing DNA replication, and the multifunctionality of one molecule called the “clamp loading protein” which performs multiple functions simultaneously.
For an exercise, one might try to calculate the probability of random emergence of such a molecule simultaneously with its necessary secondary molecules and the non-random, semantic information-bearing DNA molecule itself, along with the RNA and polymerases which are also required, and the metabolic functions necessary to sustain all this. There is more which is necessary for biological life, but that is enough to understand the complexity involved, and which must be attributed to the accidental emergence via random mutation and selection, or even epigenetic mutation. Here is the site with very good animations of many necessary biologic functions. I choose the narrated animations, myself:
The DNA animation is here, animation 6.1:
Scientific conceptual reduction works only for natural phenomena which have their basis in simple unifying equations. Electron flow is reducible to Maxwell’s equations, and is predictable when the initial conditions and the current environment are known. Evolution is not such a concept which is reducible to a single, simple, predictive status. Evolution cannot predict its own assertions of the emergence of complexity. Nor can evolution derive singular functions which describe historical events which are asserted to have been due to the amorphous term, “emergent”, such as the rapid “emergence” of all the phyla in the Cambrian Explosion. The term, “emergent”, has no physical properties which are empirically testable in any fashion.
Not being empirically testable, “emergence” is destined to remain an untested hypothesis which is useful only in the nature of an ideological belief, a belief held in the face of contrary evidence and common sense. (Common sense is discussed by Lewontin, below, where he asserts the need to reject it in order to protect both ideological Atheistic Materialism, and evolution, which suffer when subjected to common sense).
The idea that RNA might have emerged as the first replicator in the succession of life came thus:
”Rich, Woese, Orgel, and Crick suggested that: ‘Possibly the first enzyme was an RNA molecule with replicase properties.”
RNA is simpler than DNA, and it turned out that RNA had several properties which enticed the concept of RNA World:
”This suggestion seems to move the problem a step nearer to the probiont but still encounters the primary questions of the handedness of the amino acids [all left-handed] and ribose sugars, the generation of the genetic message and the origin of the genetic code.”
One of the main investigators of first life, Jack Szostak, has produced a series of videos condensing the history of the RNA World investigation into roughly 2 ½ hours. In the third video, he admits that RNA replication is not on the horizon, even after decades of laboratory pursuit. So the hunt is on for a simpler molecule that might work as a replicase.
The hard problem for RNA World is that in modern biological organisms replication takes many elements, and they are dependent in a circular fashion: DNA is transcribed (by RNA polymerase) into RNA; RNA is translated into proteins, including RNA polymerase itself and the many other proteins required; and copying DNA in turn requires those proteins, all of which derive from DNA via RNA and RNA polymerase. The dependency is circular, involves many complex molecules, and is completely necessary for replication.
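The circularity can be made concrete with a toy dependency graph. The component list below is a simplification of the chain just described, not a complete biochemical inventory; a standard depth-first search confirms that no build order exists:

```python
# Simplified dependency graph: each component maps to what is needed to make it.
deps = {
    "DNA": ["RNA polymerase", "proteins"],
    "RNA": ["DNA", "RNA polymerase"],
    "RNA polymerase": ["DNA", "RNA"],  # itself a protein encoded in DNA
    "proteins": ["DNA", "RNA"],
}

def has_cycle(graph: dict) -> bool:
    """Depth-first search; returns True if any dependency loop exists."""
    visiting, done = set(), set()

    def visit(node):
        if node in visiting:
            return True               # back-edge found: a cycle
        if node in done:
            return False
        visiting.add(node)
        if any(visit(d) for d in graph.get(node, [])):
            return True
        visiting.remove(node)
        done.add(node)
        return False

    return any(visit(n) for n in graph)

print(has_cycle(deps))  # True: every component presupposes the others
```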
Further, cell replication (binary fission in prokaryotes, mitosis in eukaryotes) involves a vast amount of molecular machinery to be built, timed, and animated without the involvement of the DNA/RNA/RNA polymerase complex, other than to provide building-block molecules. Triggering the cell reproductive cycle is a control cycle of several components: the protein kinase Cdk1, encoded in yeast by the cdc2 and cdc28 genes, which is teamed with the protein Cyclin B. When Cdk1 encounters Cyclin B, they attach; Cdk1 is then activated through changes in its phosphorylation state, and mitosis starts. After the initiation of mitosis, Cyclin B is degraded and detaches, Cdk1 is dephosphorylated, and Cdk1 is then available for a new Cyclin B; a new cycle starts.
There are a number of other cycles, including the growth cycle, which are regulated by protein molecules.
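The trigger just described is, structurally, a small repeating state machine. A toy sketch (the state names are a simplification of the Cdk1/Cyclin B cycle above, not a complete model):

```python
# Toy state machine for the Cdk1/Cyclin B division trigger described above.
CDK1_STATES = [
    "free Cdk1",
    "Cdk1 bound to Cyclin B",
    "activated Cdk1 (mitosis initiated)",
    "Cyclin B degraded; Cdk1 dephosphorylated",
]

def cdk1_state(step: int) -> str:
    """State of the trigger after `step` transitions; the cycle repeats."""
    return CDK1_STATES[step % len(CDK1_STATES)]

print(cdk1_state(4))  # "free Cdk1" again: the cycle restarts
```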
Mitosis events are complex:
”M phase [mitosis] is the most dramatic period of the cell cycle, involving a major reorganization of virtually all of the cell components. During mitosis (nuclear division), the chromosomes condense, the nuclear envelope of most cells breaks down, the cytoskeleton reorganizes to form the mitotic spindle, and the chromosomes move to the opposite poles. Chromosome segregation is then usually followed by cell division (cytokinesis)”
[an animation of mitosis is here: http://highered.mheducation.com/sites/0072495855/student_view0/chapter2/animation__mitosis_and_cytokinesis.html ]
It is easy to see that this is in no manner similar to the replication method suggested for RNA World, in which “the cell stretches out so that shaking it slightly breaks it apart”, with the suggestion that “gentle wave action on a beach” might produce cell reproduction in archaic cells.
Furthermore, Nelson, Levy, and Miller (2000) report that:
‘numerous problems exist with the current thinking of RNA as the first genetic material. No plausible prebiotic processes have yet been demonstrated to produce the nucleosides or nucleotides or for efficient two-way nonenzymatic replication.’
Shapiro (1999) has examined the question of prebiotic soup for cytosine. He found:
‘No reactions have been described so far that would produce cytosine, even in a specialized local setting, at a rate sufficient to compensate for its decomposition. On the basis of this evidence, it appears unlikely that cytosine played a role in the origin of life.’
Cytosine is the nucleic acid base (a pyrimidine) that is necessary for cytidylate, one of the necessary nucleotides in DNA and RNA. The half-life of cytosine at 100 degrees C is nineteen days. This makes it unlikely to have been available for use in prebiotic soups for the formation of DNA or RNA.
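The decay arithmetic behind that claim is simple first-order kinetics. A sketch (the 19-day half-life at 100 °C is from the text above; the time spans are illustrative):

```python
# Fraction of an initial cytosine pool surviving after t days, assuming
# first-order decay with the 19-day half-life quoted in the text.

def fraction_remaining(t_days: float, half_life_days: float = 19.0) -> float:
    return 0.5 ** (t_days / half_life_days)

print(fraction_remaining(19))    # 0.5 after one half-life
print(fraction_remaining(190))   # ~0.001 after ten half-lives (about 6 months)
```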
Support for RNA World is waning to the point that it is no longer really seen as a viable contender for the development of the first replicator. Another major reason, again, is that it does not even address the development of the first genetic code, nor of the interrelated communication systems.
Metabolism First
The development of life via the prior development of metabolism, assembled in suitable environments (warm ponds of carbon-chain nutrients?), has been mostly abandoned due to its inability to provide any reasoning for the development of replicating molecules contained in cell-like enclosures, and due to the realization that metabolic replication, by itself, cannot sustain Darwinian evolution. An interesting study refuting “Metabolism First” is reported here:
“In the first half of the 20th century, Alexander Oparin established the "Metabolism First" hypothesis to explain the origin of life, thus strengthening the primary role of cells as small drops of coacervates (evolutionary precursors of the first prokaryote cells). Dr Oparin did not refer to RNA or DNA molecules since at that time it was not clear just how important the role of these molecules was in living organisms. However he did form a solid base for the idea of self-replication as a collective property of molecular compounds.
Science more recently demonstrated that sets of chemical components store information about their composition which can be duplicated and transmitted to their descendants. This has led to their being named "compound genomes" or composomes. In other words, heredity does not require information to be stored in RNA or DNA molecules. These "compound genomes" apparently fulfil the conditions required to be considered evolutionary units, which suggests a pathway from pre-Darwinian dynamics to a minimum protocell.
Researchers in this study nevertheless reveal that these systems are incapable of undergoing a Darwinian evolution. For the first time a rigorous analysis was carried out to study the supposed evolution of these molecular networks using a combination of numerical and analytical simulations and network analysis approximations. Their research demonstrated that the dynamics of molecular compound populations which divide after having reached a critical size do not evolve, since during this process the compounds lose properties which are essential for Darwinian evolution.
Researchers concluded that this fundamental limitation of "compound genomes" should lead to caution towards theories that set metabolism first as the origin of life, even though former metabolic systems could have offered a stable habitat in which primitive polymers such as RNA could have evolved.
Researchers state that different prebiotic Earth scenarios can be considered. However, the basic property of life as a system capable of undergoing Darwinian evolution began when genetic information was finally stored and transmitted such as occurs in nucleotide polymers (RNA and DNA).”
Science Daily 
Inference To Best Explanation:
Evolution, as inferred from the fossil record, cannot be verified empirically. Due to its historical nature and its unpredictable mutational nature, the justification for the concept of evolution is necessarily inferential only. The fossil data show only the existence of certain organisms which are found in certain geological layers; their relationships are inferred, not presented physically by the fossils. And inferences taken from modern DNA differentials and similarities between species show just that: differentials and similarities. The presumption of historical relationships is inferred.
For an excellent example of rampant inference, there is the book, “Why Evolution is True”, by Jerry Coyne, from which the following is taken:
“But if you think a bit, it’s not hard to come up with intermediate stages in the evolution of flight, stages that might have been useful to their possessors. Gliding is the first step. And gliding has evolved independently many times: in placental mammals, marsupials, and even lizards. Flying squirrels do quite well by gliding with flaps of skin that extend along their sides – a good way to get from tree to tree to escape predators or find nuts.”
This train of “thought” is completely fact-free with respect to the actual development of flight, which is obviously factually unknown; it is a flight of fancy which is projected as plausibility instead of fact. In the words of Stephen Jay Gould, it is an imaginary Just So Story. And yes, such stories are actually not hard to come up with; objective facts are hard to come up with.
How is inference justified as a suitable path to an actual explanation which is more than just remotely plausible, more than just probable by Bayesian probability inference forcing? In Peter Lipton’s terms, what makes an inference “lovely” rather than just “likely”? Lipton has categorized inductive justifications in his book, “Inference to the Best Explanation”.  Here is the briefest possible summary of the pertinent concepts:
1. Underdetermination and circularity.
An explanation which does not cover all the observed characteristics of a phenomenon is “underdetermined”, and is not robust enough to serve as a complete or useful explanation.
An explanation which requires the phenomenon to be true a priori is circular.
2. Causality; complete or massively underdetermined?
For a causal explanation, all events found within the phenomenon being described must be accounted for. An incomplete causal explanation is not useful.
3. Likeliness vs. loveliness. 
A likely explanation is one that plausibly accounts for all the effects found in a phenomenon. A lovely explanation is one that provides the greatest knowledge about a phenomenon. Likeliness and loveliness might coincide, and they might not. If they do not coincide, Lipton chooses loveliness over likeliness, although that is a debatable choice.
4. Failure to accept implausibilities.
If some aspects of the explanation for a hypothesis are extraordinarily unlikely, implausible, or even impossible, failure to adapt the inference or drop it altogether is irrational. (This is my interpretation.)
5. Contrastive Inference.
Comparing two possible inferential explanations is called “contrastive”; at this point one chooses between the two if they are not congruent. Lipton again prefers the “lovely” explanation over the “likeliest” explanation, because the likeliest “tends toward triviality”.  Still, either type of explanation would fail, if it is incomplete or exhibits an explicit failure to explain a portion of the phenomenon.
6. False Reductivity.
Oversimplification based on ideology. Insertion of ideology as a necessary pre-condition cannot be used as a legitimate sorting process. Lipton does allow the use of “prior beliefs”, which presumably are adequately screened by proper inferential methodology. But as Einstein said, make it as simple as possible, but not simpler.
Inferential Plausibility, And Skepticism
Remember that inferences of all types depend upon the assertion of the “plausibility” of the claim being made. And plausibility is defined thus:
“Full Definition of PLAUSIBLE
1: superficially fair, reasonable, or valuable but often specious <a plausible pretext>
2: superficially pleasing or persuasive <a swindler… , then a quack, then a smooth, plausible gentleman — R. W. Emerson>
3: appearing worthy of belief <the argument was both powerful and plausible>”
The words “superficial” and “appearance” always apply to inferences which are claimed to be “plausible”. Remember that all evolutionary claims are inferential (frequently chained two or three or more deep). So evolutionary plausibility is credible only in the minds of those inclined a priori to be believers. It takes just the slightest skepticism to notice the superficiality and the mere appearance of credibility which evolutionists present in evolution’s defense, rather than the justifiable, objective, empirical knowledge which empirical science was created both to demand and to provide. And skepticism is fully warranted not merely due to the conceptual “superficiality” of cascaded inferences, but also due to the massive improbability of such complexities having merely emerged, for no teleological reason whatsoever and for no discernible deterministic reasons either.
Consequences for evolutionary inferences.
Evolution is heavily invested in Philosophical Materialism, which famously was invoked as “not allowing a divine foot in the door” by Richard Lewontin:
“Our willingness to accept scientific claims that are against common sense is the key to an understanding of the real struggle between science and the supernatural. We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism. It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door. The eminent Kant scholar Lewis Beck used to say that anyone who could believe in God could believe in anything. To appeal to an omnipotent deity is to allow that at any moment the regularities of nature may be ruptured, that miracles may happen.”
The ideology is plainly stated; it dismisses (actually prohibits), without rational deductive reasoning, any contrary position on the matter. Evolution, as a collection of untestable, therefore unfalsifiable, yet “natural” hypotheses, can be extended in all directions to cover any and all eventualities. This is the case because the theory of evolution is itself so malleable that it encompasses every conceivable material causation, and requires many that are actually rationally inconceivable as well (accidental, non-teleological assembly of semantic information-bearing molecules comes to mind, as do evolutionary landscapes). And despite all those theoretical gyrations, it is completely without causal predictive power, and thus fails as a deterministic, determinative explanation.
In fact, even the very need for an explanation is inferred from the fossil record, not directly observed. Relationships are subjectively inferred from the fossils, and then the need for an explanation of those relationships is inferred in turn. This is a double inference underlying the basic premises themselves, before the inferential explanations even start. The result has been the wildly subjective “Just So Stories” which are subjectively inferred as plausible solutions. The resulting triple inference is hardly projectable as a source of physical reality, much less of objective fact or even Truth.
Still, is evolution a “lovely” explanation, in Lipton’s terms? It explains everything and nothing simultaneously and with equal vigor. It is claimed to be the unifying principle of biology, yet it explains no biological system which is found in modern organisms, except in vague generalities of “Well, it evolved – it’s here, isn’t it?” It is very difficult to attribute "loveliness" to such a non-explanatory, ideology-driven, non-falsifiable set of changeable stories.
Is it the case that an explanation with essentially zero likelihood can still be “lovely”? Its adherents deny the impossibility of the evolution of the first cell from information-free, dead minerals, into the necessarily complete, animated systems which are metabolically balanced and sustained, ready for the complex restructuring required during mitosis, and all accidentally self-assembled. That impossibility is mathematically expressed in many calculations by many different sources. And the need for evolution’s protection from contrary opinion is obvious by the assertion of a logically failed but narrative-controlling ideology (Philosophical Materialism) and legally protected by court action subsidized by the ACLU and other ideologists.
What evolution has in its favor is just this: it is a material theory. However flawed, it is necessary to the ideology expressed by Lewontin, above: Materialist Atheism must have a materialist theory of origins. Since materialism is absolutely required when the existence of an intelligent input is denied and locked out of any intellectual debate, no amount of deterministic impossibility will deter or derail the Materialist true believer from accepting the “possibilities of evolution” from minerals to mind. That makes evolution a religion: blind belief in the face of the failure of any supporting facts. Unlike the voluntary materialistic sciences, such as physics and, yes, modern biology, the imposition of involuntary materialism onto the investigation of the source of life does not meet with useful results in terms of knowledge. In fact, it results in the equivalent of faerie tales, where “scientists” see what is not there and refer to it as plausible fact, even “Truth”. That amounts to fraud, and is not science.
“Once more, there is <b>no universal biological law</b> which applies precisely and automatically to every living thing. There are only <b>directions</b> in which life throws out species in general. Each particular species, in the very act by which it is constituted, affirms its independence, follows its caprice, deviates more or less from the straight line, sometimes even remounts the slope and seems to turn its back on the original direction.”
Henri Bergson 
“In Orgel’s last written words to the origin-of-life community, he admonishes advocates of each scenario: ‘Solutions offered by supporters of geneticist or metabolist scenarios that are dependent on ‘if pigs could fly’ hypothetical chemistry are unlikely to help’”.
Rana , Orgel 
None of this is intended to say that invoking the appearance of intelligence in the composition of living organisms should result in stopping any investigation whatsoever. To the contrary, it would open up investigation which is now closed due to ideology, and possibly reduce the loss of assets wasted on fruitless materialist-only pursuits.
1. Cooper/Hausman; The Cell: A Molecular Approach; Boston University, Sinauer Press; 2013; p589,590.
2. Ibid, p590; “For example, integrins and cadherins function not only as cell adhesion molecules but also as signaling molecules that regulate cell proliferation and survival in response to cell-cell and cell-matrix contacts.”
3. Ibid p590.
4. Ibid p600, G protein-coupled receptors; “More than 1000 such G-protein-coupled receptors have been identified, including the receptors for eicosanoids, many neurotransmitters, neuropeptides, and peptides. In addition, the G-protein-coupled receptor family includes a large number of receptors that are responsible for smell, sight, and taste.”
5. Atlas; Principles of Microbiology; University of Louisville; Mosby Press; 1995; p418; T-cell receptors detect foreign antigen signals regarding other cells.
7. Hubert P. Yockey; “Information Theory, Evolution, and the Origin of Life”; Cambridge University Press; 2005; p2.
8. Kuppers, p170.
9. Werner Gitt; “Without Excuse”; Creation Pubs; p38.
10. Yockey, p29.
11. Bernd Olaf Kuppers; Information and Communication in Living Matter; From Information and the Nature of Matter, Davies/Gregerson, eds, Cambridge University Press, 2010; p 179.
12. Gitt; p39
13. Yockey p107, 108
14. J.C. Sanford; Genetic Entropy & the Mystery of the Genome; https://www.youtube.com/watch?feature=player_embedded&v=eY98io7JH-c
15. ENCODE project website: http://www.genome.gov/27532724
16. Josiah Seaman; “DNA.EXE: a Sequence Comparison… genome and Computer Code”; in “Biological Information, New Perspectives”, Marks, et. al., eds.; World Scientific Pubs, 2013; P397-8.
21. Sanford, Baumgardner and Brewer; "Selection Threshold Severely Constrains Capture of Beneficial Mutations”; in “Biological Information, New Perspectives”; World Scientific pubs; 2013; p288.
22. Sanford, Ibid; p283.
23. Stephen C. Meyer; "Signature in the Cell"; Harper One pubs; 2009; p211 -213.
27. Edward Archer; “The Childhood Obesity Epidemic as a Result of Nongenetic Evolution: The Maternal Resources Hypothesis”; Mayo Clinic Proceedings; 2015; http://www.mayoclinicproceedings.org/article/S0025-6196%2814%2900740-X/fulltext
28. John C. Lennox; “God’s Undertaker”; Lion Books, 2009; P155
29. Stuart Kauffman; “The Origins of Order, Self-organization and Selection in Evolution”; Oxford University Press; 1993; p 180
30. Stuart Kauffman; Ibid.
31. John C. Lennox; "God’s Undertaker"; Lion Hudson Pubs; 2009; p133.
32. Erwin Schroedinger; “What is Life?”; Cambridge University Press; 1944/2012.
33. Kuppers, p170.
34. Alessandro Minelli; “Evo-devo Does NOT change the Neo-Darwinist Paradigm”; in “Contemporary Debates in Philosophy of Biology”, edited by Ayala and Arp; Wiley-Blackwell pubs; 2010; p223.
38. Marie E. Csete, John C. Doyle; “Reverse Engineering of Biological Complexity”; Science 295 (2002): 1664. http://web.williams.edu/Mathematics/sjmiller/public_html/legos/Science-2002-Csete-1664-9.pdf
39. Dean L. Overman; “A Case Against Accident and Self-Organization”; Rowman and Littlefield, pubs; 1997; p89.
40. Yockey, p145.
42. Cooper/Hausman, p653.
43. Cooper/Hausman; p659
44. Yockey; p 145.
45. Mauro Santos; University of Barcelona; in Science Daily; http://www.sciencedaily.com/releases/2010/01/100108101433.htm
46. Jerry Coyne; “Why Evolution is True”; Penguin/Viking; 2010; p39.
47. Peter Lipton; “Inference to the Best Explanation”; International Library of Philosophy; Routledge pubs; 1991/2004.
He notes that his arguments for inference to the best explanation are themselves “contrastive inference” arguments. (p208) This leaves the argument for inference in a circular presuppositional sort of belief system, supported with individuated circumstantial instances rather than with universal rules.
48. Lipton, Ibid, p61.
49. Csete, et. al.; Ibid, p60, 61.
50. Richard Lewontin; “Billions and Billions of Demons”; New York Review of Books; Jan. 9, 1997. http://www.nybooks.com/articles/archives/1997/jan/09/billions-and-billions-of-demons/
51. Jerry Coyne; “Why Evolution is True”; Penguin/Viking; 2010.
52. Jerry Coyne; Ibid.
53. Henri Bergson; “The Evolution of Life – Mechanism and Teleology” (1911); in “Emergence, Complexity, and Self-Organization”, Juarrero and Rubino, eds.; Emergent Pubs, 2010; p69.
54. Fazale Rana; “Creating Life in the Lab”; Baker Books, pubs; 2011; p180.
55. Leslie E. Orgel; “The Implausibility of Metabolic Cycles on the Prebiotic Earth”; PLoS Biology 6 (Jan22, 2008); http://www.plosbiology.org/article/info%3Adoi%2F10.1371%2Fjournal.pbio.0060018
Oh boy, I will have to save this article and re-read it several more times.
Quick question, Stan, and perhaps I missed it in the article: are there any examples of inanimate objects which have more complexity than any living organism?
The typical Atheist claim is that complexity produces life. So if there is a more complex object than a cell or living organism, then that theory is immediately falsified. It also seems that complexity is a very subjective concept. What may seem complex to one is simple to another. For example, Aquinas thought God was the most simple being, yet most Atheists will reject God because he is too complex to comprehend.
"Complexity means life" is a fairy tale meant for support of Materialism and Atheism. This Materialist/Atheist fairy tale is a god replacement (source of origin) which has no evidence, is not based on observation (not inductively known), cannot be deduced from first principles, cannot be used to develop an hypothesis from general rules produced by induction (is not testable), so it is not falsifiable, and is not ever capable of generating knowledge of objective fact. So it is a religious proposition, having passed all the intellectual requirements for a religion. But it is not just any religion, it is the false story religion, made up to combat actual religions.
There is a difference between complexity and order, and between order and life. The difference is this: complexity includes both randomness and irreducible semantic information. A billion-bit random chain is complex, as is a billion-bit semantic chain. Order is generally simple and is reducible to a suitable algorithm. Neither randomness nor semantic information is reducible to an algorithm.
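That order/randomness distinction can be illustrated using compressed size as a rough stand-in for algorithmic reducibility. A sketch (the strings and sizes are illustrative assumptions):

```python
import os
import zlib

# Ordered data reduces to a short rule; random data does not.
ordered = b"AB" * 50_000        # 100,000 bytes of simple repetition
random_ = os.urandom(100_000)   # 100,000 bytes of randomness

print(len(zlib.compress(ordered)))  # a few hundred bytes: highly reducible
print(len(zlib.compress(random_)))  # roughly 100,000 bytes: irreducible
```

Semantic text sits between the two: partially compressible, but its meaning, unlike its character statistics, is not captured by any compression rule.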
Life - cellular life, probably including the first cell - requires a specific type of complexity: (a) semantic information (b) which is useful (teleological), (c) simultaneously available in (d) the thousands of different useful, semantic, feedback communication systems, which (e) all (or most) contain separate semantic languages understood by two or more functional entities through dedicated, noiseless communication channels, and which (f) are animated by directed forces which are not the deterministic four forces, and which (g) perform all the separate functions necessary for anentropic life, including timed functions of extra complexity such as perfect separation during mitosis.
Also included is the necessity for the prior creation of a containing membrane with the special features of allowing nutrient passage one way and effluent passage the other way, while maintaining within the cell the necessary and sufficient contents for all the cycles required by life. These include the DNA/RNA/RNA polymerase complex, plus the many other complex enzymes necessary for the reproduction of DNA, RNA, and the enzymes themselves (a self-contained cycle of necessity, where all the components must exist prior to the ability to create any of the components), and the metabolic cycle and its components, which must exist prior to any metabolism as well.
This sort of listing of initial complexity can go on and on, and maybe a complete listing is necessary at some point (except that all of the functions are not yet even known, even for simple prokaryotes).
There is one more salient issue: dead cells are complex, too. The same complexity exists in a dead cell as exists in a live cell. So what is missing in a dead cell? None of the components, except one: life.
So complexity, even purposeful, teleological, semantic complexity is not sufficient to produce life, because life is separate from the complex material structure and can leave the complex material structure, rendering it dead.
Complexity does not arise very far in deterministic nature. Kimura's unselectability has shown that entropy prevents it. And it cannot be shown that semantic complexity arises at all. It is not even possible to produce semantic RNA from simple atoms and forcing functions in the lab, even with intelligent direction and all the proper conditions. Szostak abandoned trying to get RNA to exist and replicate after decades of trying with his team, and he is now "searching for a simpler molecule" to get it to replicate.
Science, mathematics, reality and common sense are all against the "complexity" argument.
In fact, arguing "life arises from complexity" is an argument from ignorance, having zero support from any quarter, and necessitated purely by the intellectual lock-down of Materialist diktat, which limits the zone of investigation ideologically.
Thanks for the elucidation.
Deterministic Complexity is an oxymoron, since determinism requires specific particles to produce a specific effect/event. Given the exact state of prior affairs, it will always produce the exact same results. Complexity, on the other hand, is random and requires decoding before it can pass as intelligible information. Unfortunately, since Materialists have pigeon-holed complexity as a fixed state, it is therefore in a perpetual state of unintelligibility. Yet life as we know it is fundamentally rational and intelligent, contradicting the Materialist's conclusion.
"Deterministic complexity" is an interesting term. And that's right, it would produce only random, unintelligible, non-semantic, non-algorithmic, code-free sets. That's completely due to the random positioning/momentum of electrons at the time that the initial conditions are determined. With random initial conditions, the consequence of any forcing function would produce random results the next moment (one Planck time?) later.
Edward Feser writes that some materialists claim that reality might always devolve to a future physics, if not to the present physics. He points out that a future physics might actually find that non-material causation is a requirement of the new physics, thereby invalidating current materialist reductionism. Materialism is always a Just So Story which can be overcome easily with different, better Just So Stories. What Materialists don't have, and cannot have, is material evidence which proves their Materialist claims in a material, empirical fashion, providing objective justification for their belief system - which is necessarily blind ideology.