One of the things that makes mass production possible is the large-scale integration of nearly identical parts. Much engineering design is based on this principle, which is taken to extremes in microelectronics; a modern microprocessor contains several hundred million transistors, every one of which needs to be manufactured to very tight tolerances if the device is to work at all. One might think that similar considerations would apply to biology. After all, the key components of biological nanotechnology – the proteins that make up most of the nanoscale machinery of the cell – are specified by the genetic code down to the last atom, and in many cases fold into a unique three-dimensional configuration. It turns out, though, that this is not the case; biology actually has sophisticated mechanisms whose entire purpose is to introduce extra variation into its components.
This point was forcefully made by Dennis Bray in a 2003 article in Science magazine called Molecular Prodigality (PDF version available from Bray’s own website). Protein sequences can be chopped and changed after the DNA code has been read, by processes of RNA editing and splicing and by other post-transcriptional and post-translational modifications, and these can lead to distinct changes in the operation of machines made from these proteins. Bray cites as an example the potassium channels in squid nerve axons; one of the component proteins can be altered by RNA editing at up to 13 distinct sites, changing the channel’s operating parameters. He calculates that the random combination of all these possibilities means that there are 4.5 × 10^15 subtly different possible types of potassium channel. This isn’t an isolated example; Bray estimates that up to half of human structural genes allow some such variation, with the brain and nervous system being particularly rich in molecular diversity.
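Bray doesn’t show the arithmetic in the passage summarised above, but his figure is easy to reproduce under one plausible reading: the channel assembles as a tetramer, and each of its four subunits can independently carry any combination of the 13 edits. A minimal back-of-envelope sketch in Python, with those assumptions flagged in the comments:

```python
# Back-of-envelope check on Bray's 4.5 x 10^15 figure. The assumptions
# (tetrameric channel; 13 independent, two-state editing sites per
# subunit) are a plausible reading of the article, not stated above.

SITES_PER_SUBUNIT = 13    # RNA editing sites, each either edited or not
SUBUNITS_PER_CHANNEL = 4  # potassium channels assemble as tetramers

variants_per_subunit = 2 ** SITES_PER_SUBUNIT                    # 8,192
channel_variants = variants_per_subunit ** SUBUNITS_PER_CHANNEL  # 2^52

print(f"{channel_variants:.2e}")  # 4.50e+15
```

With those assumptions the count is 2^(4 × 13) = 2^52 ≈ 4.5 × 10^15, matching the number Bray quotes.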
It isn’t at all clear what all this variation is for, if anything. One can speculate that some of this variability has evolved to increase the adaptability of organisms to unpredictable changes in environmental conditions. This is certainly true in the case of the adaptive immune system. A human has the ability to make 10^12 different types of antibody, using combinatorial mechanisms to generate a huge library of different molecules, each of which has the potential to recognise characteristic target molecules on pathogens that we’ve yet to be exposed to. This is an example of biology’s inherent complexity; human engineering, in contrast, strives for simplicity.
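The order of magnitude of the antibody repertoire is again combinatorial. The sketch below is a rough textbook-style estimate of human V(D)J recombination, not a calculation from the post; all the segment counts and the junctional-diversity factor are approximate assumptions:

```python
# Rough combinatorics of human antibody diversity (V(D)J recombination).
# All numbers below are approximate textbook values, used here only to
# show how combinatorial assembly of a modest parts list reaches ~10^12.

V_H, D_H, J_H = 50, 25, 6  # heavy-chain V, D, J gene segments (approx.)
V_L, J_L = 70, 9           # light-chain V, J segments, kappa + lambda pooled

heavy_chains = V_H * D_H * J_H          # ~7,500 combinations
light_chains = V_L * J_L                # ~630 combinations
pairings = heavy_chains * light_chains  # ~4.7 x 10^6 from segment choice alone

# Imprecise joining and nucleotide insertion at the segment junctions
# multiply this hugely; ~10^6 is a commonly quoted rough factor.
JUNCTIONAL_FACTOR = 1e6

print(f"{pairings * JUNCTIONAL_FACTOR:.1e}")  # 4.7e+12, of order 10^12
```

The point, as with Bray’s potassium channels, is that a small set of parts plus free combination yields an astronomically large library.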
I would add something to the statement that engineering strives for simplicity, along the lines of
“Make everything as simple as possible, but not simpler”
where the constraint “as simple as possible” is given by the functional requirements of the object to be created. Great works of engineering are those that, through underlying simple design mechanisms, achieve great complexity or expressiveness in the final result. The cases you cite in biology (and indeed DNA itself) are great examples of this, basically using a reduced set of primitives that, through combination, give rise to a huge variety of results. So both engineering and nature seem to share a love of what I would in general describe as compact representations.
This does not mean we should ignore the random element of evolutionary search, which will no doubt make things much messier.
Whilst biology is complex and evolution relies on variation occurring, nature, in seeking perfection, is nevertheless always striving for the simplest solution.
> One of the things that makes mass production possible is
> the large-scale integration of nearly identical parts.
> Much engineering design is based on this principle...
>
> [In contrast,] ... biology actually has sophisticated
> mechanisms whose entire purpose is to introduce extra variation
> into its components.
The neuroscientist Gerald M. Edelman makes a big deal of the indispensability of biological variability in the functioning of the brain in his series of books on what he calls “neural Darwinism”.
From a mailing-list article I once wrote on the topic of Edelman:
(the book-title abbreviations are
BABF _Bright Air, Brilliant Fire: On The Matter Of The Mind_
http://www.amazon.com/Bright-Air-Brilliant-Fire-Matter/dp/0465007643
ND _Neural Darwinism: The Theory of Neuronal Group Selection_
http://www.amazon.com/Neural-Darwinism-Theory-Neuronal-Selection/dp/B0014DNJGO
RP _Remembered Present: A Biological Theory Of Consciousness_
http://www.amazon.com/Remembered-Present-Biological-Theory-Consciousness/dp/046506910X
UoC _A Universe Of Consciousness: How Matter Becomes Imagination_
http://www.amazon.com/Universe-Consciousness-Matter-Becomes-Imagination/dp/0465013775 )
An important property of ... neuronal groups ... is a certain degree of random variation in the connections within and between them, which nevertheless does not compromise their functional adequacy. Edelman calls this lack of precise structural specificity “degeneracy” (defined in UoC on p. 86 as “the capacity of structurally different components to yield similar outputs or results”). He claims that it occurs at many levels of organization in the brain (ND p. 58 [Table 3.3]), and is an inevitable outcome of selectional processes as well as providing a source of variability that can be the basis for further ongoing selection (UoC p. 86; ND p. 46; RP pp. 50, 52-53). Degeneracy is not merely tolerated but is essential to the brain’s operation, “to provide the overlapping but nonidentical response characteristics needed to cover a universe of possible stimuli” (ND p. 50; RP p. 242).
An outcome of degeneracy is that there can be many alternative means and pathways, competitively selected out of a large population of variant possibilities, which can accomplish the same functional task more or less adequately. The detailed physical structures which participate in a given task will therefore vary stochastically among brains, depending on the vagaries of chance and personal history: no two brains (even those of identical twins) will contain identical populations of neurons or be wired identically (ND p. 34 [Fig. 2.6]; BABF pp. 25, 26 [Fig. 3-5]). Furthermore, the information contained in the human genome would be insufficient to precisely specify the synaptic structure of the developing brain (BABF p. 224). Therefore, it certainly cannot be the case that the brain is precise and “hardwired” like a computer (BABF p. 27).
...
Edelman stresses his belief that the analogy which has repeatedly been drawn during the past fifty years between digital computers and the human brain is a false one (BABF p. 218), stemming largely from “confusions concerning what can be assumed about how the brain works without bothering to study how it is physically put together” (BABF p. 227). The lavish, almost profligate, morphology exhibited by the multiple levels of degeneracy in the brain is in stark contrast to the parsimony and specificity of present-day human-made artifacts, composed of parts whose variability is deliberately minimized, and whose components are chosen from a relatively limited number of categories of almost identical units. Statistical variability among (say) electronic components occurs, but it’s usually merely a nuisance that must be accommodated, rather than an opportunity that can be exploited as a fundamental organizational principle, as Edelman claims for the brain. In human-built computers, “the small deviations in physical parameters that do occur (noise levels, for example) are ignored by agreement and design” (BABF p. 225). “The analogy between the mind and a computer fails for many reasons. The brain is constructed by principles that ensure diversity and degeneracy...” (BABF p. 152).
...
In a biological system, much of the physical complexity needed to support primary consciousness is inherent in the morphology of biological cells, tissues, and organs, and it isn’t clear that this morphology can be easily dismissed: “[Are] artifacts designed to have primary consciousness ... *necessarily* confined to carbon chemistry and, more specifically, to biochemistry (the organic chemical or chauvinist position)[?] The provisional answer is that, while we cannot completely dismiss a particular material basis for consciousness in the liberal fashion of functionalism, it is probable that there will be severe (but not unique) constraints on the design of any artifact that is supposed to acquire conscious behavior. Such constraints are likely to exist because there is every indication that an intricate, stochastically variant anatomy and synaptic chemistry underlie brain function and because consciousness is definitely a process based on an immensely intricate and unusual morphology” (RP pp. 32-33).