JOHN E. MAYFIELD, professor emeritus, Iowa State University
  Department of Genetics, Development, and Cell Biology

B.A. Physics, The College of Wooster, 1963
M.S. Biophysics, The University of Pittsburgh, 1965
Ph.D. Biophysics, The University of Pittsburgh, 1968

e-mail: jemayf@iastate.edu

Blog site: johnemayfield.com

Evolution and the generation of complexity have long interested biologists and many non-biologists. Scientific understanding of these subjects has advanced to the point where much of the mystery can now be dispelled. To make this understanding available to non-specialists, I have written a book that explains in non-technical terms how a generalized notion of evolution accounts for, and makes possible, much of the complexity we encounter in our everyday lives.

The Engine of Complexity: Evolution as Computation, Columbia University Press, 2013.

The book weaves together four threads to explain much of the worldly complexity we encounter in a natural and surprisingly simple way. The first thread is that evolution is not simply a biological phenomenon but also operates in other venues. A straightforward way to generalize evolution, so that all the examples we know are seen to employ the same underlying mechanism, is to describe it as a particular class of computations. When this is done, biological evolution, plant and animal breeding, antibody maturation, evolutionary and genetic algorithms, social and technical evolution, and even some brain functions are all seen to rest on the same information-processing strategy. I call this computational strategy the engine of complexity.
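The cycle of copying, variation, and selection described above can be sketched in a few lines of code. The following is my own minimal illustration of the idea, not code from the book; the bit-string representation, the fitness function, and all parameter values are hypothetical choices made for the example.

```python
import random

random.seed(0)  # fixed seed so the illustrative run is reproducible

def fitness(individual):
    # A hypothetical selection criterion: count the 1-bits.
    return sum(individual)

def engine_cycle(population, n_outputs, mutation_rate=0.02):
    """One engine-of-complexity cycle: copy with variation, then select.

    The m inputs of the next cycle are chosen from the n outputs of this
    one, so m is always less than or equal to n.
    """
    outputs = []
    for _ in range(n_outputs):
        parent = random.choice(population)        # copy a randomly chosen input
        child = [bit ^ (random.random() < mutation_rate) for bit in parent]
        outputs.append(child)                     # ...with occasional variation
    outputs.sort(key=fitness, reverse=True)       # selection
    return outputs[: max(1, n_outputs // 2)]      # survivors feed the next cycle

# Start from ten all-zero strings and iterate the cycle.
population = [[0] * 50 for _ in range(10)]
for cycle in range(200):
    population = engine_cycle(population, n_outputs=20)

print(fitness(population[0]))  # best score after 200 cycles (typically the maximum, 50)
```

Nothing in the loop "knows" what the target string is; purposeful information accumulates solely because varied copies are filtered through a selection criterion each cycle.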

Figure. Diagram of the engine of complexity computation. This simple computational strategy defines evolution in a more general way than does biological evolution and can be shown to underlie various non-biological as well as biological phenomena. The superscripts t and t+1 indicate the cycle number; the number of inputs to a given cycle, m^(t+1), must always be less than or equal to n^t, the number of outputs of the previous cycle.

The second thread of the book is that most complex things in our experience were created through the use of instructions. Instructions (or recipes, blueprints, or algorithms) embody extra information beyond that inherent in the laws of chemistry and physics. They are used to create configurations of matter far too improbable to occur without specification. Instructions embody purposeful information, and purposeful information has to come from somewhere; it can't simply be conjured from nothing. The reason for this is the probability problem, the third thread of the book. The probability problem is well illustrated by the difficulty of tossing 100 consecutive heads with a coin. To do this, you would need to flip the coin roughly 1,000,000,000,000,000,000,000,000,000,000 times. Tossing 1,000 consecutive heads (or any other specific ordering of 1,000 flips) is quite impossible, because the lifetime of the universe is not long enough. The same difficulty faces anyone wishing to assemble the thousands of bits of information that make up all but the shortest instructions, provided they do not know in advance what information to put in.
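The arithmetic behind these figures is easy to check. A run of 100 specified coin flips has probability 2^-100, so on average about 2^100 attempts are needed; this is my own back-of-the-envelope check, confirming the "roughly 10^30" figure quoted above.

```python
# Expected number of attempts for 100 specified flips in a row:
attempts_100 = 2 ** 100
print(f"{attempts_100:.3e}")        # 1.268e+30, i.e. roughly 10**30

# For 1,000 consecutive heads the number dwarfs any physical timescale:
attempts_1000 = 2 ** 1000
print(len(str(attempts_1000)))      # a 302-digit number of attempts
```

For comparison, the universe is on the order of 10^17 seconds old, so even flipping billions of coins per second leaves the 302-digit requirement hopelessly out of reach.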

The fourth thread is that the engine of complexity computation has the remarkable property that it can efficiently assemble large bodies of purposeful information even when nothing is known in advance about what information is needed. When the engine is employed, the time needed to generate 100 purposeful bits (the same problem as flipping 100 consecutive heads) is reduced to just over 1,000 flips, in contrast to the 1,000,000,000,000,000,000,000,000,000,000 flips otherwise required. Fundamentally, engine-of-complexity computations extract from random choices the information that is pertinent to whatever selection criteria are in play. I lay out the case that all but the simplest instructions owe their existence to some physical manifestation of the engine of complexity, and because of this, much of the complexity that characterizes our existence depends directly on it.
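The dramatic reduction can be demonstrated with a simple cumulative-selection scheme of my own devising (the book's scheme differs in detail, so the exact flip count will differ too, but the point is the same): flip 100 coins, keep any that come up heads, and reflip only the rest.

```python
import random

random.seed(1)  # fixed seed so the illustrative run is reproducible

def flips_to_all_heads(n_coins=100):
    """Count total flips until all coins show heads, keeping heads as found."""
    undecided, total_flips = n_coins, 0
    while undecided:
        total_flips += undecided
        # Each undecided coin comes up heads with probability 1/2;
        # count how many remain tails and must be flipped again.
        undecided = sum(random.random() < 0.5 for _ in range(undecided))
    return total_flips

result = flips_to_all_heads()
print(result)  # on the order of a few hundred flips, not 2**100
```

Selection (keeping each head as it appears) turns an exponential search into a roughly linear one: the expected cost is about 2 flips per coin rather than 2^100 flips overall.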

The book is written for a non-technical audience and strives also to introduce principles of computer science to non-computer scientists.

Published work on complexity:

John E. Mayfield (2007) "Minimal History, a Theory of Plausible Explanation," Complexity 12, 48-53.

John E. Mayfield (2004) "Evolution as Computation," in Evolutionary Theory.

D. Ashlock and J.E. Mayfield (1998) "Acquisition of General Adaptive Features by Evolution," in Evolutionary Programming VII, V.W. Porto et al., eds., pp. 75-84.

Published work on the molecular genetics of disease causing organisms:

CV references numbered 17-21, 23, 25-31, 33, 34, 37-40, 42, 44, 45

Published work on chromosome structure:

CV references numbered 4 through 16

Curriculum Vitae

Iowa State Links:

Department of Genetics, Development and Cell Biology

Interdepartmental Bioinformatics and Computational Biology Major

Interdepartmental Genetics Graduate Major

Dan Ashlock

Jack Lutz

Iowa State University Graduate College