Distance is not the only gradient about which one can make this kind of argument. Twilight is another. At dead of night, almost nothing can be seen, and even a very crude resemblance of mimic to model will pass muster. At high noon, only a meticulously accurate mimic may escape detection. Between these times, at daybreak and dusk, in the gloaming or just on a dull overcast day, in a fog or in a rainstorm, a smooth and unbroken continuum of visibilities obtains. Once again, resemblances of gradually increasing accuracy will be favored by natural selection, because for any given goodness of resemblance there will be a level of visibility at which that particular goodness of resemblance makes all the difference. As evolution proceeds, progressively improving resemblances confer survival advantage, because the critical light intensity for being fooled becomes gradually brighter.

A similar gradient is provided by angle of vision. An insect mimic, whether good or bad, will sometimes be seen out of the corner of a predator's eye. At other times it will be seen in merciless full-frontal aspect. There must be an angle of view so peripheral that the poorest possible mimic will escape detection. There must be a view so central that even the most brilliant mimic will be in danger. Between the two is a steady gradient of view, a continuum of angles. For any given level of perfection of mimicry, there will be a critical angle at which slight improvement or disimprovement makes all the difference. As evolution proceeds, resemblances of smoothly increasing quality are favored, because the critical angle for being fooled becomes gradually more central.

Quality of enemies' eyes and brains can be regarded as yet another gradient, and I have already hinted at it in earlier parts of this chapter. For any degree of resemblance between a model and a mimic, there is likely to be an eye that will be fooled and an eye that will not be fooled. Again, as evolution proceeds, resemblances of smoothly increasing quality are favored, because predator eyes of greater and greater sophistication are being fooled. I don't mean that the predators are evolving better eyes in parallel to the improving mimicry, though they might. I mean that there exist, somewhere out there, predators with good eyes and predators with poor eyes. All these predators constitute a danger. A poor mimic fools only the predators with poor eyes. A good mimic fools almost all the predators. There is a smooth continuum in between.

Mention of poor eyes and good eyes brings me to the creationist's favorite conundrum. What is the use of half an eye? How can natural selection favor an eye that is less than perfect? I have treated the question before and have laid out a spectrum of intermediate eyes, drawn from those that actually exist in the various phyla of the animal kingdom. Here I shall incorporate eyes in the rubric I have established of theoretical gradients. There is a gradient, a continuum, of tasks for which an eye might be used. I am at present using my eyes for recognizing letters of the alphabet as they appear on a computer screen. You need good, high-acuity eyes to do that. I have reached an age when I can no longer read without the aid of glasses, at present quite weakly magnifying ones. As I get older still, the strength of my prescription will steadily mount. Without my glasses, I shall find it gradually and steadily harder to see close detail. Here we have yet another continuum – a continuum of age.

Any normal human, however old, has better vision than an insect. There are tasks that can be usefully accomplished by people with relatively poor vision, all the way down to the nearly blind. You can play tennis with quite blurry vision, because a tennis ball is a large object, whose position and movement can be seen even if it is out of focus. Dragonflies' eyes, though poor by our standards, are good by insect standards, and dragonflies can hawk for insects on the wing, a task about as difficult as hitting a tennis ball. Much poorer eyes could be used for the task of avoiding crashing into a wall or walking over the edge of a cliff or into a river. Eyes that are even poorer could tell when a shadow, which might be a cloud but could also portend a predator, looms overhead. And eyes that are still poorer could serve to tell the difference between night and day, which is useful for, among other things, synchronizing breeding seasons and knowing when to go to sleep. There is a continuum of tasks to which an eye might be put, such that for any given quality of eye, from magnificent to terrible, there is a level of task at which a marginal improvement in vision would make all the difference. There is therefore no difficulty in understanding the gradual evolution of the eye, from primitive and crude beginnings, through a smooth continuum of intermediates, to the perfection we see in a hawk or in a young human.

Thus the creationist's question – “What is the use of half an eye?” – is a lightweight question, a doddle to answer. Half an eye is just 1 percent better than 49 percent of an eye, which is already better than 48 percent, and the difference is significant. A more ponderous show of weight seems to lie behind the inevitable supplementary: “Speaking as a physicist, [5] I cannot believe that there has been enough time for an organ as complicated as the eye to have evolved from nothing. Do you really think there has been enough time?” Both questions stem from the Argument from Personal Incredulity. Audiences nevertheless appreciate an answer, and I have usually fallen back on the sheer magnitude of geological time. If one pace represents one century, the whole of Anno Domini time is telescoped into a cricket pitch. To reach the origin of multicellular animals on the same scale, you'd have to slog all the way from New York to San Francisco.
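The pacing image above can be checked with simple arithmetic. The pace length and the date chosen for the first multicellular animals below are rough assumptions of mine, picked only to illustrate the scale, not figures from the text:

```python
# One pace stands for one century. How far do the two distances in the
# analogy actually stretch? (Pace length and the 550-million-year date
# are illustrative assumptions, not the author's exact figures.)
pace_m = 0.9                       # assume a walking pace of about 0.9 meters
anno_domini_years = 2_000          # the whole of Anno Domini time
multicellular_years = 550_000_000  # rough age of the first multicellular animals

ad_distance_m = anno_domini_years / 100 * pace_m
origin_distance_km = multicellular_years / 100 * pace_m / 1000

print(ad_distance_m)       # 18.0 m, about the length of a cricket pitch
print(origin_distance_km)  # 4950.0 km, roughly New York to San Francisco
```

A cricket pitch is about 20 meters and the New York to San Francisco road distance is on the order of 4,700 km, so the analogy holds to within the roughness of the assumptions.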

It now appears that the shattering enormity of geological time is a steamhammer to crack a peanut. Trudging from coast to coast dramatizes the time available for the evolution of the eye. But a recent study by a pair of Swedish scientists, Dan Nilsson and Susanne Pelger, suggests that a ludicrously small fraction of that time would have been plenty. When one says “the” eye, by the way, one implicitly means the vertebrate eye, but serviceable image-forming eyes have evolved between forty and sixty times, independently from scratch, in many different invertebrate groups. Among these forty-plus independent evolutions, at least nine distinct design principles have been discovered, including pinhole eyes, two kinds of camera-lens eyes, curved-reflector (“satellite dish”) eyes, and several kinds of compound eyes. Nilsson and Pelger have concentrated on camera eyes with lenses, such as are well developed in vertebrates and octopuses.

How do you set about estimating the time required for a given amount of evolutionary change? We have to find a unit to measure the size of each evolutionary step, and it is sensible to express it as a percentage change in what is already there. Nilsson and Pelger used the number of successive changes of 1 percent as their unit for measuring changes of anatomical quantities. This is just a convenient unit – like the calorie, which is defined as the amount of energy needed to do a certain amount of work. It is easiest to use the 1 percent unit when the change is all in one dimension. In the unlikely event, for instance, that natural selection favored bird-of-paradise tails of ever-increasing length, how many steps would it take for the tail to evolve from one meter to one kilometer in length? A 1 percent increase in tail length would not be noticed by the casual bird-watcher. Nevertheless, it takes surprisingly few such steps to elongate the tail to one kilometer – fewer than seven hundred.
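The "fewer than seven hundred" figure follows from compound growth: each 1 percent step multiplies the length by 1.01, so the question is how many such multiplications produce a thousandfold increase. A minimal check:

```python
import math

# Each evolutionary step multiplies tail length by 1.01 (a 1 percent increase).
# How many steps take a tail from 1 meter to 1 kilometer, a 1000-fold change?
# We need the smallest n with 1.01**n >= 1000, i.e. n >= log(1000) / log(1.01).
start_m = 1.0
target_m = 1000.0

steps = math.ceil(math.log(target_m / start_m) / math.log(1.01))
print(steps)  # 695, fewer than seven hundred, as the text says
```

The counterintuitive smallness of the answer is just the other face of exponential growth: 1 percent per step compounds quickly.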

Elongating a tail from one meter to one kilometer is all very well (and all very absurd), but how do you place the evolution of an eye on the same scale? The problem is that in the case of the eye, lots of things have to go on in lots of different parts, in parallel. Nilsson and Pelger's task was to set up computer models of evolving eyes to answer two questions. The first is essentially the question we posed again and again in the past several pages, but they asked it more systematically, using a computer: Is there a smooth gradient of change, from flat skin to full camera eye, such that every intermediate is an improvement? (Unlike human designers, natural selection can't go downhill – not even if there is a tempting higher hill on the other side of the valley.) Second – the question with which we began this section – how long would the necessary quantity of evolutionary change take?
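The constraint in the parenthesis above, that selection accepts only improvements and can never cross a valley toward a higher peak, is the behavior of a hill-climbing search. The fitness function and step rule below are illustrative assumptions of mine, not Nilsson and Pelger's actual model; the sketch only shows the "no downhill moves" rule in action:

```python
import random

def hill_climb(fitness, x, step=0.01, trials=10_000, seed=0):
    """Repeatedly propose a small random change; keep it only if it improves
    fitness. Downhill moves are never accepted, mimicking natural selection."""
    rng = random.Random(seed)
    for _ in range(trials):
        candidate = x + rng.uniform(-step, step)
        if fitness(candidate) > fitness(x):  # only uphill moves survive
            x = candidate
    return x

# Toy one-dimensional "optical quality" landscape with a single smooth
# peak at x = 1.0 (a stand-in for the smooth gradient of eye designs):
acuity = lambda x: -(x - 1.0) ** 2
final = hill_climb(acuity, x=0.0)
print(round(final, 2))  # climbs steadily from 0.0 toward the optimum at 1.0
```

On a smooth single-peaked landscape this search reliably reaches the top, which is why the first question, whether every intermediate from flat skin to camera eye is an improvement, matters so much: a valley anywhere along the path would stop the climb.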

[5] I hope this does not give offense. In support of my point, I cite the following from Science and Christian Belief, by a distinguished physicist, the Reverend John Polkinghorne (1994, p. 16): “Someone like Richard Dawkins can present persuasive pictures of how the sifting and accumulation of small differences can produce large-scale developments, but, instinctively, a physical scientist would like to see an estimate, however rough, of how many steps would take us from a slightly light-sensitive cell to a fully formed insect eye, and of approximately the number of generations required for the necessary mutations to occur.”