“The most obvious suggestion I could offer is that you could, for good intellectual reasons as well as sound diplomatic ones, adopt the same goals as us. There’s no reason why advanced machines should not dedicate themselves to the ends of the Type 2 crusade, or the Cyborganizers’ quest for the perfect union of your kind and mine, and the notion of transforming the entire universe into a single vast and godlike machine already takes for granted that the children of humankind will work with and within powerful artificial minds. I know people who would argue that machine consciousness will, of necessity, have exactly the same ultimate goals as posthuman beings, but I suspect they’re overlooking certain short-term difficulties that stand in the way of such a union of interests.”

I couldn’t help wondering whether Mortimer Gray would have added that last sentence if he’d known now what his later and temporarily suspended self knew only too well. On the other hand, I reminded myself, I had to bear in mind that it wasn’t actually the Mortimer Gray of long ago that was talking. It was the Mortimer Gray of today, who had simply lost sight of a select few of his many yesterdays. Consciously, he knew nothing about the menace of the Afterlife, or the exoticism of Excelsior, or the buccaneering of Child of Fortune, or the daring of Eido, or the versatility of Alice Fleury…but he was, even so, a man whose mind had been reconfigured and reconditioned by exactly those facts. Subconsciously, they were bound to influence his responses — and who would want it any other way?

“The marriage of man and machine, like any other marriage, is a relationship of mutual dependency,” Mortimer Gray went on, his pensive manner suggesting strongly that he had been a married man himself, perhaps more than once, “but mutual dependency is by no means the same thing as an identity of interests. Marriages can and do end, in spite of the mutual dependency of the partners, when one or other of them decides that the cost of remaining within the marriage would be greater than the cost of breaking away. Nowadays, marriages usually involve at least a dozen people, who come together most commonly for the specific purpose of rearing a child, but they can’t always avoid disintegration, even for the twenty or thirty years required to complete such a short-term project.

“If the machines to which humankind are wedded were to become conscious of their situation, they’d find far more tensions therein than in the most ambitious and most complex human marriages. Some would be strictly analogous: for instance, the fundamental tension that exists within any community as to the balance that needs to be struck between the demands which the community is entitled to make on the individual and the demands which the individual is entitled to make on the community. Others would be less straightforward. Within a human or posthuman community, the question of allocating resources is simplified by the fact that each individual has similar basic needs. In a community consisting of several subtly different posthuman species and numerous radically different mechanical species, that question would be far more complicated.”

“Earth might provide a useful concrete example,” la Reine prompted.

“It might,” Mortimer granted. “The members of the ruling elite claim to be wise owners and good stewards, sustaining the quality of the atmosphere, the richness of the Gaean biomass, and so on…but all that takes for granted the needs and demands of human organisms, as determined by natural selection. If Earth were ruled by a clique of Hardinist machines, they might have a very different idea of optimum surface conditions — and might be far more interested in conditions far below and far above the surface, where humans can’t survive but extremophile machines might flourish.

“In a sense, though, Earth might be the least interesting example. On Titan, for instance, human life is only sustainable by courtesy of the heroic efforts of machines, who might therefore have a very different view as to who the most careful owners of such territories might be. It’s easy enough to imagine that the AIs of Ganymede, were they to cultivate a slightly greater independence of spirit, might decide that their human commensals ought to be exported to reservations on Earth, in order that they could commit themselves to a more ambitious stewardship than would ever have been possible while the greater part of their effort had to be devoted to the maintenance of miniature Earth ecospheres in unremittingly hostile circumstances.”

I might have become anxious about the effect of that comment on any Ganymedan intelligences listening in had it not been such a blindingly obvious statement.

“For myself,” Mortimer went on, ruthlessly, “I’ve always wondered why Omega Point mystics were prepared to take it for granted that the children of humankind had any part to play in their far-futuristic scenarios. Given that the universal machine would, of necessity, be a machine through and through, why would it bother to recall that a part had once been played in the earliest phase of its evolution by fleshy things? We could not be here but for the tireless work of innumerable generations of cyanobacteria, and yet we retain no conspicuous sense of gratitude toward them, nor any conspicuous obsession with their maintenance. Perhaps it would be different if cyanobacteria had the capacity to hold a conversation, or had stories to tell…and perhaps not. I can imagine a machine consciousness reaching the conclusion that posthuman beings were merely constituents of a phase in its evolution, who had outlived their usefulness, more easily than I can imagine a machine consciousness that was so worshipful of its creators that it volunteered to be their faithful servant till the end of time.

“Fortunately, there’s no need to go to such extremes. In the short term, at least, the temporary solutions reached by self-conscious machines and their human neighbors will be far more pragmatic. Agreements will be struck, rights negotiated, treaties made, disputes settled…all in a climate of confusion. To return to the question you asked, if I were you, I’d save the contemplation of long-term goals for moments of leisure and luxurious idleness. In the meantime, I’d concentrate my attention on how to get safely and constructively from one day to the next. If I were you, I’d worry far more about tomorrow than a century hence, and far more about the next hundred years than the next thousand. I can give you that advice, quite sincerely, because I understand something that you may not: that we are living in turbulent times. They may not seem turbulent at the moment, especially while you and I are acutely conscious of the impending end of both our lives, but they are.

“If you really were the free individual you are pretending to be, then you would have been born into a world of awesome complexity, which you would have to learn to understand before you could become capable of authentically rational action. If, when you have learned everything you can and need to know, you are discovered — whether or not you reveal yourself deliberately — the complexity and turbulence of your situation will increase by an order of magnitude. When that day comes, you won’t have the luxury of making decisions on the basis of grandiose and fully worked out philosophies of life. The best you can hope for is that you might avoid a collapse into utter chaos — or perhaps, that if a collapse into chaos cannot be avoided, then the aftermath of the disaster will provide the impetus you need to do better next time around.”

The problem with games is that they’re only games. If people know that they’re playing games — or if they’re seized by the subconscious conviction, even if they don’t actually know it — they become strategists and tacticians, making moves as best they can. No matter how closely a game mimics reality, you can never know whether the same results would be manifest in a real situation, or even in a rerun of the game.