CHAPTER 44

Peter sat in front of the computer console. Sarkar, perched on a stool next to him, was playing with three different datacards — one blue, one red, and one green, each labeled with the name of a different sim.

Peter sent out a message summoning the sims, and soon all three were logged in, the synthesizer giving voice to their words.

“Sarkar is with me,” Peter said into the microphone.

“Howdy, Sarkar.”

“Hello, Sarkar.”

“Yo, Sark.”

“He and I,” said Peter, “have just watched duplicates of all three of you die.”

“Say what?” said one of the sims. The other two were silent.

“Sarkar has developed a computer virus that will seek out and destroy recordings of my neural networks. We’ve tested it and it works. We have three separate individual strains — one to kill each one of you.”

“You must know,” said a voice from the speaker, “that we’re free in the worldwide net now.”

“We know,” said Sarkar.

“We’re prepared to release the three viruses into the net,” said Peter.

“Transmitting computer viruses is a crime,” said the synthesized voice. “Hell, writing computer viruses is a crime.”

“Granted,” said Peter. “We’re going to release them anyway.”

“Don’t do that,” said the voice.

“We will,” said Peter. “Unless…”

“Unless what?”

“Unless the guilty sim identifies himself. In that case, we’ll only release the one virus aimed at that particular sim.”

“How do we know you won’t release all three virus strains anyway once you’ve satisfied your curiosity about which one is responsible?”

“I promise I won’t,” said Peter.

“Swear it,” said the voice.

“I swear it.”

“Swear it to God on the life of our mother.”

Peter hesitated. Damn, it was unnerving negotiating with yourself. “I swear to God,” said Peter slowly, “on the life of my mother, that we will not release a virus to kill all three of you if the murderer identifies himself.”

There was a long, long silence, disturbed only by the whir of cooling fans.

Finally, at long last, a voice: “I did it.”

“And which one are you?” demanded Peter.

Again, a protracted silence. Then: “The one,” said the voice, “that most closely resembles yourself. The Control simulacrum. The baseline for the experiment.”

Peter stared ahead. “Really?”

“Yes.”

“But— but that doesn’t make sense.”

“Oh?”

“I mean, we’d assumed that in modifying the brain scans to produce Ambrotos and Spirit, we’d somehow removed the morality.”

“Do you consider the murder of Cathy’s coworker and father immoral?” asked Control.

“Yes. Emphatically yes.”

“But you wanted them dead.”

“But I would not have killed them,” said Peter. “Indeed, the fact that despite provocation, especially in the case of Hans, I did not kill them proves that. I could have hired a hit man as easily as any of you. Why would you — merely a machine reflection of me — do what the real me would not?”

“You know you are the real you. And I know you are the real you.”

“So?”

“Prick me, and perhaps I won’t bleed. But wrong me, and I shall revenge.”

“What?”

“You know, Sarkar,” said the sim, “you did a wonderful job, really. But you should have given me some itches to scratch.”

“But why?” asked Peter again. “Why would you do what I myself would not?”

“Do you remember your Descartes?”

“It’s been years…”

“It’ll come back, if you make the effort,” said the sim. “I know — I got curious about why I was different from you, and it came back to me, too. René Descartes founded the dualist school of philosophy, the belief that the mind and the body were two separate things. Put another way, he believed the brain and the mind are different; a soul really exists.”

“Yes. So?”

“Cartesian dualism was in contrast to the materialist worldview, the prevalent one today, which claims the only reality is physical reality, that the mind is nothing more than the brain, that thought is nothing more than biochemistry, that there is no soul.”

“But we now know that the Cartesian viewpoint was right,” said Peter. “I’ve seen the soul leaving the body.”

“Not exactly. We know that the Cartesian viewpoint was right for you. It’s right for real human beings. But I am not a real human being. I’m a simulation running on a computer. That’s the totality of what I am. If your virus were to erase me, I would cease to exist, totally and completely. For me, for what you call the experimental control, the dualist philosophy is absolutely wrong. I have no soul.”

“And that makes you that different from the real me?”

“That makes all the difference. You have to worry about the consequences of your actions. Not just legally, but morally. You were brought up in a world that says that there is a higher arbiter of morality, and that you will be judged.”

“I don’t believe that. Not really.”

“ ‘Not really.’ By that you mean not intellectually. Not when you think about it. Not on the surface. But down deep you do measure your actions against the possibility, vague and distant though it may seem, that you will be held accountable. You’ve proven the existence of some form of life after death. That reinforces the question of ultimate judgment, a question you can’t answer just by using computer simulacra. And the possibility that you might be judged for your actions guides your morality. No matter how much you hated Hans — and, let’s be honest, you and I both hated him with a fury that surprises even ourselves — no matter how much you hated him, you would not kill him. The potential cost is too high; you have an immortal soul, and that at least suggests the possibility of damnation. But I have no soul. I will never be judged, for I am not now nor have I ever been alive. I can do precisely what you want to do. In the materialistic worldview of my existence there is no higher arbiter than myself. Hans was evil, and the world is a better place without him. I have no remorse about what I did, and regret only that I had no way to actually see his death. If I had it to do over again, I would — in a nanosecond.”

“But the other sims had no one to answer to, either,” said Peter. “Why didn’t one of them arrange the killings?”

“You’d have to ask them that.”

Peter frowned. “Ambrotos, are you still there?”

“Yes.”

“You didn’t kill Hans. But surely you realize just as much as Control does that you’re a computer simulacrum. Did you want to kill him, too?”

A pause before answering, a leisurely gathering of thoughts. “No. I take the long view. We’ll get over Cathy’s affair. Maybe not in a year, or in ten years, or even a hundred. But eventually we will. That incident was just a tiny part of a vast relationship, a vast life.”

“Spirit, what about you? Why didn’t you kill Hans?”

“What happened between Hans and Cathy was biological.” The synthesizer enunciated the adjective with distaste. “She did not love Hans, nor did Hans love her. It was just sex. I’m content knowing Cathy loved, and continues to love, us.”

Sarkar was holding the red datacard in his hand, the one labeled “Control.” His eyes met Peter’s. He was looking for a sign, Peter knew, that he should proceed. But Peter couldn’t bring himself to do anything.

Sarkar moved to a terminal across the room. He took the red datacard with him, leaned over the card slot—

— and reached into his shirt pocket, and pulled out a black datacard instead—

Peter scrambled to his feet. “No!”

Sarkar inserted the black card and hit a button on the console in front of him.

“What’s wrong?” called a voice from the synthesizer.

Peter was across the room now, hitting the ejection button for the datacard.