“Do you remember the various simulations Dr. Lentrall performed, and the data upon which they were based?”

“Yes,” Kaelor said again. “I remember everything.”

A whole series of questions she dared not ask flickered through her mind, along with the answers she dared not hear from Kaelor. Like a chess player who could see checkmate eight moves ahead, she knew how the questions and answers would go, almost word for word.

Q: If you remember everything, you recall all the figures and information you saw in connection with your work with Dr. Lentrall. Why didn’t you act to replace as many of the lost datapoints as possible last night when Dr. Lentrall discovered his files were gone? Great harm would be done to his work and career if all those data were lost for all time.

A: Because doing so would remind Dr. Lentrall that I witnessed all his simulations of the Comet Grieg operation and that I therefore remembered the comet’s positional data. I could not provide that information, as it would make the comet intercept and retargeting possible, endangering many humans. That outweighed the possible harm to one man’s career.

Q: But the comet impact would enhance the planetary environment, benefiting many more humans in the future, and allowing them to live longer and better lives. Why did you not act to do good to those future generations?

A: I did not act for two reasons. First, I was specifically designed with a reduced capacity for judging the Three-Law consequences of hypothetical circumstances. I am incapable of considering the future and hypothetical well-being of human beings decades or centuries from now, most of whom do not yet exist. Second, the second clause of the First Law merely requires me to prevent injury to humans. It does not require me to perform any acts in order to benefit humans, though I can perform such acts if I choose. I am merely compelled to prevent harm to humans. Action compelled by First Law supersedes any impulse toward voluntary action.

Q: But many humans now alive are likely to die young, and die most unpleasantly, if we do not repair the climate. By preventing the comet impact, there is a high probability you are condemning those very real people to premature death. Where is the comet? I order you to tell me its coordinates, mass, and trajectory.

A: I cannot tell you. I must tell you. I cannot tell you-

And so on, unto death.

It would have gone on that way, if it had lasted even that long. Either the massive conflict between First and Second Law compulsions would have burned out his brain, or else Kaelor would have invoked the second clause of First Law. He could not, through inaction, allow harm to humans.

Merely by staying alive, with the unerasable information of where the comet was in his head, he represented a danger to humans. As long as he stayed alive, there was, in theory, a way to get past the confidentiality features of Kaelor’s brain assembly. There was no way Fredda could do it here, now, but in her own lab, with all her equipment, and with perhaps a week’s time, she could probably defeat the safeties and tap into everything he knew.

And Kaelor knew that, or at least he had to assume it was the case. In order to prevent harm to humans, Kaelor would have to will his own brain to disorganize, disassociate, lose its positronic pathing.

He would have to will himself to die.

That line of questioning would kill him, either through Law-Conflict burnout or compelled suicide. He was still perilously close to both deaths as it was. Maybe it was time to take some of the pressure off. She could reduce at least some of the stress produced by Second Law. “I release you from the prohibition against volunteering information and opinions. You may say whatever you wish.”

“I spent all of last night using my hyperwave link to tie into the data network and rebuild as many of Dr. Lentrall’s work files as possible, using my memories of various operations and interfaces with the computers to restore as much as I could while remaining in accordance with the Three Laws. I would estimate that I was able to restore approximately sixty percent of the results-level data, and perhaps twenty percent of the raw data.”

“Thank you,” said Lentrall. “That was most generous of you.”

“It was my duty, Dr. Lentrall. First Law would not permit me to abstain from an action that could prevent harm to a human.”

“Whether or not you had to do it, you did it,” said Lentrall. “Thank you.”

There was a moment’s silence, and Kaelor looked from Lentrall to Fredda and back again. “There is no need for these games,” he said. “I know what you want, and you know thhhat I I I knowww.”

Lentrall and Fredda exchanged a look, and it was plain Lentrall knew as well as she did that it was First Law conflict making it hard for Kaelor to speak.

Kaelor faced a moral conundrum few humans could have dealt with well. How to decide between probable harm and death to an unknown number of persons, and the misery and the lives ruined by the ruined planetary climate? And it is my husband who must decide, Fredda told herself, the realization a sharp stab of pain. If we succeed here, I am presenting him with that nightmare choice. She thrust that line of thought to one side. She had to concentrate on Kaelor, and the precious knowledge hidden inside him.

Fredda could see hope sliding away as the conflicts piled up inside the tortured robot’s mind. “We know,” she said at last, admitting defeat. “And we understand. We know that you cannot tell us, and we will not ask.” It was pointless to go further. It was inconceivable that Kaelor would be willing or able to tell them, or that he would survive long enough to do so, even if he tried.

Lentrall looked at Fredda in surprise, and then relief. “Yes,” he said. “We will not ask. We see now that it would be futile to do so. I thought Dr. Leving might have some trick, some technique, some way of learning the truth without destroying you, but I see that I was wrong. We will not ask this of you, and we will not seek to gain the knowledge from you in other ways. This is our promise.”

“I join in this promise,” Fredda said.

“Hu-hu-humansss lie,” Kaelor said.

“We are not lying,” Fredda said, her voice as urgent as she could make it. “There would be nothing we could gain by asking you, and thus no motive for lying.”

“Yourrrr promisse does-does-does not apply to other humans.”

“We will keep the fact of what you know secret,” Lentrall said, a note of hysteria in his voice. “Kaelor, please! Don’t!”

“I tried tooo kee-keep the fact of wwwhat I knewww secret,” said Kaelor, “but yoooou realized that I had seeen what I saw, and that I woullld remember.” He paused a moment, as if to gather the strength to speak again. “Othhers could do the same,” he said in a voice that was suddenly little more than a whisper. “I cannot take thhat channnce.”

“Please!” Davlo cried out. “No!”

“Remaininng alivvve represents inaction,” Kaelor said, his voice suddenly growing stronger as he reached his decision. “I must act to prevent harm to humans.”

His eyes glowed brighter, his gaze turned from Davlo to Fredda, as if looking at each of them one last time, and then he looked straight ahead, at the wall, at nothing at all, at infinity. There was a low-pitched hum, the smell of burning insulation, and suddenly the light was gone from his eyes. His head sagged forward, and a thin wisp of smoke curled up from the base of his neck.

The room was silent. Fredda and Davlo looked at each other, and at the dead thing hanging on the frame in the center of the room.

“By all the forgotten gods,” Fredda whispered. “What have we done?”

“You did nothing, Doctor,” said Davlo, his voice nothing but a whisper as he fought to hold back a sob. “Nothing but help me do what I would have done. But as for me,” he said, his voice close to cracking, “I’ll tell you what I’ve done.”