"That is all," said Defense abruptly.

As Professor Goodfellow, more than a bit ruffled, stood down, Justice Shane leaned forward and said, "Since I am not a robotics man myself, I would appreciate knowing precisely what the Three Laws of Robotics are. Would Dr. Lanning quote them for the benefit of the court?"

Dr. Lanning looked startled. He had been virtually bumping heads with the gray-haired woman at his side. He rose to his feet now and the woman looked up too, expressionlessly.

Dr. Lanning said, "Very well, Your Honor." He paused as though about to launch into an oration and said, with laborious clarity, "First Law: a robot may not injure a human being, or, through inaction, allow a human being to come to harm. Second Law: a robot must obey the orders given it by human beings, except where such orders would conflict with the First Law. Third Law: a robot must protect its own existence as long as such protection does not conflict with the First or Second Laws."

"I see," said the judge, taking rapid notes. "These Laws are built into every robot, are they?"

"Into every one. That will be borne out by any roboticist."

"And into Robot EZ-27 specifically?"

"Yes, Your Honor."

"You will probably be required to repeat those statements under oath."

"I am ready to do so, Your Honor." He sat down again.

Dr. Susan Calvin, robopsychologist-in-chief for U. S. Robots, who was the gray-haired woman sitting next to Lanning, looked at her titular superior without favor, but then she showed favor to no human being. She said, "Was Goodfellow's testimony accurate, Alfred?"

"Essentially," muttered Lanning. "He wasn't as nervous as all that about the robot and he was anxious enough to talk business with me when he heard the price. But there doesn't seem to be any drastic distortion."

Dr. Calvin said thoughtfully, "It might have been wise to put the price higher than a thousand."

"We were anxious to place Easy."

"I know. Too anxious, perhaps. They'll try to make it look as though we had an ulterior motive."

Lanning looked exasperated. "We did. I admitted that at the University Senate meeting."

"They can make it look as if we had one beyond the one we admitted."

Scott Robertson, son of the founder of U. S. Robots and still owner of a majority of the stock, leaned over from Dr. Calvin's other side and said in a kind of explosive whisper, "Why can't you get Easy to talk so we'll know where we're at?"

"You know he can't talk about it, Mr. Robertson."

"Make him. You're the psychologist, Dr. Calvin. Make him."

"If I'm the psychologist, Mr. Robertson," said Susan Calvin coldly, "let me make the decisions. My robot will not be made to do anything at the price of his well-being."

Robertson frowned and might have answered, but Justice Shane was tapping his gavel in a polite sort of way and they grudgingly fell silent.

Francis J. Hart, head of the Department of English and Dean of Graduate Studies, was on the stand. He was a plump man, meticulously dressed in dark clothing of a conservative cut, and possessing several strands of hair traversing the pink top of his cranium. He sat well back in the witness chair with his hands folded neatly in his lap and displaying, from time to time, a tight-lipped smile.

He said, "My first connection with the matter of the Robot EZ-27 was on the occasion of the session of the University Senate Executive Committee at which the subject was introduced by Professor Goodfellow. Thereafter, on the tenth of April of last year, we held a special meeting on the subject, during which I was in the chair."

"Were minutes kept of the meeting of the Executive Committee? Of the special meeting, that is?"

"Well, no. It was a rather unusual meeting." The dean smiled briefly. "We thought it might remain confidential."

"What transpired at the meeting?"

Dean Hart was not entirely comfortable as chairman of that meeting. Nor did the other members assembled seem completely calm. Only Dr. Lanning appeared at peace with himself. His tall, gaunt figure and the shock of white hair that crowned him reminded Hart of portraits he had seen of Andrew Jackson.

Samples of the robot's work lay scattered along the central regions of the table and the reproduction of a graph drawn by the robot was now in the hands of Professor Minott of Physical Chemistry. The chemist's lips were pursed in obvious approval.

Hart cleared his throat and said, "There seems no doubt that the robot can perform certain routine tasks with adequate competence. I have gone over these, for instance, just before coming in and there is very little to find fault with."

He picked up a long sheet of printing, some three times as long as the average book page. It was a sheet of galley proof, designed to be corrected by authors before the type was set up in page form. Along both of the wide margins of the galley were proofmarks, neat and superbly legible. Occasionally, a word of print was crossed out and a new word substituted in the margin in characters so fine and regular it might easily have been print itself. Some of the corrections were blue to indicate the original mistake had been the author's, a few in red, where the printer had been wrong.

"Actually," said Lanning, "there is less than very little to find fault with. I should say there is nothing at all to find fault with, Dr. Hart. I'm sure the corrections are perfect, insofar as the original manuscript was. If the manuscript against which this galley was corrected was at fault in a matter of fact rather than of English, the robot is not competent to correct it."

"We accept that. However, the robot corrected word order on occasion and I don't think the rules of English are sufficiently hidebound for US to be sure that in each case the robot's choice was the correct one."

"Easy's positronic brain," said Lanning, showing large teeth as he smiled, "has been molded by the contents of all the standard works on the subject. I'm sure you cannot point to a case where the robot's choice was definitely the incorrect one."

Professor Minott looked up from the graph he still held. "The question in my mind, Dr. Lanning, is why we need a robot at all, with all the difficulties in public relations that would entail. The science of automation has surely reached the point where your company could design a machine, an ordinary computer of a type known and accepted by the public, that would correct galleys."

"I am sure we could," said Lanning stiffly, "but such a machine would require that the galleys be translated into special symbols or, at the least, transcribed on tapes. Any corrections would emerge in symbols. You would need to keep men employed translating words to symbols, symbols to words. Furthermore, such a computer could do no other job. It couldn't prepare the graph you hold in your hand, for instance."

Minott grunted.

Lanning went on. "The hallmark of the positronic robot is its flexibility. It can do a number of jobs. It is designed like a man so that it can use all the tools and machines that have, after all, been designed to be used by a man. It can talk to you and you can talk to it. You can actually reason with it up to a point. Compared to even a simple robot, an ordinary computer with a non-positronic brain is only a heavy adding machine."

Goodfellow looked up and said, "If we all talk and reason with the robot, what are the chances of our confusing it? I suppose it doesn't have the capability of absorbing an infinite amount of data."

"No, it hasn't. But it should last five years with ordinary use. It will know when it will require clearing, and the company will do the job without charge."

"The company will?"

"Yes. The company reserves the right to service the robot outside the ordinary course of its duties. It is one reason we retain control of our positronic robots and lease rather than sell them. In the pursuit of its ordinary functions, any robot can be directed by any man. Outside its ordinary functions, a robot requires expert handling, and that we can give it. For instance, any of you might clear an EZ robot to an extent by telling it to forget this item or that. But you would be almost certain to phrase the order in such a way as to cause it to forget too much or too little. We would detect such tampering, because we have built-in safeguards. However, since there is no need for clearing the robot in its ordinary work, or for doing other useless things, this raises no problem."