"Only nobody knows what the ideal soldier is like.

"Some say he ought to respond to orders with perfect accuracy and superhuman reflexes. Others say he ought to be able to think his way out of trouble, or improvise in a situation where his orders no longer apply, just like a human soldier. The ones who want the perfect automaton don't want him tQ be smart enough to realize he is an automaton-probably because they're afraid of the idea; and the ones who want him to be capable of human discretion don't want him to be human enough to be rebellious in a hopeless situation.

"And that's just the beginning. COMASAMPS may be a combined project, but if you think the Navy isn't checking up on the Army, and vice versa, with both of them looking over the Air Force's shoulder-Oh, you know that squirrel cage as well as I do!"

Russell gestured hopelessly. Heywood, who had been taking calm puffs on his cigarette, shrugged. "So? All we have to do is tinker around until we can design a sample model to fit each definition. Then they can run as many comparative field tests as they want to. It's their problem. Why let it get you?"

Russell flung his cigarette to the floor and stepped on it with all his weight. "Because we can't do it and you ought to know it as well as I do!" He pointed over to me. "There's your prototype model. He's got all the features that everybody wants-and cut-offs intended to take out the features that interfere with any one definition. We can cut off his individuality, and leave him the automaton some people want. We can leave him his individuality, cut off his volition, and give him general orders which he is then free to carry out by whatever means he thinks best. Or, we can treat him like a human being-educate him by means of tapes, train him, and turn him loose on a job, the way we'd do with a human being."

The uneven tone built up in his voice as he finished what he was saying.

"But, if we reduce him to a machine that responds to orders as though they were pushbuttons, he's slow. He's pitifully slow, Vic, and he'd be immobilized within thirty seconds of combat. There's nothing we can do about that, either. Until somebody learns how to push electricity through a circuit faster than the laws of physics say it should go, what we'll have will be a ponderous, mindless thing that's no better than the remote-control exhibition jobs built forty years ago.

"All right, so that's no good. We leave him individuality, but we restrict it until it cuts his personality down to that of a slave. That's better. Under those conditions, he would, theoretically, be a better soldier than the average human. An officer could tell him to take a patrol out into a certain sector, and he'd do the best possible job, picking the best way to handle each step of the job as he came to it. But what does he do if he comes back, and the officer who gave him the orders is no longer there? Or, worse yet, if there's been a retreat, and there's nobody there? Or an armistice? What about that armistice? Can you picture this slave robot, going into stasis because he's got no orders to cover a brandnew situation?

"He might just as well not have gone on that patrol at all-because he can't pass on whatever he's learned, and because his job is now over, as far as he's concerned. The enemy could overrun his position, and he wouldn't do anything about it. He'd operate from order to order. And if an armistice were signed, he'd sit right where he was until a technician could come out, remove the soldier-orientation tapes, and replace them with whatever was finally decided on.

"Oh, you could get around the limitation, all right-by issuing a complex set of orders, such as: 'Go out on patrol and report back. If I'm not here, report to so-and-so. If there's nobody here, do this. If that doesn't work, try that. If such-and-such happens, proceed as follows. But don't confuse such-and-such with that or this" Can you imagine fighting a war on that basis? And what about that reorientation problem? How long would all those robots sit there before they could all be serviced-and how many man-hours and how much material would it take to do the job? Frankly, I couldn't think of a more cumbersome way to run a war if I tried.

"Or, we can build all our robots, like streamlined Pimmys-like Pimmy when all his circuits are operating, without our test cutoffs. Only, then, we'd have artificial human beings. Human beings who don't wear out, that a hand-arm won't stop, and who don't need food or water as long as their power piles have a pebble-sized hunk of plutonium to chew on."

Russell laughed bitterly. "And Navy may be making sure Army doesn't get the jump on them, with Air Force doing its bit, but there's one thing all three of them are as agreed upon as they are about nothing else-they'll test automaton zombies and they'll test slaves, but one thing nobody wants us turning out is supermen. They've got undercover men under every lab bench, all keeping one eye on each other and one on us-and the whole thing comes down on our heads like a ton of cement if there's even the first whisper of an idea that we're going to build more Pimmys. The same thing happens if we don't give them the perfect soldier. And the only perfect soldier is a Pimmy. Pimmy could replace any man in any armed service-from a KP to a whole general staff, depending on what tapes he had. But he'd have to be a true individual to do it. And he'd be smarter than they are. They couldn't trust him. Not because he wouldn't work for the same objectives as they'd want, but because he'd probably do it in some way they couldn't understand.

"So they don't want any more Pimmys. This one test model is all they'll allow, because he can be turned into any kind of robot they want, but they won't take the whole Pimmy, with all his potentialities. They just want part of him."

The bitter laugh was louder. "We've got their perfect soldier, but they don't want him. They want something less-but that something less will never be the perfect soldier. So we work and work, weeks on end, testing, revising, redesigning. Why? We're marking time. We've got what they want, but they don't want it-but if we don't give it to them soon, they'll wipe out the project. And if we give them what they want, it won't really be what they want. Can't you see that? What's the matter with you, Heywood? Can't you see the blind alley we're in-only it's not a blind alley, because it has eyes, eyes under every bench, watching each other and watching us, always watching, never stopping, going on and never stopping, watching, eyes?"

Heywood had already picked up the telephone. As Russell collapsed completely, he began to speak into it, calling the Project hospital. Even as he talked, his eyes were coldly brooding, and his mouth was set in an expression I'd never seen before. His other hand was on Russell's twitching shoulder, moving gently as the other man sobbed.

August 25, 1974

Ligget is Heywood's new assistant. It's been a week since Russell's been gone.

Russell wasn't replaced for three days, and Heywood worked alone with me. He's engineer of the whole project, and I'm almost certain there must have been other things he could have worked on while he was waiting for a new assistant, but he spent all of his time in this lab with me.

His face didn't show what he thought about Russell. He's not like Ligget, though. Heywood's thoughts are private. Ligget's are hidden. But, every once in a while, while Heywood was working, he'd start to turn around and reach out, or just say "Jack-," as if he wanted something, and then he'd catch himself, and his eyes would grow more thoughtful.

I only understood part of what Russell had said that night he was taken away, so I asked Heywood about it yesterday.

"What's the trouble, Pim?" he asked.

"Don't know, for sure. Too much I don't understand about this whole thing. If I knew what some of the words meant, I might not even have a problem."

"Shoot."

"Well, it's mostly what Russell was saying, that last night."

Heywood peeled a strip of skin from his upper lip by catching it between his teeth. "Yeah."

"What's a war, or what's war? Soldiers have something to do with it, but what's a soldier? I'm a robot-but why do they want to make more of me? Can 1 be a soldier and a robot at the same time? Russell kept talking about 'they,' and the Army, the Air Force, and the Navy. What're they? And are the CIC men the ones who are watching you and each other at the same time?"

Heywood scowled, and grinned ruefully at the same time. "That's quite a catalogue," he said. "And there's even more than that, isn't there, Pimmy?" He put his hand on my side and sort of patted me, the way I'd seen him do with a generator a few times. "O.K., I'll give you a tape on war and soldiering. That's the next step in the program anyway, and it'll take care of most of those questions."

"Thanks," I said. "But what about the rest of it?"

He leaned against a bench and looked down at the floor. "Well, 'they' are the people who instituted this program-the Secretary of Defense, and the people under him. They all agreed that robot personnel were just what the armed services needed, and they were right. The only trouble is, they couldn't agree among themselves as to what characteristics were desirable in the perfect soldier-or sailor, or airman. They decided that the best thing to do was to come up with a series of different models, and to run tests until they came up with the best one.

"Building you was my own idea. Instead of trying to build prototypes to fit each separate group of specifications, we built one all-purpose model who was, effectively speaking, identical with a human being in almost all respects, with one major difference. By means of cut-offs in every circuit, we can restrict as much of your abilities as we want to, thus being able to modify your general characteristics to fit any one of the various specification groups. We saved a lot of time by doing that, and avoided a terrific nest of difficulties.

"Trouble is, we're using up all the trouble and time we saved. Now that they've got you, they don't want you. Nobody's willing to admit that the only efficient robot soldier is one with all the discretionary powers and individuality of a human being. They can't admit it, because people are afraid of anything that looks like it might be better than they are. And they won't trust what they're afraid of. So, Russell and I had to piddle around with a stupid series of tests in a hopeless attempt to come up with something practical that was nevertheless within the limitations of the various sets of specifications-which is ridiculous, because there's nothing wrong with you, but there's plenty wrong with the specs. They were designed by people who don't know the first thing about robots or robot thought processes-or the sheer mechanics of thinking, for that matter."