Not a friendly face in the room. Not a one.

Bonner called the session to order, Bonner talked about high feelings over the tragic accident, Bonner talked about the stress of a job that called on men to risk their lives, talked about God and country.

Blue-sky language. Blue-sky thinking. Up to an Earther didn’t refer to phase fields, war was two districts on a plane surface in a dispute over territory, and the United Nations was a faction-ridden single-star-system organization trying to tell merchanter Families what their borders were: explain borders to them, first.

You had to see a planet through optics and think flat surface to imagine how ground looked. He hadn’t laid eyes on a planet til he was half-grown. He never had figured out the emotional context, except to compare it to ship or station, but there was something about being fixed in place next to permanent neighbors that sounded desperately unnatural. Which he supposed was prejudice on his side. Bonner talked about a righteous war. And he thought about ports and ships run by Cyteen’s tape-trained humanity, with mindsets more alien than Earth’s.

Bonner talked about human stress and interactive systems, while he thought about the Cluster off Cyteen, where startides warped space, and a ghosty malfunction on the boards you hoped to God was an artifact of that space, while a Union spotter was close to picking up your presence.

Bonner got Helmond Weiss on the mike to read the medical report. Telemetry again. More thorough than the post-mortem on the ship. Less printout. Four human beings hadn’t output as much in their last minutes as that struggling AI had. Depressing thought.

Then the psych lads took the mike. “Were Wilhelmsen’s last decisions rational?” the committee asked point-blank. And the psychs said, hauling up more charts and graphs, “Increasing indecision,” and talked about hyped senses, maintained that Wilhelmsen had gone on hyperfocus overload and lost track of actual time-flow—

... making decisions at such speed in such duration, it was pure misapprehension of the rate at which things were happening. No, you couldn’t characterize it as panic....

“... evidence of physiological distress, shortness of breath, increase in REM and pulse rate activated a medical crisis warning with the AI—”

“The carrier’s AI didn’t have time to reach the rider?” a senator asked.

“And get the override query engaged and answered, no, there wasn’t time.”

Playback of the final moments on the tape. The co-pilot, Pete Fowler, the last words on the tape Fowler’s, saying, “Hold it, hold it—”

That overlay the whole reorientation and firing incident, at those speeds. The panel had trouble grasping that. They spent five minutes arguing it, and maybe, Graff thought, still didn’t realize the sequence of events, or that it was Fowler protesting the original reorientation.

You didn’t have time to talk. Couldn’t get a word out in some sequences, and not this one. Fowler shouldn’t have spoken. Part of it was his fault. Shouldn’t have spoken to a strange pilot, who didn’t know his contexts, who very well knew they didn’t altogether trust him.

The mike went to Tanzer. A few final questions, the committee said. And a senator asked the question:

“What was the name of the original pilot?”

“Dekker. Paul Dekker. Trainee.”

“What was the reason for removing him from the mission?”

“Seniority. He was showing a little stress. Wilhelmsen was the more experienced.”

Like hell.

“And the crew?”

“Senator, a crew should be capable of working with any officer. It was capable. There were no medical grounds there. The flaw is in the subordination of the neural net interface. It should be constant override with concurrent input from the pilot. The craft’s small cross-section, its minimum profile, the enormous power it has to carry in its engines to achieve docking at highest v—all add up to sensitive controls and a very powerful response....”

More minutiae. Keep my mouth shut or not? Graff asked himself. Trust Tanzer? Or follow orders?

Another senator: “Did the sims run the same duration as the actual mission?”

Not lately, Graff thought darkly, while Tanzer said, blithely, “Yes.”

Then a senator said: “May I interject a question to Lt. Graff.”

Bonner didn’t like that. Bonner frowned, and said, “Lt. Graff, I remind you you’re still under oath.”

“Yes, sir.”

The senator said, “Lt. Graff. You were at the controls of the carrier at the time of the accident. You were getting telemetry from the rider.”

“Yes, sir.”

“The medical officer on your bridge was recorded as saying Query out.”

“That’s correct.”

“What does that mean?”

“It means she’d just asked the co-pilot to assess the pilot’s condition and act. But the accident was already inevitable. Just not enough time.”

Blinks from the senator, attempt to think through the math, maybe. “Was the carrier too far back for safety?”

“It was in a correct position for operations. No, sir.”

“Was the target interval set too close? Was it an impossible shot?”

“No. It was a judgment shot. The armscomper doesn’t physically fire all the ordnance, understand. He sets the priorities at the start of the run and adjusts them as the situation changes. A computer does the firing, with the pilot following the sequence provided by his co-pilot and the longscanner and armscomper. The pilot can violate the armscomper’s priorities. He might have to. There are unplotteds out there, rocks, for instance. Or mines.”

“Did Wilhelmsen violate the priorities?”

“Technically, yes. But he had that choice.”

“Choice. At those speeds.”

“Yes, sir. He was in control until that point. He knew it was wrong, he glitched, and he was out. Cold.”

“Are you a psychiatrist, lieutenant?”

“No, sir, but I suggest you ask the medical officer. There was no panic until he heard his crew’s alarm. That spooked him. Their telemetry reads alarm—first, sir. His move startled them and he dropped out of hype.”

“The lieutenant is speculating,” Bonner said. “Lt. Graff, kindly keep to observed fact.”

“As a pilot, sir, I observed these plain facts in the medical testimony.”

“You’re out of order, lieutenant.”

“One more question,” the senator said. “You’re saying, lieutenant, that the tetralogic has faults. Would it have made this mistake?”

“No, but it has other flaws.”

“Specifically?”

“Even a tetralogic is recognizable, to similar systems. Machine can counter machine. Human beings can make decisions these systems don’t expect. Longscan works entirely on that principle.”

“Are you a computer tech?”

“I know the systems. I personally would not go into combat with a computer totally in charge.”

The senator leaned back, frowning. “Thank you, lieutenant.”

“May I make an observation?” Tanzer asked, and got an indulgence and a nod from Bonner.

Tanzer said: “Let me say this is an example of the kind of mystical nonsense I’ve heard all too much of from this service. Whatever your religious preferences, divine intervention didn’t happen here, Wilhelmsen didn’t stay conscious long enough to apply the human advantage. Human beings can’t defy physics; and the lieutenant sitting behind his carrier’s effect shields can maintain that spacers are somehow evolved beyond earthly limitations and make their decisions by mysterious instincts that let them outperform a tetralogic, but in my studied and not unexpert opinion, there’s been altogether too much emphasis in recruitment based on entry-level skills and certain kinds of experience— meaning a practical exclusion of anyone but Belters. The lieutenant talks about some mysterious unquantifiable mentality that can work at these velocities. But I’d like to say, and Dr. Weiss will back me on this, that there’s more than button-pushing ability and reflexes that make a reliable military. There is, very importantly, attitude. There’s been no background check into volunteers on this project...”