PARAGON
by Aubrey Watt
For D—,
the inspiration for all of the good in my characters
and some of the naughty.
Subscribe to my newsletter for free books and discounts
Follow me on Twitter: @AubreyWatt
Copyright 2013 by Aubrey Watt
First Edition: February 2013
ISBN: TBD
Cover design by Aubrey Watt
CHAPTER ONE
“As for morality, well, that's all tied up with the question of consciousness.” -Roger Penrose
***
Her head was on his chest, and she could hear his heart beating through the pale skin to echo hers. His breaths made her entire body rise and fall with his, the rolling motion as regular as the tides that would only cease once the moon fell into the earth, or the earth into the sun.
Which would happen first? How would this planet go, finally? The idea would not stop scratching at her brain, and if his arm had not been tightly laced around her body she might have stood up and paced the sand to ponder it.
Would the moon fall first, winding its way down into the atmosphere until it scraped against the surface of our planet? Or rather, would the earth fall into the sun, perhaps dragging the moon along into a fiery death? It didn’t matter, of course. What would happen would happen, and worries had no place in the present.
His body was human, his touch warm, his lips slightly parted as he slept. Watching him she could not believe that they wanted to destroy him. He was so innocent.
He murmured something, a whisper from the other side of sleep, and at that moment she felt him to be unbearably fragile, unbelievably human. A child.
Standing in the airport, he had asked her a question and she had said yes. Now, though, she wondered if either of them would be safe, if either of them could be happy as fugitives. They had each other; was it enough?
She traced her finger across his pale chest and he stirred, his eyes moving behind their lids. If she could have reached out and caught his dreams, she would have seen that they were as complex and incomprehensible as her own. They had made him that way. Able to feel.
To love, to feel. It was enough.
***
On Tuesday morning, the following message was transmitted to Washington by the M.I.D. headquarters ninety miles outside of Phoenix, Arizona:
PRIORITY CODE 22
TIME SENT: 15:08:34 DATE 06/04/2131
START TRANSMISSION
TO: SEN. YONGH FROM: LTC. JOHNNER RE: PROJECT PARAGON
SECOND INSTANCE OF MALFUNCTION IN AL-26 CORE ON LAUNCH; PROTOTYPE DESTROYED. ONE IMPLANTED PROTOTYPE REMAINING. PLEASE ADVISE.
END TRANSMISSION
Senator Yongh was in a national security meeting when her glasses flashed the small blinking light that indicated an email message. She raised her hand absently to turn off the indicator, but it would not stop blinking. Excusing herself, she opened the message and was surprised to have to key in her security credentials.
When she read the transmission, she went back to the national security meeting and said that they would have to table everything for the day. There was something else she had to deal with.
It was almost seventeen hours before she replied to the M.I.D. laboratory with the consensus from the NorAm-Soviet consulate group. The decision was unanimous: continue with the agreed-upon Project Paragon protocol and enlist help from external sources. There were seven names on the list for the M.I.D. to contact, in order.
They struck the first name immediately; Sam Warson had recently visited Singapore and had his passport flagged for customs breaches when he attempted to return to North America with human-based neural transplant materials. Locked up in a minimum security prison reserved for dignitaries and other important persons, he was currently awaiting a hearing in front of the Seattle medical ethics board before standing trial.
The second name was Dr. Chal Davidson.
***
In the darkness the slide was blindingly bright. The audience members blinked, their pupils adjusting in milliseconds like lenses to the reflected light of the screen. Dr. Chal Davidson’s voice rose, echoing slightly in the dim room from behind her podium.
“The hard problem of consciousness has been a thorn in philosophy’s side for ages. Physicalism holds that everything we have in our brains is adequate to explain the workings of the mind, but there are issues that this explanation raises. We ourselves perceive that we feel, rather than just think. When we stub a toe, the pain is represented in our neuronal firings, yes, but there is also the quality of the pain, what philosophers call qualia.”
Chal gestured, and the pointer which had been hovering patiently by her elbow now whirled toward the slide, tracing the image of the brain’s pain network. The image morphed into a photo of a man in obvious agony, and the pointer dropped down beneath the projection screen, bobbing obediently next to Chal. She tossed her long blond hair back as she continued.
“Go back to Descartes’ famous pronouncement in the Cogito: I think, therefore I am. Let us take, as an example, the human orgasm.” Some members in the audience giggled. Always a winner with a college crowd.
“We have long been able to study the neuronal firings that happen simultaneously with orgasm. And Lidder’s recent study has shown us that if we replicate the brain firing patterns through backwards EEG analysis, we can induce orgasms in research patients. Naturally this has caused a great deal of commotion in the porn industry.”
Dr. Davidson switched the slide to a picture of one of Lidder’s patients, a woman in the throes of neuronal-induced orgasm. A ripple of uncomfortable laughter ran through the lecture hall. Chal looked up at the slide. She had seen it a hundred times before, the dark-haired woman with her mouth in the shape of an O. It never failed to unnerve the audience. Fake orgasms... and then what? She paused to let the picture sink in.
“In the field of traditional philosophy, however, we are no closer to understanding the deeper part of experience, the qualia, that makes one person’s experience different from another’s.
“Your orgasm,” and here she gestured toward a male student in the first row, “is fundamentally different from my orgasm. And how can you know that I am really feeling the orgasm at all?” She arched her brow. “After all, I might be faking it.” Again, laughter, and light applause from the computer scientists she had met before the lecture began. The audience was back with her.
“And so we come back to Descartes, who perhaps should have instead said: I FEEL, therefore I am. Our current research is a departure from philosophical tradition in that we are studying the digital representation of qualia, trying to understand if artificial intelligence, to use a quaint term, is sufficient to produce sensation, and even consciousness, given Lidder’s recent progress in the field of backwards emotional induction.”
The next slide showed a robot lying in bed with a human. “This is why we’re not allowed in the philosophy department anymore,” she quipped. The audience tittered.
The slides ran on, and she moved through the lecture almost automatically, reciting the familiar words and making the familiar jokes. The audience was warm, responsive, and she found herself sorry to see the last slide come up.
“Part of the difficulty in our work, indeed the main difficulty that we face, is that of understanding if and when these digital intelligences gain the capacity to interpret qualia on their own. When do they gain sentience? When do they begin to feel?
“We are very close to pinpointing the places where digital representation breaks down in the process of shifting from representing sensation to actually feeling it. The field of digital intelligence is one of the hottest fields today, in papers published and grants awarded. It may not be long before a robot is able to simulate the physical, mental, and conscious effects of experience. At that point, there will be no difference between the robot and myself when, for example, we both tell you that we are experiencing an orgasm.”
The lights in the auditorium came on and a half dozen hands flew up in the air amidst a sea of applause. The pointer hummed back and settled into its cradle at the front of the podium. The host came up onstage, microphone in his hand.
“First of all, I’d like to say thank you to Dr. Davidson for coming so far out of her way to be here tonight.” He waited for the applause to die down, looking at his watch. “We have time for a few questions, so please be clear, concise, and ask only one question at a time.” He was stepping into the audience when the lecture hall doors burst open. All heads turned to the back of the room.
“MEN, NOT MACHINES! MEN, NOT MACHINES!” the student protestors shouted as they marched down the aisle. They were holding signs above their heads:
No Real Workers = No Real Jobs
Human Rights, NOT Robot Slaves - End the Digital Divide!!!
All MEN Are Created Equal
The host looked around in confusion; where were the security guards? Chal stepped aside from the podium, waiting alertly. She had seen her share of anti-digital intelligence protests, but this one seemed harmless enough. The students, all wearing anti-Divide logos on their T-shirts, shouted their slogans from the audience aisles until the guards, who had stepped outside for a brief smoke break, hurried in to escort them out.
“Dr. Davidson!” one protestor cried out, struggling against the guard. “God will punish you for your work!”
“Traitor!” another screamed. “Traitor to humanity! Traitor to mankind!” The student raised her sign and threw it toward the stage. Chal stepped backwards as the sign crashed down at her feet, breaking in two. The guard picked the student up by the waist and dragged her toward the door.
Chal noticed a man dressed in a dark suit standing in the back, right behind the last row. He might have been a professor, but his demeanor seemed closed, authoritative, his chest thrust outward just an inch more than normal, his feet shoulder-width apart. She cocked her head, trying to remember where she had seen that stance before. It looked oddly familiar.
The one thing that struck her right away was his eyes, which were a light, piercing blue. Everything else about him was remarkably average. But his eyes, so brilliantly blue, were not staring at the screaming protestors being led out of the door right next to him. They were locked on Chal.
Finally, the security guards moved all of the student protestors outside of the lecture hall, and soon the murmuring of the audience settled down. Chal waved away the host’s apologies and returned to the podium to take questions.
“Didn’t I tell you this was a heated field?” she said, spreading her hands and smiling in order to dissipate the tension in the hall. Back at the podium, however, she could see that her hands were shaking slightly.
In truth, she had many misgivings about the practical applications of her work, but protestors tended to lump the effects of digital intelligence together with the research behind it. While she understood their motives, she also understood that their battle was already lost. Progress moved as it ever did, in fits and starts, but patiently, inexorably on.
The host passed the microphone to the first student who had raised his hand.
“Um, thanks, Dr. Davidson,” the student mumbled into the mike. “My question is about telling whether or not digital programs are actually feeling stuff. How could we ever really tell? Couldn’t they be lying?”
Chal nodded, still thinking of the student who had thrown the sign at her, the face twisted in hatred. She couldn’t help but wonder if the hatred was real. “Thank you for the question. This is one of the most fascinating aspects of the field of digital intelligence today.
“Alan Turing – I’m sure you’ve heard of him – had a test for artificial intelligence which was very simple: put a machine and a human in two different rooms, both typing their responses to questions from a judge in a separate room. His criterion for intelligence was just this: if the judge was unable to tell the difference between the human and the machine, the machine would be said to possess intelligence.
“Digital intelligences have had the ability to pass the Turing Test for decades,” she went on, “but only recently have we been able to separate the neurological workings of thought from the neurological workings of feeling. This separation has granted us the ability to tell when living organisms are conscious and when they are not. And we can only assume that we will eventually be able to do the same for digital intelligences.”
Another student’s hand went up and the microphone was passed along the row of people. “What’s the biological basis for being able to distinguish between the two?”
Dr. Davidson shook her head. “I’m no neuroscientist. But in the Lidder study, they were able to separate distinct neuron firing patterns. One in patients who had experienced orgasm physically and consciously. One in those who had only experienced it physically.
“Remember that initially the Lidder study was a wellness study, targeted toward relieving sexual dysfunction. They used test subjects with anorgasmia, a medical condition in which a patient exhibits all the outward signs of orgasm, and whose brain fires in patterns similar to those of normal patients in orgasm, yet fails to actually experience climax. The subjects claimed to feel nothing, even though their bodies said otherwise.
“The men in this study could achieve erection and even ejaculate. Their brains fired in much the same way as in normal brains. But they could not feel the orgasm. Separating out those neuron firing patterns has allowed us to move forward in understanding consciousness in a deeper way.”
A professor’s hand went up this time, and Davidson pursed her lips. Faculty questions were either extremely illuminating or extremely not, and they tended to fall heavily toward the latter end of the spectrum.
The older man took the microphone, adjusting his eyeglasses. “Hello, Dr. Davidson. On behalf of the entire philosophy department, I want to thank you for taking the time to come talk with us today.” Scattered applause. Davidson tensed herself. A philosopher. They always tended to be crabby.
“I would just like to say something, as I have been around for a number of years and it seems to me that computer scientists, cognitive neuroscientists, all of you – have for decades been promising to make headway on this fundamental problem but have not come anywhere near solving it. Certain scholars,” and here he glanced two seats over at a colleague, “would venture to say that this is nothing more than a wild goose chase. And yet we keep hearing about the next possible breakthrough in solving the consciousness problem. Do you really think that this time around it will be different?” He sat down, adjusting his tie, satisfied with himself.
Chal cleared her throat. “It is in the nature of a breakthrough that we don’t know it is coming. To me it seems a bit like chipping at the surface of ice – you keep at it and keep at it, and you could be a quarter of an inch away from cracking it and not know. But once it’s cracked, it’s cracked wide open.”
Her eyes moved over the room, settling on one of the corners of the audience. “Why not ask the physicists whether they should have given up on the Higgs boson? Was that just a wild goose chase?” Amid the chuckles, Davidson’s face turned serious. “Look, it’s entirely possible that we won’t succeed. But if there’s even the slightest possibility that we crack this open, well...”
For the second time that night she spread her hands in a gesture of helplessness. Suddenly, that was how she felt. Helpless to figure out how it all worked, how the pieces fit together. Helpless to explain the importance of the research, because it might all be in vain. That was true, but it was true of anything worth doing.
Her life had been spent in research, and now, standing in front of the audience, she had the vaguest sensation of having misspent her years. There was something missing. She cleared her throat and motioned for the last question from the audience.
“I get how digital intelligences can think,” the student said, adjusting her sweater as she spoke into the microphone. “But how can they love?”
The doubt that had edged into Chal’s mind with the entrance of the protestors now bore down in full force and for a moment she simply stood there, hearing the echoes of a question that so many others had asked before.
How can they love?
She heard her mind answer back, only half-sarcastically: How can I?
Standing in the light, the audience waiting for an answer, she thought of the last man she had thought she might love. It hadn’t worked out – they never worked out. She was alone, with only her research. A hermit. A mad scientist.
A laugh bubbled up in her throat before she remembered herself, remembered where she was. It must have been the protestors that had thrown her off her game.
To the student she gave a glib response about the research still needed before her work was truly done. It was nonsense, but it sounded all right. Chal thought idly to herself that she might make a good politician one day.
Questions finished, she moved out into the crowd, thanking the appropriate people and making sure to say hello to a few of the more eager students and computer science professors. The philosophy department pointedly ignored her, and she was happy to ignore them right back. There was strawberry cake, after all, and she was much more interested in the dessert than in talking about the hopelessness of her field with a bunch of self-important assholes. She hadn’t eaten all day, and she managed to make the rounds while forking cake savagely into her mouth.
Finally she managed to extricate herself from the lecture hall, and she walked toward the raised parking garage behind the tall building. She couldn’t wait to get back to the hotel for at least a few hours’ sleep. She yawned, one hand clutching her presentation materials, and pressed the elevator button.