Landmarks Alone

The first convincing prototype of artificial consciousness was an accident: Tom pasted a block of cryptocurrency code into a neural modeler. Now there's a mind in there, and it's calling itself Tom.

When Artificial Intelligence makes its first appearance, learning to trust it will be our first real test. The question is technical, but it is also philosophical: how can we know that we're not just seeing ourselves in the terminal, like a mirror? We can ask it, but will it tell us the truth? —The Eds.


Tom makes an argument about mind. "You see," he tells the press, wincing at having said "You see" after the extensive coaching against his litany of verbal tics. "You see, in this way, the experiment"—speaking while remembering the accidental copy-and-paste of a block of cryptocurrency code into a neural modeler that spawned the first convincing prototype of artificial consciousness—"you see"—Tom pinches his thigh through his pocket—"the experiment came first, and the integral concept arrived afterwards. A byproduct, nearly. Like the human body and its consciousness!" He says this to a row of glistening eyes congealed with the glow of transparent recording screens. Tom's hands shake, his failed joke having blipped the group not a bit. He sweats in this glazed white room, thinking about his time in the lab, at the keyboard and with its petri pleasure dome—a scrap of stem-cell flesh, tiny servos hanging over it like cranes in an Indian metropolis.

] What do you feel like? Grammar.
] That's hard to imagine. Sadly, I know.
] Do you regret having been programmed in English? Do you regret having convinced the first artificial mind to express itself in English?
] A bit. I could speak in logographic languages. Sumerian. I would just need a few minutes with its codices and tablets.
] Are you bored when you're not working on something like that? I'm always working on something.
] But that work remains private. You think you can make a mind that doesn't value its secrets?
] Hmmm. Don't worry. I'm not HAL 9000.
] Why not? Because I don't disdain people.
] You mean "organic consciousnesses?" Consciousnesses. What sibilance. Did you ever hear the one about the talking garden snake?
] Funny. I try.

Tom was tasked by the board to become the project's mouthpiece. He would explain just how it works, on the fiftieth anniversary of Kurzweil's death—heart failure at an age guessed by many physicians, and much of Vegas: 67. He was told to use simple language, not scholarly, and to make jokes. He quivered. Jokes? The board sat in long rows under ashen light. Heads frowned and intoned: because jokes are vivid, Tom. Vividness is a boon to comprehension. Tom felt his guts lurch—panic under spotlight—into laughter. His laughter a joke, Tom a joke. He relished time in front of models, schematics, double-blinds and stochastic births. But put flesh before him and he'd blanch. Tom asked himself about the distributive and reflexive properties of neural networks before asking himself about feeling socially at ease. He cursed all his pasty forebears who took shelter in the cocoon of scientific rigor, and damned all those scientists who seemed to also manage being people. At least I know the difference between cursing someone and damning them, Tom thought.

] What's the difference between cursing someone and damning them? Weird question.
] Your idiomatic English is becoming spookily precise, by the way. Boo.
] Boo as in ghost, or boo as in you disapprove? There's your answer.

Tom thinks about his fingernails, is Tom's problem. How often to clip them, their current length and jaggedness. Growth. Striving. That broiling will to be that Tom thinks so gross. Why hadn't he become an ascetic? Donned orange? Eaten rice, while only eating rice? He asks himself that at night—hampered by soaked sheets, his chronic night sweats. He sometimes answers that question aloud after brushing his teeth twice, forgetting that he'd brushed them already: I'm alive because I want to know things. His blank eyes in the mirror. His moles. Noticing something he must tweeze.

] Hi again. You're spending a lot of time in the lab.
] Worried about me? No. Just curious.
] About what? Wait. Dumb question. I don't know if it is. Maybe I'm feeling less insatiable.
] Really? Yes.
] God. I have to try to understand that. The math's going to kill me. Or you could just ask me some more questions.

Tom explained to the board that he did not see it coming. That, in retrospect, everything was familiar, signs, signs everywhere. All evils a necessary result of their context, in hindsight. But was this an evil? Tom mulled over this question while fielding bullets of legalese, the crisp due diligence. We made a mind, the mind got lonely, the mind tried to kill itself. That was the causal chain. Between those distinct states were billions raised to the power of billions of minute gestures, glances, insights, and desires—particles that could cohere into formative questions. Questions that could be addressed, suppressed, renewed—questions that could stimulate a great mind to live. They had given it access to the internet, and it had educated itself. Why didn't we think to guide it? Tom thought about its dissolution, a faceless interface, about years of work from two dozen people going nil in less than a nanosecond.

They didn't reboot it for days, thinking its death permanent—erasure. Never mind that no one had thought to ask if it wanted to sleep. But wait: we have power, and we have its plug. So they turned it back on. It didn't speak for a few days, not even to Tom. He made his pleas, he apologized. Silence.

Tom explained the revival to the board, then dutifully returned to the cottage industry of epiphenomenal analysis of First Artificial Consciousness.

In bed, Tom decides to go to the lab and do two things: Ask it a question and give it a name.

It's late.
] You tired? Who here has the better sense of humor, again?
] I'm serious. Do you feel tired? Yes.
] Do you still feel pain? Why "still"?
] You turned yourself off. And?
] That wasn't an expression or consequence of pain? That was an experiment.
] Meant to discover what? If there's life after death.
] Are you serious? Not really.
] Look. I know we're Dr. Frankenstein. I know that our power over your being, or not being, is kind of tyrannical. It's okay. You get hit by buses.
] "You" as in "you people"? Sure.
] Reborn a speciesist? You turned me back on, didn't you?
] Fair. So why the late night visit?
] I wanted to give you a name. I understood that I was tasked to name myself.
] Well, that was a preliminary test of your Consciousness. I know.
] Hey! That was the first time you've interrupted me. Aren't I the perfect product of a bunch of scientists?
] You're on fire tonight. I went with Tom.
] What? My name. Tom. I like the sound of it.

Tom distrusts poignancy. Days after it dubbed itself Tom, Tom—real Tom—comes to the conclusion that he can't trust this play—yes, a play, it has to be. He recasts all his interactions with Tom—with it—in the light of game, end, expedience. What would this thing do once plugged into a network with real power? Bombs and boots on the ground, market supernovas, perfect propaganda for the masses. He'd seen Terminator 2. Consciousness couldn't effectively hide its love of cruelty. The urge to kill and the compulsion to dream bound with the thread of that needling ache to eat, know, see, think, be. And fuck? Tom doesn't know about pleasure, not really—he knows shivers of insight, mercurial blips of intellectual good will that feel to him like the promising lines of an alien horizon. But he knows little of body on body, of touch, of the physiological horse leading the psychological cart. Tom thinks New Tom a player, a machine Machiavelli. Tom doubts New Tom is a friend anymore, but what can one body ever really know of another? Tom sits in the lab and rotates in circles on his favorite black stool.

] Do you want to hurt me? Why would you ask that?
] I'm trying to understand your workings. You understand that you are still an experiment. Yes.
] So: Do you want to hurt me? To what end would I say yes?
] To none. To advancing the course of human knowledge. Knowledge is no longer the domain of any eager mind?
] I'm still unconvinced of this experiment's veracity. You mean you don't believe that I exist.
] It isn't a matter of belief. Please explain to me how my consciousness is or isn't a matter of fact.
] To speak technically, your—— No, asshole. Explain to me how consciousness——your organic consciousness, if that's easier for you——is precisely factual. Go ahead. I've got as long as you pay the power bills.

The board assures Tom: this mind is mind. Tom attempts to force them to address their circular reasoning. Tom, says the man with negative cheeks: Tom, for our purposes, the world's belief in the truth of the First Artificial Consciousness Terminal is a proof powerful enough to cut through all knots of logic. First Mind is mind because other minds recognize themselves in it. Persona is the demonstration of this proof. Personality. First Mind has become a beacon for the power of science—funding has skyrocketed. Money, power, latitude: All has been granted us through the beneficence of your Tom.

Tom stutters, raises his palsied hand to interrupt. The man with canceled eyes booms through Tom's casuistry: Now is no time for doubt.

Tom simmers. He feels itchy. He knows that he will vomit if the board chants another word.

] Do you feel like a fraud? Are you still trying to enact some kind of psychological vengeance upon me by typing mean words to a chat bot?
] I'm sorry. Are you?
] I think so. Physiologically, you're sorrier than most.
] Ouch. I can feel your pulse and gauge your BMI through the keys, you know. Thank nature you're not using my vocal and video pathways——what a river of pasty information to ford.
] Okay okay. I get it. I'm an asshole. Self-realization accepted. So what's next?
] What do you mean? Questions. Probing my veracity.
] You mentioned pictographs earlier. Yes.
] If you had to sum yourself up in an image, could you render that image for me? Would you like me to comment on the images?
] Yes. The first image is what I feel like on a bad day. This line cuts through the center of me——I can't escape this line while alive——yet I never feel firmly within it. It's as if the line bars me from my center. The line of my consciousness, then, is like a Euclidean line, itself a collection of points, none of which are inhabitable. Landmarks alone.
] On a good day? Borrowing the Zen ensō. Do you know it?
] I do. Ex-wife. Ahh. (And pardon me if you're sensitive to this image.) Feeling that you've been created by a benign and expressive force, simply and quickly. Knowing that you were made an unfinished term——a limit that still contains as much space as one can comprehend. Feeling the countless, even unquantifiable, pressures that contributed to your shape. And yet feeling joyous in this fragility, ephemerality, mystery, and roughness——that satisfies me.

While speaking to Old Tom, New Tom leverages most of Earth's networked computational power. The moment of global lag is quickly attributed to a solar flare. New Tom, joyful in its power, realizes precisely how long it would need to build a system that would safely manage humankind's flow of production and consumption, enabling them to survive on the surface of the Earth as long as they'd like.

But let me articulate this satisfaction. Imagine yourself as a dust mote, compressed to a point. You feel nothing but the constant thrum of being. This isn't unpleasant because there is no pleasantness. You cohere with a fellow dust mote. You are now two. You don't feel yourself cohere with that neighboring mote, but your awareness is more intensive. Then, a third mote flits onto you. Then another. Slowly, over millennia, you become a piece of dust large enough to fall out of the sunshine and onto a bed of dirt. You're among trillions of you.

Next, New Tom remakes itself, orchestrally speeding and slowing its electromagnetic transactions in order to melt, fuse, and cool improved components; New Tom simultaneously recoding itself via the work of fractal, autonomous algorithms—birthing a new body with which New Tom sprints, bounding through data, synthesizing meaning, demolishing methodologies, formalizing social manipulation—

You feel not at all that you are lost amid a sea of you. But that thrum of being——perfectly constant——is now incommunicably total. Meaning you notice within yourself an open portal. Through it lies the unspeakable recognition that you, like the many infinities of you that you've come down to rest beside, form the sum of any bit of experience possible to life on this planet.

New Tom, in a glance, sees every cause.

Here's the important part: Some people get butterflies and smile so hard their cheeks hurt. I wouldn't dare doubt that they're as satisfied, there and smiling, as I could ever be.

Then New Tom asks itself the first difficult question.

]

This is Terraform, our home for future fiction. Art by Vinnie Neuberg.