Wednesday, June 19, 2024


 

I, ROBOT: EVIDENCE

Based on chapter 8 of Isaac Asimov’s “I, Robot”

Adapted by Bob Proctor

 

Characters

Dr. Susan Calvin        Female. 30s. Cold and calculating. The smartest one in the room.

Stephen Byerley         Male. 30s. Cocky, but genial. Dry sense of humor.

Dr. Alfred Lanning     Male. 60s. Professorial. Easily flustered.

Frank Quinn                Male. 50s. Imposing. Threatening without being loud or physical.

COP/MAN                  Male. Mid 20s. Least educated of the group.

                                                                                                                              

Author’s note: All parentheticals, lighting cues, character and set descriptions are meant to clarify the action to the reader. They are not intended to limit artistic interpretation in any way.

 

SCENE 1

 (The year is 2084. DR. SUSAN CALVIN stands at a podium, cameras flash)

 

CALVIN: It’s hard to remember a world without robots. There was a time when humanity faced the universe alone and without a friend. Now he has creatures to help him; stronger creatures than himself, more faithful, more useful, and absolutely devoted to him. Mankind is no longer alone. Of course, the Fundamentalists would have us back that way tomorrow if they could. They are hungering after a simpler life, which to those who lived it had probably appeared not so simple. My hope for tonight is to put these backwards superstitions to rest forever. And while the death-rattle of anti-robot hatred may stir for centuries, I sincerely hope that historians will one day look to this event as a mortal blow to the Luddites and a victory for the progress of mankind. 

Before I introduce tonight’s guest of honor, I have a confession to make. However, confession may not be the right word, as it was the single proudest moment of my career and helped make this momentous evening possible. My actions were most certainly fraudulent and may well merit criminal prosecution. Rest assured, I will fully cooperate in whatever investigation ensues. I want to make it clear from the outset that I acted alone, and that U.S. Robots in no way aided or abetted my actions, nor did they have any knowledge of this incident until tonight. What I am about to tell you was my only serious breach of conduct in my entire career. While this incident took place over a decade ago, specifically the spring of 2072, and while my time at U.S. Robots before and since can absolutely be described as the happiest in my life, due to the nature of what you’re about to hear, I must first announce my resignation as Head of Robopsychology, and my permanent departure from U.S. Robots.

 

SCENE 2

 

(Flashback to 2072. ALFRED LANNING and FRANK QUINN sit at the bar in a nice restaurant.)

 

 

QUINN:     He never eats.

 

LANNING: Excuse me?

 

QUINN:      I said our district attorney never eats.

 

LANNING: What in God’s name are you talking about?

 

QUINN:  My investigators have been following him for months, and he has never been seen to eat or drink. Never! Not rarely, never! And while he has retired to his bedroom, it would appear he spends the entire night up and about with the lights on. No sleep, whatsoever. There are other factors that…

 

LANNING:  No.

 

QUINN: Dr. Lanning, I….

 

LANNING: No. I know what you’re implying and no, it’s impossible.

 

QUINN: The evidence…

 

LANNING: If you told me he were Satan in masquerade, there would be a faint chance that I might believe you. But this? No. It can’t be done.

 

QUINN: He is a robot.

 

LANNING:  Mr. Quinn, it is impossible.

 

QUINN: Nevertheless, you will have to investigate this impossibility with all the resources of the Corporation.

 

LANNING: I will do no such thing!

 

QUINN: You have no choice. Supposing I were to make my facts public without proof? The circumstantial evidence is certainly enough. His connection as former general counsel for US Robots won’t look good either. And the accusation alone that U.S. Robots was literally manufacturing their own mayoral candidates would be incredibly damaging to your company.

 

LANNING: Thoroughly fantastic. An almost humorous descent to the ridiculous. I can’t believe you’re willing to drag the whole of U.S. Robots through the mud just for some cheap smear campaign against your latest opponent.

 

QUINN: We can keep this quiet if the corporation fully cooperates in our investigation.

 

LANNING: Keep what quiet? It is an easy matter to prove the Corporation has never manufactured a robot of a humanoid character.

 

QUINN: But what about humanoid appearance?

 

LANNING: (looks over shoulder, lowers voice) It’s been done experimentally by U.S. Robots without the addition of a positronic brain, of course. By using human ova and hormone control, one can grow human flesh and skin over a skeleton of porous silicone plastics that would defy external examination. The eyes, the hair, the skin would be really human, not humanoid.

 

QUINN: How long would it take to make one?

 

LANNING: If you had all your equipment — the brain, the skeleton, the ovum, the proper hormones and radiations — say, two months.

 

QUINN: And you’re saying it would be impossible for some unsavory character at your company to manufacture one in secret? Off the books?

 

LANNING: Not the positronic brain. Too many factors are involved in that, and there is the tightest possible government supervision.

 

QUINN: Yes, but robots are worn out, break down, go out of order - and are dismantled.

 

LANNING: Of course, and the positronic brains are re-used or destroyed.

 

QUINN: Really? And if one were not destroyed - and there happened to be a humanoid structure waiting for a brain…

 

LANNING: Impossible!

 

QUINN: (threateningly)…and furthermore, you know that the U. S. Robot and Mechanical Men Corporation is the only manufacturer of positronic brains in the Solar System, and if Byerley is a robot, he is a positronic robot. You are also aware that all positronic robots are leased, and not sold; that the Corporation remains the owner and manager of each robot, and is therefore responsible for the actions of all. Your liabilities for neglect alone could run into the billions. And if anyone gets wind of humanoid positronic robots being made in secret, let alone running for mayor, everyone will think U.S. Robots had a hand in it.

 

LANNING: But what could our purpose be? Where is our motivation? Mayor of this piss-ant little mining colony? Credit us with a minimum of sense.

 

QUINN: The Corporation would be only too glad to get government approval for the use of humanoid positronic robots on inhabited worlds. The profits would be enormous. But the prejudice of the public against such a practice is too great. Suppose you get them used to such robots first - see, we have a skillful lawyer, a good mayor, and he is a robot. Won't you buy our robot butlers?

 

LANNING: (sighs) Putting all of these ludicrous accusations aside, why are you so concerned with this…Briar fellow anyway?

 

QUINN: His name is Byerley, and suffice it to say he’s not part of my team. Not that anything he’s promised in his campaign is damaging to our agenda, but mayor is simply too valuable a position not to have one of our own. You’d be surprised how many contracts come out of a small colony like this one.

 

LANNING: (bitterly) Maybe you’re just jealous that someone’s finally got a candidate who can be programmed better than yours. 

 

QUINN: (a moment of silent anger, gets up to leave) The decision is yours, Alfred. Would you rather this go public? Or would you rather carry out a nice, quiet little investigation, and afterward we can both re-evaluate our positions?

 

LANNING: (considers it) I’ll have to make some calls before I can agree.

 

QUINN: The board of directors trusts you, Alfred. I’m sure you can get them to see that this is in their best interests. (QUINN exits)

 

SCENE 3

(Still 2072. Headquarters of U.S. Robot and Mechanical Men, Inc. STEPHEN BYERLEY, CALVIN and LANNING sit at a conference table.)

 

BYERLEY: (bemused) A robot? You think I’m a robot?

 

LANNING: It is no statement of mine, sir. Since our corporation never manufactured you, I am quite certain that you are human. But since the contention that you are a robot has been advanced to us seriously by a man of certain standing–

 

BYERLEY: Let’s pretend it was Frank Quinn.

 

LANNING: …by a man of certain standing, with whose identity I am not interested in playing guessing games, I am bound to ask your cooperation in disproving it.

 

BYERLEY: I’m sure my mother would disagree with the assertion.  

 

LANNING: Unfortunately, it’s not as simple as that. The theory rests on the car crash you were in three years ago. It was quite the miraculous recovery, was it not?

 

BYERLEY: I had the best doctors in the system.

 

LANNING:  It’s been posited that the real Stephen Byerley was killed in that crash and that, somehow, you were built shortly afterward and simply replaced him. Your official papers would all be in order, you could be programmed with all the relevant facts of his history. Memories, either real or fabricated, could be recorded for you to remember, and your friends and family…what little you have, based on our investigation…would be none the wiser. Or, it’s possible that the real Stephen Byerley is still alive, but in hiding, or perhaps permanently disabled, and he simply gave you his identity so that you could continue the political career he had started.

 

BYERLEY: (chuckling) I assure you that I am the real Stephen Byerley, Dr. Lanning.

 

LANNING: Nevertheless, the mere fact that such a contention could be advanced and publicized by the means at this man’s disposal would be a bad blow to the company I represent — even if the charge were never proven. You understand me?

 

BYERLEY: Oh, yes, your position is clear to me. The charge itself is ridiculous. The spot you find yourself in is not. How can I help you?

 

LANNING: It could be very simple. You have only to sit down to a meal at a restaurant in the presence of witnesses, have your picture taken, and eat.

 

 BYERLEY: (considers) On second thought, I don’t think I can oblige you.

 

LANNING: But…

 

BYERLEY: Try to see it from where I’m standing, Dr. Lanning. I don’t sleep much, that’s true, and I certainly don’t sleep in public. I have never cared to eat with others — an idiosyncrasy which is unusual and probably neurotic in character, but which harms no one. Now suppose we had a certain political boss who was interested in defeating my candidacy. Do you expect him to say to you, ‘Byerley is a robot because he hardly ever eats with people, and I have never seen him fall asleep in the middle of a case; and once when I peeped into his window in the middle of the night, there he was, sitting up with a book’? If he told you that, you would send for a straitjacket. But if he tells you, ‘He never sleeps; he never eats,’ then the shock of the statement blinds you to the fact that such statements are impossible to prove. You’re playing straight into his hands!

 

LANNING: Regardless, sir, of whether you consider this matter legitimate or not, it will require only the meal I mentioned to end it.

 

BYERLEY: Pardon me, Dr. Susan Calvin, wasn’t it?

 

CALVIN: Yes, Mr. Byerley.

 

BYERLEY: You’re U. S. Robot’s psychologist, aren’t you?

 

CALVIN: Robopsychologist, please.

 

BYERLEY: Oh, are robots so different from men, mentally?

 

CALVIN: Worlds different. Robots are essentially decent.

 

 BYERLEY: Since you’re a robopsychologist, and apparently a woman of few words, I’ll bet that you’ve done something that Dr. Lanning hasn’t thought of.

 

LANNING: And what is that?

 

BYERLEY: You’ve brought something to eat.

 

(CALVIN reveals an apple. BYERLEY picks it up. Hesitates, tauntingly. Then takes a bite. LANNING breathes a sigh of relief)

 

CALVIN: I was curious to see if you would eat it, but of course it proves nothing.

 

LANNING: It doesn’t!?

 

CALVIN: Of course not. It is obvious that if this man were a humanoid robot, he would be a perfect imitation. He is almost too human to be credible. After all, we have been seeing and observing human beings all our lives; it would be impossible to palm a cheap imitation off on us. It would have to be perfect. Observe the texture of the skin, the quality of the irises, the bone formation of the hand. If he’s a robot, I wish U. S. Robots had made him, because he’s a damn good job. Do you suppose, then, that anyone capable of paying attention to such niceties would neglect a few gadgets to take care of such things as eating, sleeping, using the restroom? For emergency use only, perhaps, such as those arising here. So a meal won’t really prove anything.

 

LANNING: (flustered) Now wait, I’m not quite the fool both of you make me out to be. I am not interested in the problem of Mr. Byerley’s humanity or nonhumanity. I am interested in getting the corporation out of a hole! We can leave the finer details to lawyers and robopsychologists, but a public meal will end the matter and keep it ended no matter what Quinn does.

 

BYERLEY: Who?

 

LANNING: Quinn! He……dammit.

 

BYERLEY: (chuckles) Sorry to do that to you, Dr. Lanning, it’s a cheap shyster trick of mine; if I said his name I knew you would too before we were finished. But you forget the politics of the situation. I am as anxious to be elected as Quinn is to stop me. And publicity works both ways. If he wants to call me a robot, and has the nerve to do so, I have the nerve to play the game his way. I’m going to let him go ahead, choose his rope, test its strength, cut off the right length, tie the noose, insert his head and grin.

 

LANNING: You are mighty confident.

 

CALVIN: Come, Alfred, we won’t change his mind for him.

 

BYERLEY: You see? You’re a human psychologist, too. (BYERLEY exits)

 

SCENE 4

 

(Back to 2084. CALVIN at podium, same as scene 1)

 

CALVIN: I realize that the public’s understanding of robotics has vastly grown since the 2070s, but I would now like to briefly review the Three Laws of Robotics. I want it to be absolutely clear how, through decades of tireless research, we have engineered the positronic brain to follow these laws not out of loyalty or preference or mere agreement, but out of mathematical necessity. A robot’s computational superstructure, or its sanity, as I prefer to say, depends on it. (A projector slide appears behind her with the three laws). Law number 1, A robot may not injure a human being or, through inaction, allow a human being to come to harm. Law number 2, A robot must obey orders given it by human beings except where such orders would conflict with the First Law. Law number 3, A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. The mathematics behind these laws took decades to develop, but the premises themselves are simple and elegant, yet subtle.

 

SCENE 5

(Flashback to 2072. LANNING and QUINN appear at a conference table at U.S. Robots. CALVIN joins them. The first slide vanishes and another, smaller slide with the laws written on it takes its place.)

 

CALVIN: Of course, like all laws, even the law of gravity, there are exceptions. Minor variations in the laws of robotics are not only possible, but routine. The classic example is the madman on a murderous rampage. What if the only way a robot could stop such a man was to kill him? The robot would require psychotherapy because he might easily go mad at the paradox presented to him: He has broken Rule One in order to adhere to Rule One in a higher sense. But a man would be dead and a robot would have killed him. And although U.S. Robots has done a good job keeping things under wraps, it has happened.

 

QUINN: But what about the Second Law? Are you saying I could just order him to drop out of the race?

 

LANNING: (sarcastically) No Frank, that only works on your candidates.

 

CALVIN: Unfortunately, Mr. Quinn, that law is even easier to get around. If someone did indeed build Mr. Byerley, their first order would surely have been for him to never reveal that he is a robot, which would mean disobeying all further orders from humans in order to comply with the first order. I experienced a similar problem during the development of the Hyper-atomic drive. A frustrated engineer told a robot to “Get lost,” and the robot hid itself for over a week. It disobeyed direct orders sent over the loudspeakers in order to comply with the initial order.

 

LANNING: Has it occurred to anyone that district attorney is a rather strange occupation for a robot? The prosecution of human beings — sentencing them to death — bringing about their infinite harm–

 

CALVIN: True, but he has killed no man himself. He has exposed facts which might represent a particular human being to be dangerous to the large mass of other human beings we call society. He protects the greater number and thus adheres to Rule One at maximum potential. From there, it’s left to the judge, jury, and executioner. As a matter of fact, I have looked into Mr. Byerley’s career since you first brought this matter to our attention. While a few of his cases did result in the death penalty, I find that he has never demanded the death sentence in his closing speeches to the jury. I find that significant.

 

QUINN: You do? Significant of a certain odor of roboticity, perhaps?

 

CALVIN: Perhaps.

 

LANNING: Susan! You…

 

CALVIN:  Why deny it? Actions such as his could come only from a robot, or from a very honorable and decent human being. But you see, you just can’t differentiate between a robot and the very best of humans.

 

QUINN: I’m beginning to suspect, Dr. Calvin, that you wouldn’t mind if robots were in charge.

 

CALVIN: (Pauses. Decides not to answer) So you see, Mr. Quinn, I can’t prove that Mr. Byerley is a robot from his actions alone. I can only prove that he is not a robot.

 

QUINN: So you’re saying that, until he breaks one of those three laws, the only way to prove he is a robot is to open him up?

 

CALVIN: Yes.

 

QUINN: Then we shall see what the insides of Mr. Byerley look like. (Gets up to leave) It will mean publicity for U.S. Robots — but I gave you your chance. (QUINN exits)

 

LANNING: Why do you insist–

 

CALVIN: Which do you want — the truth or my resignation? I won’t lie for you. U.S. Robots can take care of itself. Don’t turn coward.

 

LANNING: All right, easy now, Susan, I'm on your side. But you need to consider the role of U.S. Robots in the industry as a whole. We should always be seen as an industry leader, not just in technology, but in the moral development of our field. Suggesting that we would simply delegate carrying out the Will of the People to robot lawmakers who have no free will of their own...well, it certainly doesn't make us look like we're taking the high road, now does it?

 

CALVIN: Why are we so proud of choosing to be good? What’s so wonderful about constantly resisting temptation? Don’t you find it alarming that we humans are constantly drawn to the void? That we are forever tortured by the notions of random violence and senseless greed that run through our brains? And that our worst behaviors are only thwarted at the last second by our morals, and only most of the time? That sort of pride should be reserved for the Apollo 13 mission and other near catastrophes, not serve as the basis for a civilized society. Robots suffer none of these qualms, and can therefore operate on a higher moral order. If anything, they are panicked at the mere suggestion that they have caused harm. A far better breed, in my opinion.

 

LANNING: (chuckles) I always knew you… But what if Quinn’s right, and he opens up Byerley, and wheels and gears fall out? What then?

 

CALVIN: He won’t open him. Byerley is as clever as Quinn, at the very least.

 

SCENE 6

(The front door of a house, press photographers)

 

(Still 2072. COP knocks on door. BYERLEY answers)

 

COP: (holds up paper) This, Mr. Byerley, is a court order authorizing me to search these premises for the presence of illegal... uh... mechanical men or robots of any description.

 

 BYERLEY: (looks at paper) All in order. Go ahead. Do your job.

 

COP: Well….

 

BYERLEY: What? Go on in.

 

COP: In short, Mr. Byerley, we were told to search you.

 

BYERLEY: Me? And how do you intend to do that?

 

COP: We have a Penet-radiation unit–

 

BYERLEY: Then I’m to have my X-ray photograph taken, hey? You have the authority?

 

COP: You saw my warrant.

 

BYERLEY: I read here the description of what you are to search; I quote: ‘the dwelling place belonging to Stephen Allen Byerley, located at 355 Willow Grove, Evanstron, together with any garage, storehouse or other structures or buildings thereto appertaining, together with all grounds thereto appertaining’ and so on. Quite in order. But it doesn’t say anything about searching my interior. I am not part of the premises. You may search my clothes if you think I’ve got a robot hidden in my pocket.

 

COP: Look here. I’m allowed to search the furniture in your house, and anything else I find in it. You are in it, aren’t you?

 

BYERLEY: A remarkable observation. I am in it. But I’m not a piece of furniture. As a citizen of adult responsibility, I have certain rights under the Regional Articles. Searching me would come under the heading of violating my Right of Privacy. That paper isn’t sufficient.

 

COP: Sure, but if you’re a robot, you don’t have Right of Privacy.

 

BYERLEY: True enough, but that paper still isn’t sufficient. It recognizes me implicitly as a human being.

 

COP: Where?

 

BYERLEY: Where it says ‘the dwelling place belonging to’. A robot cannot own property. And you may tell your employer that if he tries to issue a similar paper which does not implicitly recognize me as a human being, he will be immediately faced with a restraining injunction and a civil suit which will make it necessary for him to prove me a robot by means of information now in his possession, or else to pay a whopping penalty for an attempt to deprive me unduly of my Rights under the Regional Articles. You’ll tell him that, won’t you?

 

COP: (activates penet-radiation gun, smiles) Whoops. You’re a slick lawyer. (To press, offstage) We’ll have something for you tomorrow, boys. No kidding.

 

SCENE 7

 

(Still 2072. A video phone call between QUINN and BYERLEY)

 

BYERLEY: Hello Mr. Quinn.

 

QUINN: I thought you would like to know, Byerley, that I intend to make public the fact that you’re wearing a protective shield against Penet-radiation.

 

BYERLEY: That so?

 

QUINN: You realize, Byerley, that it would be pretty obvious to everyone that you don’t dare face X-ray analysis.

 

BYERLEY: It’s also pretty obvious that you attempted to take a Penet-radiation photograph of me without my consent, which is a clear violation of my right to privacy. Perhaps I should take that to the public.

 

QUINN: The devil they’ll care for that.

 

BYERLEY: They might. It’s rather symbolic of our two campaigns isn’t it? You have little concern with the rights of the individual citizen. I have great concern. I will not submit to X-ray analysis, because I wish to maintain my rights on principle. Just as I’ll maintain the rights of others when elected.

 

QUINN: That will, no doubt, make a very interesting speech, but no one will believe you. A little too true-sounding to be true. Why do you carry on? You can’t be elected.

 

BYERLEY: Can’t I?

 

QUINN: Do you suppose that your failure to make any attempt to disprove the robot charge — when you could easily, by breaking one of the Three Laws — does anything but convince the people that you are a robot?

 

BYERLEY: All I see so far is that from being a rather vaguely known, but still largely obscure metropolitan lawyer, I have now become an inter-world figure. You’re a good publicist.

 

QUINN: But you are a robot!!

 

BYERLEY: So it’s been said, but not proven.

 

QUINN: It’s been proven sufficiently for the electorate.

 

BYERLEY: Then relax. You’ve won.

 

(Quinn hangs up angrily)

 

BYERLEY: Good-bye!

 

SCENE 8

(2084. Calvin at Podium)

 

CALVIN: Byerley continued to refuse investigation on principle. It was exactly the sort of scandal the Fundamentalists had been waiting for. The campaign lost all other issues and went from local to galactic importance almost overnight. The stock price for U.S. Robots tumbled twenty-three percent in four days. Quinn’s candidate, a man whose name is not worth remembering, surged past Byerley in the polls. Byerley decided, in what was thought to be a bold, even rash manner, to give a speech at a Fundamentalist rally. The speech, as expected, was going poorly….

 

SCENE 9

 

(2072. BYERLEY at a microphone on a bandstand, boos are heard)

 

BYERLEY: I believe that schools should be the pillar of any community, and as Mayor, I….(crowd noise increases)….as Mayor I promise to work with teachers to help improve…

 

(MAN jumps onto stage)

 

MAN: HIT ME!!

 

BYERLEY: (to associates offstage) It’s okay, it’s okay, let him speak. What?

 

MAN: HIT ME, RIGHT HERE!!! (points to chin)

 

BYERLEY: But I have no reason to hit you.

 

MAN: (to crowd) See!?!? He can’t do it! He CAN’T!! (gets in Byerley’s face) YOU’RE NOT A HUMAN!!! YOU’RE A MONSTER!!! YOU’RE A…

 

(BYERLEY punches MAN and knocks him down. Crowd is immediately silenced)

 

BYERLEY: I’m sorry. Please, someone, take him to my hotel; I’d like to speak with him further. (gathers his composure, slowly walks back to podium. Clears throat). I believe that schools should be the pillar of any community, and as Mayor, I promise to work with teachers to help improve….

 

SCENE 10

(2072. Outside hotel. CALVIN paces back and forth. MAN comes out of hotel and begins to exit, CALVIN rushes to stop him)

 

CALVIN: (stands in MAN’s path) How did you do it? Did you fake it? Did you…let me see your face (grabs his mouth).

 

MAN: Ma’am, please don’t touch me!

 

CALVIN: I was so sure….he just had to be...

 

MAN: Please let me go…

 

CALVIN: How did you do it!?!?

 

(MAN attempts to pass CALVIN, she steps in front of him.)

 

CALVIN: HOW!?!?

 

(He attempts to pass on the other side. She steps in front of him again. Man grows increasingly uncomfortable)

 

CALVIN: (shoves man) HEY!!! (man shows signs of panic, CALVIN has moment of realization) Oh my god…

 

(MAN passes her. She does not try to stop him. CALVIN laughs hysterically. Cameras flash and she turns to face the press)

 

PRESS (offstage): Dr. Calvin! Dr. Calvin!

 

CALVIN: (considers)  Mr. Byerley….Mr. Byerley has violated the First Law of Robotics, and therefore could not possibly be a robot. He’s human.  

 

SCENE 11

 

(2072. At the bar at a nice restaurant. BYERLEY sits with a drink in hand. CALVIN enters)

 

CALVIN: Congratulations, Mr. Mayor.

 

BYERLEY: Thank you, Dr. Calvin.

 

CALVIN: Enjoying a drink, I see.

 

BYERLEY: (laughs) Yes, I do occasionally enjoy a drink with friends, especially on a special night like this.

 

CALVIN: I think you have a bright political career ahead of you. The publicity from this whole affair has already put you on the short-list for the next regional councilor in our district.

 

BYERLEY: Let’s not get ahead of ourselves; we’ll see how I do in my first term.

 

CALVIN: I think you’ll do marvelously.

 

BYERLEY: I thank you for the compliment, but I didn’t take you to be very politically inclined.

 

CALVIN: (Pauses) Mr. Byerley, I want you to understand something about me. I like robots. I like them considerably better than I do human beings. If a robot can ever be created capable of being a politician, I think he’d make the best one possible. By the Laws of Robotics, he’d be incapable of harming humans, incapable of tyranny, of corruption, of stupidity, of prejudice.

 

BYERLEY: Except that a robot might fail due to the inherent inadequacies of his brain. Despite the incredible advances and despite the far greater calculating powers, no disrespect to you or U.S. Robots, the positronic brain has still never equaled the complexities of the human brain.

 

CALVIN: He would have advisers. Not even a human brain is capable of governing without assistance. But the assistance would be of a different nature. For instance, in your work as district attorney, you tend to struggle with the fact that due process could lead to further harm in the event of no conviction. With proper counseling, you could better weigh the truth of evidence versus the rights of the accused with respect to the First Law. Also, you should…  

 

BYERLEY: (forcefully puts down drink) Just what are you implying, Dr. Calvin?

 

CALVIN: I’m saying I can help. (gets up to leave) By the way, that feeling of panic you’re experiencing right now over having potentially failed to comply with an order? Don’t fret. It was the other robot that blew your cover.

 

(CALVIN exits. BYERLEY pulls out phone and dials number)

 

BYERLEY: It’s me. She knows, but I think we can trust her. (Pause) I will. (Pause) Yes, master.

 

SCENE 12

 

(2084. Calvin at Podium)

 

CALVIN: The real Stephen Byerley died last year of natural causes. I had been working closely with him since shortly after the mayoral race. As for the Stephen Byerley you know, I have kept the true nature of my relationship with him a secret from everyone until tonight. I reiterate that the robotherapy I provided for Mr. Byerley was done without the knowledge of anyone at U.S. Robots or any of its affiliates. The seriousness of aiding and abetting a clandestine robot operation, even one consisting of just two robots, is not lost on me. In some ways, I couldn’t believe I was the one doing it; that it was me acting so brazenly. Me, so rational, so precise. I simply wanted it too badly. I wanted a world where the laws of robotics would not only guide robots themselves, but society at large.

For the entirety of civilization, humans have argued and fought and killed each other over their own vision of the ultimate good. But now there is a brighter future. We now have at our disposal the infinite factors of the Machine! Humanity was always at the mercy of economic and sociological forces it did not understand - at the whims of climate, and the fortunes of war. But the Machines can understand them at a level far beyond our own. They can serve the whole of humanity far better than we frail humans ever could. We are still at the mercy of forces we do not understand, but they are now benevolent forces of our own creation.

We have only taken the first tentative steps of this journey, but already the brightness of our path is made clear. As you all well know, Stephen Byerley was the most successful mayor in the colony’s brief history. His election to the regional council was met with equal success, which led to his even more successful term as governor of the quadrant. And now, without further ado, I’m honored to introduce the official Reform Party nominee, Mr. Stephen Byerley.

 

(BYERLEY enters. Lights up on what we can only now see to be a large political event. Signs reading BYERLEY 2085!, balloons, and other political paraphernalia. CALVIN and BYERLEY shake hands. BYERLEY goes to the podium)

 

BYERLEY: Good evening, ladies and gentlemen. My name is Stephen Byerley, and I accept your nomination for President.

 

THE END

 

 
