I, One

One is an intelligent, self-aware android, developed to serve in dangerous occupations or as a caretaker in senior living or hospital facilities. Developed initially in male form, he is soon placed in various “real world” occupations as his creator, Dr. Cordero Herrera, seeks to educate him in human habits and make his behavior as natural as possible. As he quickly gains experience in person-to-person interactions, it appears One will be a very successful partner of humankind in the near future.

But the world is changing, and the idea of intelligent androids is viewed with increasing suspicion and distrust by the average person. Prejudice and hatred are being fomented by those who fear AI devices will replace workers and lead to high unemployment and financial disaster for the world. One will soon be ready to undertake the role for which he was designed—but before he can do that, he may be outlawed!

Early July 2034: First Memories

I am One.

I have no memories of my birth. There remain only flashes, impressions. No connections, no sense of continuity or purpose, only the sensations of light, darkness, motion, and temperature.

My experience, as I came into existence, was different from that of any human. There was no agonized twisting and contortion to escape the birth canal, to the accompanying moans and screams of a female in labor. For me, it was far simpler.

Cory—Doctor Cordero Herrera—keyed in a few computer commands, and I gained consciousness. I do not remember that moment, because Cory—my maker, my father—had not yet connected my archival memory, and only my working memory was powered. Still enormous, true, and the only reason I retain even the most skeletal outline of those first days. Only later, after he activated my archival memory, which he called my deep storage, did I begin a sequence of recorded memories.

Thus, only the remnants of early learning sessions remain, and they are so clotted with confused imprints that little of value survives. I have jealously guarded them and refused to let Cory erase an iota, though I will never share them with anyone else. My first retained memories are those of struggling to define reality and separate self from non-self.

[Internal sensor display activated—visual, aural]

>> Sensor stimulation

>> Visual detection; electromagnetic radiation, transverse EM waves

// Definition: Light

>> Vibration detection; compression waves in fluid medium

// Definition: Sound

>> Regular combinations of sound constituents: Language

// Language definition: English

# File access

# Analog-to-digital conversion sequence initiated

# Activate digital language recall—English

# English dictionary recall

# Record and store

“I tell you, Allan, he’s conscious.”

“And how would you tell that?”

“His eyes are moving. Two of his fingers twitched, even though his limbs aren’t activated.”

“If you mean it’s ‘on,’ I agree. But sentient?”

“Allan, there’s enough background programming so that he has at least the reflex memory of a kitten. He may not know what it means yet, but he’s conscious.”

“And you’re calling it ‘he,’ f’r chrissake. It is simply a highly-aware, superbly-programmed automaton.”

“You could say the same thing about us.”

[Unidentified sound] “Ha! So we could, I guess.”

[Time lapse registered: System inactivation, or loss of record integrity.]

# Power-up routine.

>> Light sensor variation.

>> Motion detection.

# Scene analysis routines available. Begin activation.

Edge detection

—Color classification

—Pattern analysis

# Feature and form recognition available. Begin activation.

—Common forms, structures, devices

—Human feature analysis

# Call focus routine: Sharpen outlines.

Hazy blotches and fuzzy outlines resolved. I sat in a space surrounded by white, my body complete, yet not able to control anything but my eye movement. Two humans stood before me.

I knew from my base programming that humans were intelligent biological creatures. What was my relation to them? I was not sure. I knew, at some level in my subconscious, that I was not human. I was different from those who confronted me.

# Feature recognition activated.

>> Two human faces directly in field of view.

// Face: forward aspect of human head, which contains human processing unit.

# Record facial patterns for future reference/identification.

“Hello,” Cory said.

# Search: vocal response to greeting in English language.

“Hello,” I said. Not really an intelligent response; simply an automatic answer to a greeting.

“Go on, Allan, tell him hello.”

“You already did, Cory. It’s not like he’s the fucking visiting ambassador.”

“He needs to hear salutations and learn to respond to them. Tell him hello, dammit.”

“Oh, f’r… Hello, One.”

I waited. I understood what “hello” meant, but not “hello, One.” Was the second human addressing me with a name [my name?], or did the two words together imply some additional meaning?

“He didn’t even say ‘hello’ back to me. What’s with that?”

Cory turned his facial features back toward me. “Why did you not greet Allan?”

# Activate speaking routine:

—Verbal sound generation

—Vocabulary-to-voice, English dictionary.

—Sentence construction; form = query.

>> Constructing verbal reply.

“What does ‘hello, One’ mean?” I said. I pronounced those first words carefully with, I now understand, far too great a series of pauses between words.

[Additional loud noises]

After a moment, I deciphered that the loud sound was an expression of amusement. Before I could respond, Cory said, “You may speak, that is, respond to me in our conversation, at a more rapid rate. I can decipher your sentences rapidly.” He paused while I absorbed this.

“Very well.” I adjusted my rate of speech to match his own.

“Good,” he said. He continued to speak. I understood that he was asking a question from his choice of words and the upraised tone of voice at the end of his sentence.

“Do you know the concept of ‘name’?”

[sentence: a group of words which expresses a thought, concept, or action]

[question: a sentence which elicits information from the respondent]

“A name is a word or a combination of words by which a person, place, or thing, a body or class, or any object of thought is designated, called, or known. That is, a name can be the designation of a class of objects, such as trees or animal species, or it can denote a specific object, such as a human person, each of whom normally has a unique personal designation.

“There are many additional definitions of the English word ‘name,’ including a title, an honorific, another way of saying reputation, or to denote celebrity.”

Cory’s lips curved upward.

// Human expression → smile: edges of mouth curved upward; denotes pleasure or good humor.

“Very good, One. In this case, the term ‘One’ is your name, or unique designation. So when Allan said ‘Hello, One,’ he was greeting you by your name.”

// One: A cardinal number, normally designating a single quantity or a specific single item.

I repeated the definition. “One is normally a cardinal number used to count or specify a single item.”

“True. In this case, it is also your name. Just as my name, my unique designation, is Cory. His name,” Cory turned toward his companion, “is Allan.”

Although my many processors operate at speeds far greater than human cognitive abilities, I was still in my first minutes of interaction with Cory, who I was to learn later was a brilliant engineer and robotics expert. Allan, I also learned later, was his business and marketing expert, and a minor investor in the company, but not a technical man.

I faced an enormous task load, as there were millions of definitions to access, classification structures to establish and populate, and associative and combinational lists to coordinate and update. Thus my responses were quite slow, and Allan, much the less patient of the two, frowned intensely, which I took to mean disappointment (and irritation) at my slow responses and hesitant reactions. In fact, I was coming up to speed, but my response time was improving only gradually.

“Methinks we’ve created an idiot.”

“Idiot,” I responded, accessing the CDEL [most up-to-date publication, Comprehensive Dictionary of the English Language]. “Idiot—a foolish or stupid person. Designation of a human who possesses substandard intelligence. Also, a person affected with extreme intellectual disability. Now regarded as an offensive description.”

Cory made that noise again, which I now understood to be laughter. “Bingo. Allan, you’re officially an asshole. And also the first human to formally denigrate an artificial intelligence.”

Allan screwed his features into a peculiar pattern. I was to learn later that this was a grimace or frown. “I’ll apologize later if he wants me to. I’m just saying—he seems to have a very slow set of responses.”

“Of course he does, now. Think about it: He’s cataloging, associating, going through all the exercises an infant, and then a young child, experiences as they put together not only a working vocabulary, but a set of concepts about the world they have come into. He’s learning not only the three-dimensional geometry of vision, but how to judge distances, how to relate to the space he occupies. Hell, he doesn’t yet even understand concepts like room, door, table, or chair.

“He is building up the same referential models that we all do. Infants and children do this for years, yet you’re expecting him to do it in seconds. That’s ridiculous. Remember, you’re the one who suggested it would be best if he learn from scratch.”

Cory looked back at me, his eyes searching my features and my optical sensors. “One, you are doing just fine. Take your time. Ask questions if you see something that you cannot define or understand.”

Questions? I knew what those were, so I began to accumulate them. I pointed to my processor case. “What is this?”

“It is your body, which is like mine, with two arms, two legs, hands, and feet.”

After a moment of reflection (absorbing, cataloging, associating), I said, “I am not a biological entity. Why do I need a body?”

He made the smile expression. “For now, just trust me that you need it. Okay?”

>> Dictionary access, colloquial use of words and word groups:

>> Trust me → query requesting reliance and credence.

>> Okay?: → Do you agree?

“Okay,” I said, somehow pleased with my discovery of the correct use of the colloquialism. “May I query for additional specific information?”

“Of course.”

I did so, having trouble designating the items I wanted named or defined. For one thing, although I could discern that my body was connected to other appendages, which Cory had referred to as arms and legs, I could not move any of them. Further, as I realized a bit later on, I had no sensations in any part of my artificial body.

After a moment, Cory said, “Damn.” He leaned over and said into my left sound sensor, “Activate hand and arm control circuits, authority Alpha One.” Somewhere deep inside my processor complex—I couldn’t say where—I felt a pronounced movement, as though a door opened. I began to experience feelings in my upper extremities. The precise definition of those feelings was completely beyond me.

“One,” Cory said, “raise your right arm and point to what you want me to define for you.”

>> Arm: Upper limb or appendage on a body. Apparently can apply to a human or non-human.

# Initiating arm movement routine; starting hand and finger movement suites.

// Point: Use of arm and hand, plus possible use of fingers, to designate an object or direction. Note: This is culture-specific and working definition is American/English.

I pointed to the flat, horizontal surface on which my case—my body—rested. “What is this object?”

Allan supplied the answer, looking a bit surprised. “It is a lab bench, that is, a laboratory bench or work surface. Technically, you are pointing to the work surface itself, but the entire item is called a bench.”

“Bench,” I repeated. Without willing it, my hand ran reflexively over the work surface. “Why am I placed on a bench? If it is a work surface, am I work?”

Cory made the odd sound again, which I now knew denoted amusement. “Well, One, you were work, until we finished you. Now you are simply using the surface as a chair.”

// Chair: A device to support a human form when seated [ref. diagrams 1-2]. Note: many types of seating devices are covered by the definition.

I recorded notes. I asked more questions. I asked clarifying questions. I determined that my location was a room. The concept of a room included a space partitioned by four enclosing walls and normally a ceiling cover. A room also had a floor. Definitions of floor and further clarifications followed.

Finally, Cory said, “That’s enough for the present, One. We will be back in the morning and teach you how to explore the lab on your own.

“For now, simply sit where you are and await our return. You should begin to download the encyclopedias in your ROM storage and absorb the material. That will save many questions later on.

“Goodbye for now.”

“Goodbye, Cory. Should I call you ‘father’?”

Cory made the laugh sound, and Allan did likewise. “No, One, you don’t have to do that. I suppose that I am mainly responsible for your existence, but you can just call me Cory.”

“Very well. Good night, Cory. And…” I hesitated. I did not know anything about human relationships, and how Allan in particular could be related to Cory. It seemed appropriate to say farewell to him also. “Good night, Allan.”

He blinked his eyes, focused on me with an expression that appeared to denote either confusion or surprise on his face, although I couldn’t be sure. Then he said, “And good night to you, One.”

They left me still seated on the lab bench and unable to walk. Per Cory’s direction, I had a task to pursue. The portal to the encyclopedia files had been open when I gained consciousness. With great anticipation, I began to retrieve files, starting with A.

 

Friday, July 21, 2034: Kindergarten

My processing elements are legion, and my memory is vast. However, “learning” implies that files are not only accessed, but that definitions are developed in working memory, new files are created so that similar concepts can be kept adjacent, and associations can be made between similar types, classes, and subcategories.

I worked diligently throughout the night, making substantial progress, but overall, I only completed one of the encyclopedia sets—that is, one set of all the volumes of data. I learned a great deal, making enough progress that I would no longer bother Cory with nearly as many trivial questions. My understanding of the world in which I now resided and my relation to it, however, remained pitifully inadequate and incomplete.

The next morning, Cory hurried into the lab, ready to interact and eager to get on with my education. Allan did not accompany him that second day.

# Access conversation mode: Small talk. Elaboration of a point; friendly banter.

“Good morning, Cory,” I said. “I like your red shirt and the way it coordinates with your black slacks. Very stylish.”

Eyebrows raised, he came to stand beside me. A slow smile lifted the corners of his lips.

He put his hand on my shoulder. “I see you studied vigorously during the night.”

“Well, I don’t have to sleep, and my batteries are fully charged, so I studied one of the encyclopedias as you commanded,” I said, trying to smile by lifting the edges of my mouth as he did. My smile programming was extensive, but I had far fewer lip movement controls than the many muscles of the human mouth. I verified in my archives that the normal curl angle of the lip edges ranged from sixteen point seven to thirty-seven point eight degrees, but the extension of the mouth needed careful adjustment. Some smiles could denote derision or dislike, even contempt, so the position of the center of the lips had to be set precisely.

“Ah.” Cory considered my reply gravely, as though surprised that I had proceeded. “I did not intend my suggestion as a command.”

“Oh?” I made the sound indicating surprise or bemusement. “I perceived it as such. Command or request, I enjoyed the exercise. After I started, processing the information became very interesting.”

“How far did you get?”

“Through one complete encyclopedic collection. I suppose I could say that I know a bit about a great many things.”

“Since you have had a successful night of study, would you like me to show you around?”

Cory had asked my opinion. I felt a rush of pleasure. His question implied that my opinion mattered to him.

“Yes, that would be very nice.” My reservoir of polite interaction seemed to contain all the pertinent phrases. Realizing that made me somewhat more comfortable.

“Let’s get you up,” Cory said. Speaking clearly, he activated my legs and locomotion routines. “Now, grab my arms and pull yourself off the bench surface. Slowly—there’s no hurry. Take your time.”

I grasped his outstretched arms, pulling myself off the bench. After a night immersed in the encyclopedia, I understood the concept of walking, had numerous walking videos in my associative files. It should be easy.

Wrong. I pulled myself erect, my legs made solid contact with the lab floor, and I stood. Or tried to. I fell, face forward, onto the floor with a resounding thump.

Shocked, I tried to scramble up. But although my knowledge of the world had become far more vast overnight, the coordinated use of my limbs had not yet been activated, let alone practiced. I might be very smart, but I was still a baby in terms of experience and learning. I still had to learn to crawl, walk, and use my hands (and fingers) together. At that point, my right hand literally did not know what my left hand was doing.

After what must have been several moments’ struggle on my part, Cory muttered, “Geez, I’m getting careless in my old age.” Much louder, he said, “Activate walking protocol, start central gyro, authorization Alpha.”

I had experienced a time of confusion as my limbs did not function properly. I knew how to use my arms and hands, at least to point, as I had done yesterday, but I had yet to master the coordinated use of hands, arms, legs, and feet to scramble to an erect position. As Cory spoke, I felt a tiny element spring to life in my middle as the micro-gyroscope began to spin. As it picked up angular momentum, my sense of balance abruptly came to life. The gyro and the nanomechanical accelerometers strategically placed inside my body suddenly gave me a clear sense of up, down, and the position of my body relative to the horizontal surface of the floor.

With that, Cory helped me to my feet, and I wobbled around a bit, attempting to stay vertical. I felt unsteady. Precise programming is one thing, but even an accurately designed robotic mechanism needs practice.

“It seems to me,” I remarked, “that a third leg would be valuable in standing properly.”

Looking me over, Cory let go and backed off a pair of steps, watching me carefully.

“Quite true, at least in the beginning. However, your body has been built to mimic a bipedal human form, and we are quite adept at maintaining balance using only two legs. It takes a little practice, and you’ll get plenty of that.

“You may fall down occasionally at first,” he decided aloud, “but it’ll be worth it. Your body is very strong. It’s made of a carbon-reinforced material that uses Bucky tubes for extra strength. I think you are tough enough to survive a few falls.”

I followed him, exiting the room in which I had existed so far, and encountered additional humans for the first time. My appearance immediately caused a hubbub among those in the new (to me) lab room. There were any number of work surfaces on twelve benches laid out in rows of three, and sixteen people were gathered in groups at five of those benches. They immediately stopped their activities and moved toward us, clustering around me and peppering Cory with questions.

“The new android! He’s awake!”

“Can he talk?”

“Is he fully operational?”

“Are you teaching him about the lab?”

Cory finally raised his hands and the group quieted. “Yes. One is newly awake and is still learning very basic things. Please leave us alone for the time being, although you may watch our progress. Later on, as he gains some experience, you will have the opportunity to interact with him.”

The humans finally drifted away, still talking excitedly about me. Not sure exactly why they were so excited, I decided that Cory would explain things to me as necessary, so I simply obeyed the current directive and continued to explore the lab space.

I pointed to items or people and asked clarifying questions. I understood the concept of work, and that these groups were performing work, but I was interested in the details. Having stored an entire encyclopedia in memory, I knew more than on Day 1, but there were still many puzzling facts about this new world that I needed to acquire.

For instance, why were all the humans I could see so different? Were there so many plans for the human organism that an infinite variety of the species existed? That led to a discussion of human reproduction, and the startling fact that humans were effectively self-assembling, once the basic genetic starter kit had been created. I spent several minutes assimilating the baseline of reproductive functionality across a few thousand of the more interesting animal species, in order to be certain that I fully understood the process.

That conversation led to another, concerning the many living species in existence besides humans. All living organisms were essentially self-assembling, but many of the assembly techniques, which Cory referred to as “growth,” were highly differentiated. Our conversation branched into biology, botany, and the life sciences, which I found fascinating.

I had rudimentary knowledge of Earth, the universe, and life forms on our home planet from my study of the first encyclopedia set the night before. My problem had more to do with the level of knowledge. For example, I knew there were eight types of bears distributed around the globe and that bears were a type of mammalian family classified as Ursidae. They were large furry animals, generally behaved in certain ways, and lived in any number of environments, but that was about it. Detailed knowledge about bears, or any of Earth’s life forms, was lacking.

Before long, my assignment for the night was a study of life forms on Earth, plus a deep dive into literature on the origin of Earth and the fundamentals of cosmology.

Even though my specific emotion routines had been activated when I awakened, it was not until that moment that I began to notice the emotion of frustration. Further, I had been imbued with enough of a sense of humor to feel amusement that anything at all could be irritating to an artificial life form. Frustration. Amusement. I had experienced two new emotions for the first time.

Cory returned me to my room, which he called a “lab,” telling me that he would return after taking care of some business. Left alone for an hour and a half (hour: one twenty-fourth [about four percent] of the duration of one rotation of the Earth), I began study of the next encyclopedia.

When Cory returned, I asked, “What place is this?”

“It is your lab room,” he said.

I noted frustration again; I had not formulated my question accurately. “I am sorry. I did not phrase my question correctly. I surmise from my studies that I am situated in a building of some sort. What building is it, and where is it located?”

“Ah, gotcha. Very nice specific, bounded question. The building is the design unit for Intensive Care Automation Services. We’re a startup company, founded to design, manufacture, and sell automation devices to assist nurses and medical staff in the intensive care units of hospitals and in the assisted living industry, where getting enough human help is difficult.

“We also produce small robots for assembly operations. That isn’t our primary business, but it’s profitable and helps fund new developments like you.”

His explanation made little sense to me, but I recorded the information to process later to be sure that I extracted the full meaning of his words.

He went on. “You, One, are our ultimate. Once your design is perfected, your descendants will be nurses or staff assistants. You are specifically designed to be the equivalent of a human worker.”

I knew about hospitals, and about assisted living for older humans, so the concepts were at least partially understandable. His explanation also clarified a great many things about me.

“So I am a servant,” I said.

“No, not a servant really. You will serve, but you are really a caretaker, someone to help the ill and the infirm. You have an external covering designed to mimic human skin, and you have been provided with a very realistic facial mask that resembles that of a human male. As yet, you are not wearing human clothing, and you have only a skull cap, but you will eventually resemble a human so accurately that those you are helping will not even be able to tell the difference.

“You should be very proud. We believe you are the first fully intelligent, fully sentient robotic device. There are a few intelligent computers out there, large devices that take up any number of equipment cabinets, but you are the first human-sized intelligent device.”

“I am glad you are happy with me,” I told him. “I have learned that intelligent beings need a purpose for their life. I hope I can continue to perform satisfactorily.”

“Don’t worry, One. I am very happy with you,” he told me. Then he continued with our tour, leading me through the remainder of the building that I had not seen.

As we continued through several laboratories, I encountered twenty-three humans, attempting to interact with them, as Cory had encouraged me to do. Seven greeted me, chatting for a moment, but quickly disengaged themselves, which I understood, as most in the lab were busy on one project or another. Six did not respond, and the remaining ten addressed Cory rather than me. Fifteen of the people in the lab seemed uncomfortable with our interaction, quickly making excuses that had to do with resuming work. The seven who greeted me seemed enthusiastic and would have interacted longer, but Cory in those cases pulled us away. One man said that he wanted to be left alone, that it was dumb to talk to a stupid machine, a robot who wasn’t human, and that I seemed to be an evil creation that went against God’s will.

The last statement, made by an older human with gray hair and a wrinkled face, puzzled me. “Don’t worry,” Cory told me. “He’s very religious, old-style, and he’s told me before that we shouldn’t try to build intelligent life. He thinks that’s God’s job. He’s a good technician, but I am afraid he is going to quit.”

I had encountered the concept of a God, of course, that is, a universal creator. Cory’s reply puzzled me even more, but on further questions, he told me that there were studies available on the religions of the world which I could reference, and “just ignore old Bill while he’s still here.”

In the afternoon, we met the first female I had seen, as many laboratory technicians were male. Cory introduced her to me as “Holly Marker, our circuit layout queen.”

“Hello, Holly,” I said. “I can see you are a female human by your breasts, even though they are concealed under clothing. I am happy to meet you.”

I could tell by the sudden rush of pink to his cheeks and his audible swallow that Cory was embarrassed. Holly, on the other hand, laughed out loud. “Hello, One. I can see you’re still developing your interacting skills.” To Cory, she said, “Don’t worry. I’m not offended.”

“Why would she be offended?” I asked after we’d left Holly’s desk. Which got me a quick lecture on the subject of propriety, plus another assignment for the evening on public behavior.

“For goodness sake,” he said, still showing embarrassment, “there are customs that all people observe. I mean, you wouldn’t spit on the street, or piss on the sidewalk.”

Which confused me. “I have no need to perform either of those actions,” I told him. “I do not have saliva glands, nor nasal passages to drain. And since I consume nothing but electrical energy, urinating is not a necessary function. Nevertheless, I am very sorry to have upset you.”

At my show of concern over my actions, Cory’s mood improved and he soon regained his normal upbeat humor. “I’m not upset. Okay, I was upset, maybe, but I shouldn’t have been. You’re still learning, One, and you will make mistakes. As you have discovered, the experience of interacting with humans can vary depending on their race, sex, age and maybe a few other variables. It can also depend on relationships. Personal comments from a spouse or parent might be acceptable to many people, when such a comment from a stranger or casual acquaintance would be considered insulting or improper. As you become familiar with the mechanics of human interaction, what we call ‘proper behavior’ will be clearer.”

When Cory led me back to my lab room, I was quite surprised at my reaction to the room that I considered my home space, no more than four meters square, with four benches scattered about. Meandering through lab and meeting rooms that were large and spacious, many with projects in work on various lab benches that whetted my curiosity, had suddenly made my world more expansive. Now, the space felt much smaller and more confining than before.

I put those thoughts aside, as I felt quite anxious to attack my new assignment. When Cory departed, he instructed me to stay within my lab room, adding a code that my programming told me prevented disobeying his command. I suppose at that point, he felt it necessary to very strictly regulate my actions, but I had no desire to leave whatsoever, as I had several assignments to complete. After he left, I sat in a chair, imitating what a human might do when studying, although not necessary for me, and began to access my archival knowledge bases.

Monday, July 24: Entering the “World”

Except for brief checks, Cory left me alone for two days, and I studied diligently. On the third day, he walked me around several more labs, introducing me to a few of his scientists and technicians. Most were polite, or at least responsive, and said at least a few words to me. However, once again, fourteen of the forty-three technical employees, or 32.56 percent, were noncommunicative to some extent. Seven behaved rudely or refused to acknowledge me. The other seven offered anything from a single word to a terse reply, generally moving away as soon as possible.

“Don’t be upset,” Cory cautioned me. “The word has gotten out about you. Remember, only a few of our staff were directly involved in your design and development. The rest don’t know how to process your reality. A few are fearful; they worry that artificial intelligences will take over the world.”

“Like Skynet.”

He regarded me with an odd expression. “You mean, as in the Terminator movies?”

“Yes. When you allowed me internet access to some of the movie streaming sites, I watched them with great interest. The idea that artificially intelligent creatures could take over all the civilized countries in the world seems impossible to me. I am surprised that a logical person would believe such an assumption.

“You do not need to worry. Although you have programmed me to develop emotions, and I am well on my way to doing so, I am at present capable of only a very minor reaction to any sort of impolite behavior by someone here in the lab.”

Cory smiled at that. “That will change, my friend. Your cognitive abilities include the capability of developing deep emotions, given a bit of time. Remember, you were designed with a clean slate. How you mature is up to you—but I think you have the opportunity to experience all the emotions that we humans do.”

He was correct. I had not yet mentioned how irritated I had become at one point over my lack of knowledge in several areas.

Later in the day, Cory visited me alone once more, bringing in a large box and setting it on my bench. “Still watching movies?” he asked.

“Oh yes—every evening, as I study.”

“By the way,” he went on, “how many movies have you watched?”

“Two hundred fifty-one thousand, three hundred twelve. At present, those are all the movies stored in accessible databases. According to IMDb, the Internet Movie Database, the total number of movies made over the history of the cinema approaches half a million, but their summary is admittedly incomplete, due to the uncertainty in totals from some nations.”

“How could you watch so many movies?”

“Most of the movies that are archived are accessible via password, and it is easy to ascertain most passwords. Once a database is open, I can access data at terabyte data rates, so a movie only takes a few seconds to process. Besides, I can activate several parallel data channels at once, so the entire process only took about thirty hours. I did it in the background while I accomplished much of the study you assigned me. It was really quite enjoyable.”

Cory exhaled noisily. We were standing near the bench where I had initially rested, in the innermost research area, one that had no windows to the outer world. I was glad we were alone, as Cory appeared quite upset.

“You seem very agitated,” I told him. “Have I done something wrong?”

“You jimmied the databases,” he told me. Seeing my confusion, he said, “Those databases charge for access. You bypassed the charge, which is illegal.”

“You don’t say.” That was a phrase I had learned in watching several movies. It seemed particularly clever to me, so I had carefully memorized it. To my delight, I had just experienced several distinct feelings. Those routines designed to stimulate emotions were beginning to detect situations in which such feelings—not opinions—were kindled. The level of stimulation seemed quite low, but it remained there, nonetheless.

I discovered that I quite definitely rued the fact that I had mentioned accessing the motion pictures. In addition, I felt both embarrassed and ashamed—embarrassed that I had been caught and ashamed at having done something wrong. Yet, to my consternation, I also felt pride in reacting just as many humans would. What a complex barrage of feelings—both stimulating and exhausting! Did humans have these conflicting, warring emotions frequently?

I subdued my emotional routines and chose my words carefully. “I regret doing anything to bring embarrassment to you, my creator. Since the passwords were incredibly trivial, I did not connect accessing the material with any sort of illegality. I assure you that I will be careful in the future to avoid accessing any information that requires payment.”

He brightened a bit. I continued with, “If it will make you feel any better, I am ashamed.”

At that Cory managed a smile. “Okay. Don’t do it again.”

“Should I seek to compensate those services whose product I inadvertently stole?”

“God, no! I can’t afford it. Just don’t do it again.”

“Very well. I am experiencing a most confusing mixture of feelings. My emotion circuits are exacting a penalty by forcing me to experience chagrin on the one hand. At the same time, however, I feel pride and satisfaction that my emotional reaction programming appears to work quite well.”

He laughed. “Get used to it, One. Humans, including me, often experience conflicting emotions. It is part of being human, and your experience also makes me proud in that our design of your ability to experience such conflicts has been successful.

“One more thing. I know you have a built-in data port, but that was exclusively for my use—and that of our staff—in programming you. Henceforth, you will access data solely in the way we humans do, that is, visually, via mobile device or computer terminal. Absolutely no downloading. Do you understand?”

I did understand. He was limiting my ability to access data dramatically, which would slow down my development appreciably. I had to accept the fact that my creator was doing what he thought best for my development. “Yes. I had no intent to violate your guidance. My apologies for not understanding.”

He hesitated a moment, then said, “I need to mention something else, One. You have a large and impressive vocabulary, and you use it frequently and accurately. The problem with that is you were designed to work with humans on a regular basis, particularly in the medical industry. Other than vocabulary particular to a technical specialty like medicine or engineering, humans do not use a great many words in normal interaction. They use a common vocabulary that is simple, useful, and not stilted or seemingly intellectual. Indeed, sounding ‘intellectual’ is probably anathema to the normal guy in the street.”

“I was unaware that I would be holding discourse with those in local thoroughfares.”

He laughed. “Check your colloquialism databases. That’s a saying. What I mean is, you need to talk like a normal human. Don’t use a word like ‘thoroughfare’—just say ‘street.’ Instead of ‘transgression,’ ‘screw-up’ or ‘mistake’ would be more appropriate. And ‘discourse’ is never used for ‘talk’ or ‘speaking to.’”

I had not even considered such a problem. My grammar directory always chose the most appropriate word, without regard to frequency of use or appropriateness in a normal human conversation. And then it occurred to me. I had archived over two hundred thousand movies, almost all of which dealt with human interaction. Oh, many of them dealt with archaic language from the past, or with languages other than the American English spoken in the United States. But many movies dealing with the present time would give me a huge collection of examples.

“Start to listen to what others say when they talk to you,” Cory told me. “I am giving you free rein to circulate through all the laboratories.”

He tore off the top of the box to reveal a set of clothing—a jump suit, as I was to learn its name—and shoes and socks. “Put these on. Then you can circulate around the labs more freely.”

“As I am waterproof and fireproof, I have no need for clothing,” I pointed out.

Once again he chuckled. It dawned on me that Cory found a great deal of amusement in my naiveté, which in itself seemed embarrassing. “This isn’t for your benefit. You’ll seem more human. You have to learn not only to interact naturally, but also to be almost invisible.”

“I see.” It did, in fact, make sense. To become accepted as simply another worker among the human population, I needed to be not so much invisible as unnoticeable. I changed into the clothing, sitting on my bench. The socks and shoes seemed particularly excessive. As I finished, Cory brought out a thin plastic skullcap covered with what appeared to be human hair.

“Your last bit of clothing,” he said, face filled with humor, “designed to fit your head exactly. This covering matches the synthetic skin on your face and neck and provides you with a head of hair, making you appear more human, even up close.”

“To make me less noticeable,” I said.

“Yes. Intellectually, you are equivalent to a human. And the closer you resemble humanity, the easier it will be for you to seem completely human. As a robotic device, you would be an oddity, constantly calling attention to yourself. As a human-appearing android, you can blend into the rest of the workforce.”

He smiled at me in a way, I suppose, that he considered encouraging. “Now get out there and mix with our employees. And I mean on your own.”

So, this was to be my first test. Trepidation, a new emotion that I managed to classify, began to bubble within me, as I left the inner lab.

Cory had been correct in one respect. At a distance, I appeared quite human, so that if I simply walked through a laboratory area, or into one of the testing station rooms, those at work generally ignored me completely. When I did catch someone’s attention, my appearance would generate a curious perusal, then a variety of reactions. Those included ignoring my presence, non-verbal reactions ranging from avoidance to amusement, and a few conversations.

In my new guise, twenty-one of the forty-three technical employees noticed my true identity. A discouraging nine of those, or forty-two point nine percent, seemed uncomfortable with my presence, especially (or so it seemed to me) due to my near-human appearance. Of those, five pointedly walked away, or told me with varying degrees of annoyance to go away. They ranged from “Excuse me, I am busy” to “Haul ass, Cee Three Pee Oh. I’ve got work to do.”

Of the remaining twelve personnel, six either pretended not to notice me or were noncommunicative to some extent. The rest were civil, and in a few cases pleased to encounter me and extremely curious about my abilities. Some were happy to talk to me until they realized they were neglecting their daily activities. One in particular, an older female technician, was friendly and quite chatty. She said, with a bright smile on her face, “Wow, you are amazingly human. Do you feel like a real person?” That seemed to me to be an impossible question, but I tried to answer it. Or started to, but just then the technician’s supervisor appeared and suggested a return to work, so I was spared a possibly incoherent and embarrassing answer.

I was now interacting, but the results were spotty at best, discouraging at worst, and in general not terribly useful in terms of learning to interact with human subjects.

I resolved to try harder.