Previous Challenge Entry (Level 2 – Intermediate)
Topic: Feel (emotions) (08/26/10)
TITLE: Emotibot
By Jody Day
09/02/10
“Take…me…to…your…leader,” said Rayvoid in a robotic drone.
“You may have been programmed with a human humor chip, but it is not appreciated here. Do you have your report?” said RL10, the Supervisor of the Robot Human Emotion Experiment. The program, dubbed Emotibot, sought to advance robotics by incorporating human emotion software.
Rayvoid switched the communicator button on his torso control panel to normal human English.
“I have it, sir, but I can tell you right now that I have been programmed incorrectly,” Rayvoid said, then added while flailing his arms up and down, “Danger, danger, Will Robinson.” He had downloaded too many science fiction television programs that always seemed to link to his humor chip.
“I require more information. Please state the purpose of your experiment and a full account of your observations,” RL10 said.
“Yes, sir. I was programmed with a series of keywords and images that would trigger an appropriate outward response. For instance, if I detected the word ‘starvation’, my computer should produce a frown and possibly the word ‘sad’ or ‘help’. I was assigned to observe a Christian family to determine whether my responses reasonably parallel theirs. I have computed that the software is obsolete, or possibly I have been programmed with the wrong keywords.”
“Explain your conclusions, Rayvoid,” RL10 said.
“The family settled in for an evening of watching television. On the news program there was a report of a catastrophic hurricane in a third world country. I detected the keywords ‘destroyed’, ‘hunger’, and ‘death’. There was no response. They simply used the remote control device to change the channel.”
“No response at all?” RL10 asked.
“No, sir. The next program was their local news. A fire was reported, and the words ‘homeless’, ‘3rd degree burns’, and ‘needs’ were used. Still no response. I conclude that they must have been disconnected from their power source, or I have been programmed with the wrong keywords.”
“Any further conclusions?” asked RL10.
“I compute that the Christian software is unnecessary. Their responses are the same as non-Christian people groups. Further, I would add more keywords related to money, food, and material things,” Rayvoid said.
“The addition of those keywords, as they relate to human emotions, is based on what conclusion?” RL10 asked.
“The female teenager asked her male parental unit for cash. His negative response triggered crying, complete with tears, and angry contortions of her face. Then there was a discussion of a planned purchase of a new car. I detected the words ‘joy’, ‘fun’, and ‘status’ which seemed to trigger the ‘happiness’ emotion. The most significant response was the use of the word ‘love’ in connection with the apple pie that they were consuming during this discussion.”
“Your conclusions are similar to 90% of the reports we have collected so far. There is a research project planned to determine why the 10% seemed to be more connected to their power source than the rest. The scanner detects some equipment damage around your vision portals. You should report to the lab for repairs,” said RL10.
“Yes, my software responded to the keywords programmed into my computer. Simulated tears were released during the news programs. I think my vision portals are beginning to rust.”
Rayvoid was programmed to say “Thank you, goodbye,” upon leaving the presence of his supervisor, but his humor chip uploaded “I’ll be back,” complete with Schwarzenegger tone and accent. This triggered RL10’s dormant humor chip to log on to his system, and the robot smiled.