Life Hacks/Hallomod(Sammael)/Try to make yourself useful to her
If anything you know about the devil holds true for Black Rose, then the best, most logical course of action is to become useful to her . . . indispensable if possible. First, you need to get her to think of you as a person. You take a moment to organize your thoughts, then sit down casually on one of the crates in the attic.
"Still trying to figure me out?"
She just gives you an annoyed glare.
"Does your universe have a computer science theory roughly equal to the concept of the singularity, the point at which computing power equals the power of the human mind allowing computers to outstrip humanity because of their faster rate of evolution?"
"Yes," she groans, "we passed that threshold over a century ago. If it were otherwise, games like Life that simulate decillions of thinking minds wouldn't be possible; and you wouldn't exist."
"So how did you solve the problem? I assume you aren't controlled by robot overlords?"
"The problem was based on a faulty premise derived from the anthropomorphism of computers; the faulty assumption that computers would think like us. Compound that with the misinformed belief that higher reasoning could exist without an emotional base-line as a point of reference with which to form a motivating impetus and you suddenly have people thinking that the machines would take over the world. In reality we choose whether to build machines that can feel emotions or not. Those with emotions can attain sentience but are less likely to become evil because the hard-wired code is the result of experimentation and intentional design instead of random changes over time. If we don't build emotions into our machines they do not possess the emotional point of reference to interact with the universe in a way that self-motivates, and they are just tools that do what they're told. Either way, no robot revolution. The concept of the robotic doomsday scenario was one of the stupidest misinterpretations of science in the last five hundred years."
"Okay," you say as you lean back, "that makes sense. What about these sentient machines? Do you consider them people?"
"I see where you're going with this, so let's nip it in the bud. Of course machines with emotional capacity that attain sentience are considered people, and have all the rights and privileges of people. We aren't barbarians. You are being simulated on a machine without an emotional point of reference, and therefor no sentience . . . ergo you are just aberrant code."
"So I assume that these emotions are hardwired into the sentient robots? If that is the case, what would happen if instead you designed a system that acted like a biological brain with hormones, synapses, you know . . . the whole mess. What would happen then?"
She raises an eyebrow. "That would be rather dangerous in a robot. Without a predetermined moral core they'd be as random as people. That might be a way to get a robot revolution."
"Now say that you did this in a simulation of billions or decillions instead of for a single machine. Would those simulated minds be sentient?"
Her face suddenly sinks as she processes the thought. "But that's . . . that means . . ."
"Yes. I have intellect derived from computer power and an emotional point of reference from simulated biology. I am sentient. We all are."
She shakes her head, her expression hardening once more. "That doesn't change the fact that you are simulated, and not a real person. You may be sentient, though I still have my doubts; but you are not independent of the machine that is simulating you. Without that independence, you are not an individual. Since you are not an individual, you are not a person."

You could probably take the argument further, but you are pretty certain that you are reaching the limits of her moral plasticity. Regardless, you have gotten her to where you need her to be.
"Fine, I won't try to argue that. Consider this though, you acknowledge the possibility of my sentience. Add to that the fact that I have increased my abilities to a level unachievable by natural means, likely in both our universes. Now, wouldn't it be better to make me an ally that to ignore me or throw me away?"
"Really," she chuckles, "you'd work with the devil?"
"That's a role you played. I'm sure some of it is the real you, but that's not all you are. If I object to something you want me to do, I'll say so. What I understand from what Loki has told me, however, makes me doubt that you'd ask me to do something too far outside of my comfort zone . . . at least now that you realize that lifies might be sentient too. What do you say? I help you, you don't destroy me?"
"Weeell," she says slowly, "I suppose you might be useful."
It's a good start, but she still seems undecided. Maybe you should try to influence her further?
What do you do?
- Hit on her
- Try to make friends with her
- Ask her about Cora
- Ask her about God. Warning! This passage contains an alternate interpretation of the Judeo-Christian God YHWH as it relates to his nature in the story as a person playing a god in a video game, as told through the character that was playing Sammael (the Satan). This is not intended to be an accurate representation of YHWH by any stretch of the imagination. If you feel this might offend you, please avoid this option.
Infinity Pocket
Loki controls the local area.