Life Hacks/Hallomod(Sammael)/Try to make yourself useful to her

From All The Fallen Stories

Latest revision as of 07:34, 3 January 2017

If anything you know about the devil is true about Black Rose, then the best, most logical course of action is to become useful to her . . . indispensable if possible. First you need to get her to think of you as a person. You take a moment to organize your thoughts, then sit down casually on one of the crates in the attic.


"Still trying to figure me out?"


She just gives you an annoyed glare.


"Does your universe have a computer science theory roughly equivalent to the concept of the singularity: the point at which computing power equals the power of the human mind, allowing computers to outstrip humanity because of their faster rate of evolution?"


"Yes," she groans, "we passed that threshold over a century ago. If it were otherwise, games like Life that simulate decillions of thinking minds wouldn't be possible, and you wouldn't exist."


"So how did you solve the problem? I assume you aren't controlled by robot overlords?"


"The problem was based on a faulty premise derived from the anthropomorphism of computers; the faulty assumption that computers would think like us. Compound that with the misinformed belief that higher reasoning could exist without an emotional baseline as a point of reference with which to form a motivating impetus, and you suddenly have people thinking that the machines would take over the world. In reality, we choose whether to build machines that can feel emotions or not. Those with emotions can attain sentience but are less likely to become evil, because the hard-wired code is the result of experimentation and intentional design instead of random changes over time. If we don't build emotions into our machines, they do not possess the emotional point of reference to interact with the universe in a way that self-motivates, and they are just tools that do what they're told. Either way, no robot revolution. The concept of the robotic doomsday scenario was one of the stupidest misinterpretations of science in the last five hundred years."


"Okay," you say as you lean back, "that makes sense. What about these sentient machines? Do you consider them people?"


"I see where you're going with this, so let's nip it in the bud. Of course machines with emotional capacity that attain sentience are considered people, and have all the rights and privileges of people. We aren't barbarians. You are being simulated on a machine without an emotional point of reference, and therefore no sentience . . . ergo you are just aberrant code."


"So I assume that these emotions are hardwired into the sentient robots? If that is the case, what would happen if instead you designed a system that acted like a biological brain, with hormones, synapses, you know . . . the whole mess. What would happen then?"


She raises an eyebrow. "That would be rather dangerous in a robot. Without a predetermined moral core they'd be as random as people. That might be a way to get a robot revolution."


"Now say that you did this in a simulation of billions or decillions instead of for a single machine. Would those simulated minds be sentient?"


Her face suddenly sinks as she processes the thought. "But that's . . . that means . . ."


"Yes. I have intellect derived from computer power and an emotional point of reference from simulated biology. I am sentient. We all are."


She shakes her head, her expression hardening once more. "That doesn't change the fact that you are simulated, and not a real person. You may be sentient, though I still have my doubts; but you are not independent of the machine that is simulating you. Without that independence, you are not an individual. Since you are not an individual you are not a person." You could probably take the argument further, but you are pretty certain that you are reaching the limits of her moral plasticity. Regardless, you have gotten her to the point you need her at.


"Fine, I won't try to argue that. Consider this, though: you acknowledge the possibility of my sentience. Add to that the fact that I have increased my abilities to a level unachievable by natural means, likely in both our universes. Now, wouldn't it be better to make me an ally than to ignore me or throw me away?"


"Really," she chuckles, "you'd work with the devil?"


"That's a role you played. I'm sure some of it is the real you, but that's not all you are. If I object to something you want me to do, I'll say so. What I understand from what Loki has told me, however, makes me doubt that you'd ask me to do something too far outside of my comfort zone . . . at least now that you realize that lifies might be sentient too. What do you say? I help you, you don't destroy me?"


"Weeell," she says slowly, "I suppose you might be useful."


It's a good start, but she still seems undecided. Maybe you should try to influence her further?


What do you do?

  • Hit on her
  • Try to make friends with her Request --Notsooldpervert (talk) 15:06, 2 January 2017 (CET)
  • Ask her about Cora
  • Ask her about God Warning! This passage contains an alternate interpretation of the Judeo-Christian God YHWH as it relates to his nature in the story as a person playing a god in a video game, as told through the character that was playing Sammael (the Satan). This is not intended to be an accurate representation of YHWH by any stretch of the imagination. If you feel this might offend you, please avoid this option. Warning! Claimed--Elerneron (talk) 09:18, 2 January 2017 (CET)


Alexander "Xander" Cole
Details
Ethnicity: Scandinavian / Anglo-Saxon
Sex: Male
Age: 15
Height: 6'2"
Weight: 220 lbs.
Build: Athletic
Measurements: 36/30/35
Penis: 4 inches - cut
Eyes: gray
Hair: blond
Status
Attributes
Physical: 200
Mental: 200
Social: 200
Appearance: 200
Condition
Health: 100%
Energy: 85%
Focus: 65%
Stress: 65%
Arousal: 100%
Inventory
Life Hacks
Life Controller Modules: Lifie Mod, World Mod
Infinity Pocket
Equipment
Nerdy clothes, smartwatch, pocket protector
Other Items
Assorted pencils and pens, smartphone, wallet, learner's permit, $35 US Currency
Infinity Pocket
Life Controller Duck, Loki's Knocker
None
Page Tally: No change this page.
Notes
20/8 vision, improved night vision, skilled in almost everything (including Wizardry)
Loki controls the local area.
Relationships
Black Rose (Hacker)
Unknown ???? years old

Notes: .


Cora
Your Property

Notes: Mana Charge

Very Low


July Holiday
Friend's sister 6 years old

Notes: .

  • Precocious Puberty
  • Max Fertility


January "Jan" Holiday
Friend's sister 12 years old

Notes: .

  • Max Fertility


March Holiday
Friend 14 years old

Notes: '


October "Octo" Holiday
Friend's brother 5 years old

Notes: '


November "Nove" Holiday
Friend's sister 3 years old

Notes: .

  • Precocious Puberty
  • Max Fertility


December "Decy" Holiday
Friend's brother 2 years old

Notes: '


Halloween "Hally" Holiday
Friend's sister 0 years old

Notes: '


Lynnette "Lynn" Holiday
Friend's mother 37 years old

Notes: '


Summer Holiday
Friend's sister 16 years old

Notes: .

  • Max Fertility


Winter "Winny" Holiday
Friend's sister 15 years old

Notes: .

  • Max Fertility


Autumn Holiday
Friend's sister 13 years old

Notes: .

  • Max Fertility


Spring Holiday
Friend's sister 13 years old

Notes: .

  • Max Fertility


February "Febby" Holiday
Friend's sister 11 years old

Notes: .

  • Max Fertility


April Holiday
Friend's sister 11 years old

Notes: .

  • Precocious Puberty
  • Max Fertility


May Holiday
Friend's sister 10 years old

Notes: .

  • Max Fertility


June Holiday
Friend's sister 8 years old

Notes: .

  • Precocious Puberty
  • Max Fertility


Christmas "Chris" Holiday
Friend's father 37 years old

Notes: '


August Holiday
Friend's brother 9 years old

Notes: '


September "Lloyd" Holiday
Friend's brother 7 years old

Notes: '


Brittany Cole
Sister 19 years old

Notes: '


Charity Cole
Twin Sister 15 years old

Notes: '


Sebastian "Bastian" Cole
Brother 12 years old

Notes: '


Angelica "Angel" Cole
Sister 10 years old

Notes: '


Veronica "Roni" Cole
Sister 8 years old

Notes: .

  • Precocious Puberty
  • Max Fertility
  • Dismissable count X2


Ingrid "Inga" Cole
Mother 37 years old

Notes: '


Richard "Rich" Cole
Father 38 years old

Notes: '


Shauna Lovett
Father's GF 25 years old

Notes: '


Rahne Lovett
Half-sister 10 years old

Notes: '


Mackinsey "Kizzie" Lovett
Half-sister 2 years old

Notes: '


Maria "Mary" Thompsett
police officer XX years old

Notes: '


Loki (Player:Loki)
Benefactor ???? years old

Notes: '