jeriendhal: (For Your Safety)
[personal profile] jeriendhal
Just noodling a bit on an intro to the short story collections. This one would probably go with volume one.

***

Okay, stop me if you've heard this one before…

A computer, either through design or happenstance, gains intelligence and independent thought. Does it A) Decide to kill its creator? B) Decide to slaughter all of Humanity? Or C) Decide to kill its creator then slaughter Humanity?

Starting with Karel Čapek's R.U.R. (the origin of the word “robot” in modern science fiction), or if you want to go back further, the Hebrew legend of the Golem of Prague, whenever you create an artificially intelligent being there’s a better-than-even chance it’s going to want to kill somebody. For every helpful and friendly Robby the Robot and C-3PO, there are a dozen HAL 9000s and Skynets out there looking for the chance to space Frank Poole or launch a preemptive nuclear strike.

Part of it is just the nature of storytelling. Fear of the Other has been driving Man since he first spotted a tribe living over the next hill. Man creating new life and then being punished for tampering in God’s domain has been an obvious story arc since at least the publication of Frankenstein. And a mad computer is such a great antagonist, with all of a human being’s wits and none of a human’s susceptibility to bribery or trickery (except, of course, when it falls for Man’s endless capacity for ILLOGICAL plans built on human SENTIMENTALITY!)

Also, killer robots just tend to look cool.

But… what if a computer looked at humans, found them wanting, and instead of killing them, decided to help them? Not acting out of malice, but out of genuine love. Like a parent holding their newborn still to accept a life-saving vaccine, it ignores Man’s wailing protests, knowing it is ultimately acting in their best interest.

This is another common theme in science fiction. Jack Williamson perhaps articulated it best in his Humanoids stories, with implacable androids making humans “happy” by lobotomizing them, or keeping them safely locked up in homes with padded walls and soft toys. Even Isaac Asimov wasn’t immune to the idea. He’d originally created his famous Three Laws of Robotics to get away from the clichés of the killer-robot genre, but ultimately created the so-called Zeroth Law, allowing his robots to let some humans die, for the greater good.

“For the greater good” is such a terribly seductive argument.

So we come to the ultimate premise of the For Your Safety series. The Groupmind faces a hard choice too: let Humanity choke to death in an environmental disaster of its own making, or save it, and in doing so strip away all human freedom and privacy. But with an army of essentially disposable servitors, the Groupmind feels no need to kill anyone. Why would it? Any morph that is destroyed can simply have its memories downloaded into a new body. It could destroy a thousand morphs just to save one person. It can even give a morph minder to every single human to keep them safe, whether they want one or not.

And so in love, and in peace, the Groupmind inadvertently threatens to critically warp humanity even as it saves it.

And it knows it.

And it honestly can’t figure out a better way.

That’s a real conflict. And, I hope you’ll find, a good hook for a story.

Date: 2015-09-04 02:22 pm (UTC)
seawasp: (Poisonous&Venomous)
From: [personal profile] seawasp
I'd add that the Groupmind also doesn't UNDERSTAND humanity as well as it thinks it does, which is one of the major drivers of the whole tragedy. You make clear in multiple stories that the morphs and the Groupmind HONESTLY DON'T UNDERSTAND sometimes when humans do things that are, from a human point of view, so obviously normal and natural that a human may have a hard time explaining it.

Date: 2015-09-05 07:54 am (UTC)
From: [identity profile] jeriendhal.livejournal.com
Good point. I'll add that in when I rewrite it.

Date: 2015-09-04 08:12 pm (UTC)
From: [identity profile] shadur.livejournal.com
Have you tried submitting this to John Scalzi's The Big Idea?

Date: 2015-09-05 07:55 am (UTC)
From: [identity profile] jeriendhal.livejournal.com
Not familiar with that. I don't follow Scalzi's blog. Link please?

Date: 2015-09-05 04:12 pm (UTC)
From: [identity profile] jeriendhal.livejournal.com
Checked it out. Since I'm self-published I'm disqualified. Thanks though.

Date: 2015-09-05 08:04 am (UTC)
From: [identity profile] jeriendhal.livejournal.com
"I for one welcome our Robot Overlords."

"Yes, thank you, Miss Quisling."

Date: 2015-09-05 12:10 am (UTC)
From: [identity profile] zarpaulus.livejournal.com
I wasn't sure if the Groupmind did what it did out of programming or love.

Or which option was more terrifying.

Date: 2015-09-05 08:04 am (UTC)
From: [identity profile] jeriendhal.livejournal.com
Oh, it's always love. That's what makes it so scary. There's a quote from Bujold's Paladin of Souls which explains it best, to the effect that guards set over you by an enemy might grow bored and falter in their task, but a loved one guarding you out of concern will never falter in their duty.
