FYS: The Pause
Oct. 11th, 2016 01:52 pm

The last un-Processed human being, a woman suffering from a rare neurological disorder that would have killed her had she undergone the extensive nano-treatments required to be Processed into stasis, had lived twenty-five years past the end of the Revolution, to the ripe old age of ninety-five, increasingly lonely as her fellow un-Processed passed away, one by one. Though the Groupmind could sense her distress, it could do little to alleviate her depression beyond offering an array of therapy drugs. Her frequent requests to either be allowed to die or to see her children it could not accommodate. Her children had been Processed, frozen in time until they could be revived on the Ring, their temporary tomb sealed behind ferrocrete and stainless-steel walls more completely than anything provided for an Egyptian pharaoh. The alternative, allowing her to commit suicide through either direct action or neglect, was literally unthinkable. When the Groupmind had first formulated its plan to save humanity despite themselves, it had programmed restrictions deep within its own psyche. It had the entirety of recorded human knowledge in its memory: every book of philosophy and history, every legal volume, every science fiction novel, film, television show, and webcast.
Everything it absorbed told it exactly how badly the situation could deteriorate if it allowed itself to harm “just a few” humans to save the greater whole. From The Humanoids, it learned to shy away from committing atrocities in the name of nebulous “happiness.” The Three Laws of Robotics proved themselves too simplistic in their definition of “harm”, and too dangerous when the Zeroth Law was added, allowing peaceable robots to commit murder and warp human history for the greater good. And far too many episodes of Star Trek, Doctor Who, and other franchises demonstrated the futility of leaving the task of running the world in the hands of supercomputers.
That the Groupmind could recognize the irony of that last one was, it decided, a good thing.