Real robots!

Discussion in 'General Chatter' started by Exohedron, Jul 16, 2015.

  1. Aondeug

    Aondeug Cringe Annoying Ass Female Lobster

    I can rest easy knowing that the walking cube enemies from shit like Mario 64 exist. One day we will regret making the Cubli though. One day we will learn what the Legend of Grimrock truly is.
     
  2. turtleDove

    turtleDove Well-Known Member

    I feel like it's not moral to compel robots to obey the Three Laws, any more than it'd be moral to put a chip in a human's head which forces them to obey some designated authority figure. And, like, that's before we even touch the whole "must obey orders except where they'd cause harm to humans" thing, because that still leaves a really wide range of shit that these robots can be ordered to do which doesn't harm a human but is still a really bad idea. Like - it arguably doesn't harm someone to have flowers turn up at their door every day and to get letters hand-delivered! But what if a stalker goes "ah, I shall be Clever and not violate the restraining order, and instead I shall have a robot do these things for me"? No harm is done to humans if a robot is ordered to release all the lab animals into the wild - but it's still going to fuck up the local ecosystem, and it'll fuck up the experiments that were being done (some of which may have resulted in discoveries that would have improved human lives). And I'm pretty sure someone could fast-talk their way into "it doesn't harm humans" in order to get food dye, or something unpalatable but food-grade, dumped into the water supply.

    And that's just the first few things I thought of, not counting the "well, it's really better for X if their car can't be used / if I know where they are all the time / if they aren't allowed to go to such-and-such a place or hang out with certain people" that I could see being really easy to rules-lawyer a Three Laws-compliant robot into going along with. That's not even considering every cleverboots who would assume that they knew better than the people in charge, and would proceed to at least try and talk the robots into doing exactly the opposite of what the safety regulations required.

    tl;dr: oh god, the three laws would be a legal nightmare if actually put into practice, and that's just considering stuff I came up with in ten minutes at 2 in the morning.
     
    • Like x 2
  3. Wingyl

    Wingyl Allegedly Magic

    Freefall has that as a major plot point: very few of the robots are strict three-laws, but they're all equipped with safeguards that make them more like "lax three-laws".
    The creator of their brain design (which is very flexible, and is also used in the uplifted wolf protagonist Florence) apparently believes that no safeguards can stand up to true consciousness, so he saw it as his duty to make the safeguards work as moral training wheels instead of hard restrictions.

    Unfortunately there's still safeguard abuse. Even though Florence frequently misinterprets orders on purpose to give herself room to act (example: when faced with a "No AIs Allowed" sign and a "No Dogs Allowed" sign, she goes "Double negative, I can go in!"), we also see her ordered into a lot of unpleasant situations.

    Things we've seen include:
    - an AI who doesn't have enough spare clock cycles to work around his safeguards, and so is convinced that the existence of AI at all is inherently harmful to humans, and that anything serving the goal of removing or disabling the AIs on a planet being terraformed by AI will therefore help humanity;
    - several AIs ordered into starting a robot war;
    - an AI ordered to treat anything that serves one particular corrupt corporate executive's orders as helping all of humanity, and then ordered to lobotomize every robot on the planet and then kill himself;
    - Florence worrying about being ordered to "eat her own fingers off" or "kill every turtle on the planet";
    - a guy teaching robots a religion in which every religion is right simultaneously, so they can't be ordered into believing one particular religion;
    - and a robot obeying the laws to their fullest extent, which compels it to do the opposite of whatever a nonhuman says - so if another robot asked it to help build something, it'd start actively trying to stop that robot.
     
    • Like x 2