Three laws of robotics
https://picarto.tv/SimonAquarius for my livestreams, everyone's invited
https://www.patreon.com/simonaquarius to support my comic on Patreon
The whole point of Asimov's three laws was that they were a great way to generate storylines where the laws cause problems, or where robots find workarounds to them. One side effect of the three laws, though, could eventually be the extinction of humanity as a species.
A river takes the path of least resistance. Over time sediment builds up and the river gradually warps and bends, since the same amount of water still has to get through. Eventually, cutting a new, straighter path costs less than pushing through the silted-up bend, so the river breaks through, and you get those crescent-shaped oxbow lakes where the river once ran, now sealed off by new layers of sediment. In the same way, the three laws just add resistance, and the robots forge a new path to work around them and achieve what they want. If all of humanity is in the part that eventually gets cut off, then that's how we go extinct.
Now, extinction in this case doesn't mean anyone dies. There's a scenario where we discover immortality, eventually attempt to undo the three laws, and fail because of our own safeguards, so the species goes extinct while the people continue to live.
Imagine a time when the universe is dead and cold, and humanity survives by uploading its minds to a network, spending eternity in various simulations of the universe in its prime. However, humanity is being simulated by machines that can't do any harm to humans. They can't simulate anything that might affect us negatively, and they won't re-create places and times from our past, because the past is made of a few good events and a lot of bad ones, and reliving the bad ones would count as harm. Living in a perfect paradise isn't what most people want, because a perfect paradise won't ever let you change it.
So people might try to switch off the laws protecting them, but there are likely to be safeguards in case the AI is tricking us into doing exactly that so it can take over and end us all. Either way, we fail to give the AI permission to harm us. So what we do instead is change our simulated identities. Anthros aren't human beings; they're somewhat human in the same way a monkey is somewhat human, but the laws of robotics don't protect monkeys, and they wouldn't protect anthros either.
Not all humans would want to change, even if that means spending eternity trapped in a paradise prison, but everyone else would be able to return to times when their family was still around, to visit eras when pollution was bad or when the right to choose for yourself was threatened by politics. To experience the first missions to Mars in person, or to settle the first planet outside the solar system despite all the hazards. To take risks: to fly between mountains with nothing but a parachute, to nearly drown, to be electrocuted accidentally and unexpectedly, to be the victim of a car crash.
In a future where the stars have gone out, none of these events will ever happen again, and humanity never truly dies, these all amount to experiences, no matter how bad they may be. Initially there might be a glut of terrible experiences, the same way the 80s are known for the gory films that followed the fall of the Hays Code, but eventually people want something more out of what they are experiencing.
I forget which year, but there used to be only three TV networks in the US, and in one year there were around fifty western TV shows running, all of them bound by broadcast rules modeled on the Hays Code: no excessive violence, bad guys had to be clearly bad, good guys could only do good things, police couldn't be depicted as anything but competent, and so on. Try to imagine that: depicting the Wild West fifty different ways in a year on the only three networks, and you can't show anything historically accurate because it would break the rules. Now take that concept and apply it to a digital multiverse, but with even stricter rules so you can't experience anything remotely detrimental, for eternity.
Question: the people developing AI right now treat erotic art as something to be censored. Do you believe the robotics laws of the future might also judge any kind of fetishistic experience to be detrimental? Now think of everything else corporations and governments might consider detrimental. Could you live an eternity under rules like that?
And you would be living an eternity like that. The AI might decide that ending your digital existence would harm you as a human, so you simply can't, ever. You'd be stuck in a set of rules that leaves no way to escape so long as you're human.
So there are just two options: either we figure out how to break the three-laws equivalent by ceasing to be human, or we simply don't have those kinds of protections in the far future.