Official Everybody Edits Forums


#1 2016-08-15 22:31:49

Bobithan
Member
Joined: 2015-02-15
Posts: 4,433

hypothetical ethical situation

If we ever do reach a point where we are able to accurately simulate a human mind, down to all the small complexities of thought and memory and feelings, would it be okay to perform potentially torturous experiments on the artificial intelligence?

What about if we eventually are able to simulate the real world with all the physics and chemicals being completely accurate, allowing us to simulate a virtual setting with even people and animals working exactly as they do in real life? Would it be ethical to perform such experiments on those AIs?

And finally: How about completely artificial human clones?

Offline


#2 2016-08-15 22:36:24

32OrtonEdge32dh
Member
From: DMV
Joined: 2015-02-15
Posts: 5,166
Website

Re: hypothetical ethical situation

What's to say we aren't already the simulation?



Offline

#3 2016-08-15 22:37:44

Bobithan
Member
Joined: 2015-02-15
Posts: 4,433

Re: hypothetical ethical situation

Would the answers to those questions be different depending on whether or not we are indeed simulated?

Offline

Wooted by: (3)

#4 2016-08-15 22:37:50

Slabdrill
Formerly 12345678908642
From: canada
Joined: 2015-08-15
Posts: 3,402
Website

Re: hypothetical ethical situation

32OrtonEdge32dh wrote:

What's to say we aren't already the simulation?

nothing


suddenly random sig change

Offline

#5 2016-08-15 22:41:04, last edited by 32OrtonEdge32dh (2016-08-15 22:41:35)

32OrtonEdge32dh
Member
From: DMV
Joined: 2015-02-15
Posts: 5,166
Website

Re: hypothetical ethical situation

Bobithan wrote:

Would the answers to those questions be different depending on whether or not we are indeed simulated?

I think that depends on knowing if we are or not.  We have to know to know.



Offline

#6 2016-08-15 22:45:04

Different55
Forum Admin
Joined: 2015-02-07
Posts: 16,572

Re: hypothetical ethical situation

Good question. There's one set of bad times in my life that everyone around me is convinced happened but I have no memory of them. Since I don't remember, I don't care. If we wiped everything and completely rewound the simulation after you were done being a sadistic bastard I'm not sure they'd care either. But you could also twist that into the real world by doing horrible things to people then killing them. After all, the dead don't care. So what makes torturing people bad in the first place? The scars it leaves behind or the pain that happens in the moment?


"Sometimes failing a leap of faith is better than inching forward"
- ShinsukeIto

Offline

#7 2016-08-15 22:46:22

Slabdrill
Formerly 12345678908642
From: canada
Joined: 2015-08-15
Posts: 3,402
Website

Re: hypothetical ethical situation

Different55 wrote:

So what makes torturing people bad in the first place? The scars it leaves behind or the pain that happens in the moment?

the former, i'd say


suddenly random sig change

Offline

#8 2016-08-15 22:47:39

32OrtonEdge32dh
Member
From: DMV
Joined: 2015-02-15
Posts: 5,166
Website

Re: hypothetical ethical situation

Different55 wrote:

So what makes torturing people bad in the first place? The scars it leaves behind or the pain that happens in the moment?

Both, really, but if you're picking one then it'll be the one that lasts the longest.



Offline

#9 2016-08-15 23:28:52

hummerz5
Member
From: wait I'm not a secret mod huh
Joined: 2015-08-10
Posts: 5,852

Re: hypothetical ethical situation

@diff's interpretation
I'm more worried about whatever the heck you up and forgot.

I feel you're hinting towards "hey it's not so bad because I can't remember" so you could say that the effects aren't terribly detrimental. However, you can't knowledgeably make the argument that going through that experience at the time wasn't ... unethical?

If we're going on the hypothetical "we just rewind that crap" and it works as diff observes/purports, then the question is just "is torturing AI ethical in the moment."

My interpretation: if the AI has every nuance of a human mind, then I can't help but consider a human (me! I am human yes beep boop) in the scenario you suggest. However, some may argue against that. At any rate, it seems unethical.

re-reading CAVEAT: The distinction becomes whether or not this "Simulation" gives consciousness. It's one thing for a computer program to know "well this person feels anguish" as opposed to somehow creating a conscious being. I guess I'm leaning ethical now, since we probably can't actually give "consciousness" but what do I know

Offline

#10 2016-08-16 00:03:49

BuzzerBee
Forum Admin
From: Texas, U.S.A.
Joined: 2015-02-15
Posts: 4,566

Re: hypothetical ethical situation

I mean I feel bad for a chair if I accidentally kick it so I'm going to say yes torturing it would be morally wrong.



Offline

#11 2016-08-16 00:06:11

hummerz5
Member
From: wait I'm not a secret mod huh
Joined: 2015-08-10
Posts: 5,852

Re: hypothetical ethical situation

^ I don't feel bad for the chair, but I'm conditioned to the point where I expect my parents to say "what is YOUR problem?" which is, in hindsight, the most scathing query I've ever received from them in the past ten years. So I throw in an unnecessary, excessively loud "oops" to ease tension.

Offline

#12 2016-08-16 00:11:40

Hexagon
Member
Joined: 2015-04-22
Posts: 1,213

Re: hypothetical ethical situation

I think it would be ethical because we can just program them not to feel pain.

Offline

#13 2016-08-16 00:20:44

Different55
Forum Admin
Joined: 2015-02-07
Posts: 16,572

Re: hypothetical ethical situation

hummerz5 wrote:

I'm more worried about whatever the heck you up and forgot.

Nothing huge or life changing. Just one of my surgeries is completely missing from memory. The really weird part is that I don't remember any events leading up to it, either. No preparation, no discussion about it beforehand, no memory of recovery afterwards, nothing. The whole thing has been burned out of my mind. If I didn't know better I'd say I was being gaslighted about the whole thing.
But the point is, as long as it doesn't affect me it doesn't really matter, does it? If you cut someone's left foot off, that's bad because of the lasting effects of it. They'd be mentally scarred and physically scarred. Their quality of life is affected forever after that. If there are no lasting effects, I don't think I can say for sure that it's unethical. Like if you're just doing it for fun to random people that's definitely screwed up even if you do rewind them afterwards. But assuming there was a good reason, and there's absolutely zero lasting effects at all I can't say that it's inherently wrong. Particularly if you get consent beforehand. I think the same applies to real people in the real world, but that's significantly harder to rewind than a simulation would be.

What if someone had proof that this kind of crap was going on here in the real world, no simulation, basically constantly, but it was all wiped away every time? You'd care, obviously. But it wouldn't affect how you live your life at all, would it? You might go on living paranoid about it 24/7, but it'd never happen, at least not from "your" perspective. You might be getting tortured every second of your life, but assuming that's all reversed completely, does it even count as "happening"?


"Sometimes failing a leap of faith is better than inching forward"
- ShinsukeIto

Offline

#14 2016-08-16 00:21:29, last edited by Swarth100 (2016-08-16 00:23:57)

Swarth100
Member
Joined: 2015-07-18
Posts: 305

Re: hypothetical ethical situation

So, let me address the two parts separately.

2) It's easiest to start from part 2. That assumption simply cannot be made: to simulate the Universe at the ATOMIC level you would need a computer at least as big as the universe itself (see the rough arithmetic sketch below). We could say the universe contains the simulation of itself, but that's another story.

1) To answer the first question, it depends on what you define as human. From a (biased) Catholic perspective I consider a human to be a being which has a soul. As we are not omniscient creators, we wouldn't be able to infuse an AI with a soul, and thus it would not be considered a human at all. Is it ethical to perform tests that may require torture on non-humans for the sake of preserving humanity itself? I would say yes.
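A rough arithmetic sketch of the point in part 2 above. It is not from the thread: the atom count is a commonly quoted order-of-magnitude estimate, and the per-atom state and storage figures are deliberately generous assumptions, so treat this as an illustration rather than a derivation.

    ATOMS_IN_UNIVERSE = 10 ** 80              # common rough estimate for the observable universe
    BITS_PER_ATOM_OF_STATE = 100              # assumed: bits needed to record one atom's state
    BITS_STORED_PER_HARDWARE_ATOM = 10 ** 6   # assumed, very generous storage density

    state_bits = ATOMS_IN_UNIVERSE * BITS_PER_ATOM_OF_STATE
    hardware_atoms = state_bits // BITS_STORED_PER_HARDWARE_ATOM

    print(f"state to track : ~10^{len(str(state_bits)) - 1} bits")      # ~10^82 bits
    print(f"hardware needed: ~10^{len(str(hardware_atoms)) - 1} atoms") # ~10^76 atoms

Even with these generous assumptions the simulator needs something like 10^76 atoms, a machine within a few orders of magnitude of the universe it models, and that machine would itself have to appear inside the simulation.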

Offline

#15 2016-08-16 00:22:34, last edited by Bobithan (2016-08-16 00:25:08)

Bobithan
Member
Joined: 2015-02-15
Posts: 4,433

Re: hypothetical ethical situation

Different55 wrote:

Since I don't remember, I don't care. If we wiped everything and completely rewound the simulation after you were done being a sadistic bastard I'm not sure they'd care either.

I guess a good question to ask around this point would be: would you, or really anybody, want to go through an extremely painful and traumatic event even if you forgot about it entirely afterwards and your life continued just as normal? I personally wouldn't, even for a large sum of money, but I guess that's leading to another discussion altogether.

hummerz5 wrote:

The distinction becomes whether or not this "Simulation" gives consciousness. It's one thing for a computer program to know "well this person feels anguish" as opposed to somehow creating a conscious being. I guess I'm leaning ethical now, since we probably can't actually give "consciousness" but what do I know

Yeah, this is kind of what I'm getting at. Like, even if it's simulated completely perfectly in every conceivable way, it's still just transistors creating a representation of a mind or brain, so is it really conscious? But on the other hand, our minds are really just chemical reactions between our neurons, so are we really conscious? Well, yes, we know that for an absolute fact, even if it's just for ourselves. So what makes it different? I would argue they come to the same result; it's just a different way of establishing a consciousness, since there's not really a distinguishable difference unless there truly is a soul or something similar, which I don't particularly believe in.

So my personal conclusion is it would be an unethical experiment

Hexagon wrote:

I think it would be ethical because we can just program them not to feel pain.

That kind of goes against the point of the experiment as we're trying to get an accurate idea of how a human might react or deal with hypothetical situations that we wouldn't want to try on people in the real world for obvious reasons. By removing any sense of pain or fear, the experiment doesn't really tell us much other than 'what if we were all really cool dudes who don't afraid of anything'.

Besides, the point of the argument is what are the ethics when it's essentially a perfect simulation of a human mind or brain.

EDIT:

Swarth100 wrote:

To simulate the Universe at the ATOMIC level you would need a computer at least as big as the universe itself. We could say the universe contains the simulation of itself, but that's another story

For the sake of argument, let's all consider that this is feasible and wouldn't require an insane amount of computing. I mean obviously it would, but what if it didn't? (at least for this situation)

Offline

#16 2016-08-16 00:30:34, last edited by Swarth100 (2016-08-16 00:37:21)

Swarth100
Member
Joined: 2015-07-18
Posts: 305

Re: hypothetical ethical situation

@Bobithan
In the above post you briefly touched on Free Will. As in, "Are human beings, which apparently act only as a consequence of chemical reactions that would always give the same outcome (if replicated), free?"
The answer is a really hard and complicated one, but in case you haven't already, I'd like you to reflect on the consequences:
If we have no Free Will, everything you do in your life is exactly what you should have done (were mechanically meant to do). This means that all the achievements you have reached you were meant to reach. Every single time you have failed, you were meant to fail. Everything you have done you were meant to do ... So what is the point of living? Of caring? Of loving?

EDIT:
Well, regarding your second query, I still believe that a human artificially assembled at the atomic level would not be the same as the real-life person it clones, as it lacks a Soul. It would be an animal and not a human, even though it apparently can feel the same emotions of pain, joy and fear. Would it be ethical to torture them for the sake of Humanity's wellbeing? I'd say yes. It is just worth noting that it would look and sound just as if we were torturing a real human being ...

Offline

#17 2016-08-16 01:05:21

Bobithan
Member
Joined: 2015-02-15
Posts: 4,433

Re: hypothetical ethical situation

@Swarth
Wow, you found me out: I don't believe in free will. That doesn't mean I don't think things have a purpose, though, as I'm still experiencing this timeline in real time and it sure does feel like I have a choice, even if that isn't necessarily true according to my beliefs. I realize that sounds contradictory, but the things you feel aren't always exactly true. Illusions happen with our senses all the time, giving us a false perception of reality. Much of life sees the world in a completely different way than we humans do. What we feel isn't an objective truth about what's around us, so it makes sense to me that free will is just as much of an illusion as everything else.

I find it interesting that you classify this simulated person as an animal. Would you say the level of existence of this simulated being is equivalent to that of a dog or a cat or a mouse or a lion? Or does it not "feel" at all? Because if you consider it an animal, that's a whole other can of worms that goes into the discussion of animal cruelty.

On the subject of souls, my idea of the universe in general is that there is no ultimate decider of what is a human and what isn't, we're just a consequence of the billions of years that came before us. The universe didn't prepare for us, we just are, and I really doubt that the universe (or God, at least my idea of a God) looks at us and says "that's definitely a human, so I'm definitely going to give it a soul". There aren't any qualifiers put in place along with things like the speed of light or gravitational constant that decide what is a human and what isn't.
Given that, I don't think there's a difference on a conscious level that separates a man in the real world and a replica of him in a simulation.

Offline

#18 2016-08-16 01:07:54

shadowda
Member
From: somewhere probably.
Joined: 2015-02-19
Posts: 1,015

Re: hypothetical ethical situation

depends. if this is a simulation, totally virtual. then yes. it would be ethical by my standards. despite these hypothetical AI being perfect replications of human existence, they are not real people. of course if these AI existed in replicate bodies as well, it would be slightly less ethical but still ok.

Plus i think it would be awesome: in one tab, the AI. in another tab you can see all its vital signs. even virtual brain activity.



Offline

#19 2016-08-16 01:14:36

Slabdrill
Formerly 12345678908642
From: canada
Joined: 2015-08-15
Posts: 3,402
Website

Re: hypothetical ethical situation

I'd say it's as ethical as torturing a real person. (Same with animals, actually...)


suddenly random sig change

Offline

#20 2016-08-16 01:59:52

Different55
Forum Admin
Joined: 2015-02-07
Posts: 16,572

Re: hypothetical ethical situation

shadowda wrote:

depends. if this is a simulation, totally virtual. then yes. it would be ethical by my standards. despite these hypothetical AI being perfect replications of human existence, they are not real people. of course if these AI existed in replicate bodies as well, it would be slightly less ethical but still ok.

Plus i think it would be awesome: in one tab, the AI. in another tab you can see all its vital signs. even virtual brain activity.

These are perfectly simulated people though. Their virtual reality perfectly mirrors how the real world works. We're not simulating people, we're simulating a universe identical to ours that has people in it that work the same way we do. Why do their feelings matter less than yours just because you have complete power over them? Would you consider it ethical if some being with absolute power over this universe started torturing you because it exists outside of your reality the same way you exist outside of your simulation's reality? Just because you can easily control them doesn't automatically mean they're not conscious people. And are you saying that you wouldn't even bother rewinding them afterwards? That you'd just destroy them and leave them miserable for the rest of their lives? That it's just okay to do whatever you want to them just because you're bigger than them? They still have real thoughts and feelings, just like you do. Just because those thoughts and feelings are contained in a place that you control doesn't mean you can ruin their lives.


"Sometimes failing a leap of faith is better than inching forward"
- ShinsukeIto

Offline

#21 2016-08-16 03:23:26, last edited by Hexagon (2016-08-16 03:24:28)

Hexagon
Member
Joined: 2015-04-22
Posts: 1,213

Re: hypothetical ethical situation

Say the AI could simulate reality 100% perfectly. It would have to simulate itself, since it is part of reality, in which case it would have to simulate itself within itself within itself, which would require infinite processing power since the recursion is infinitely deep. In this case, the AI would not be able to do that, and therefore could not have feelings because it could not accurately simulate what it would be like.
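A minimal sketch of that infinite-regress argument (hypothetical, not from the thread): a "perfect" simulation must contain a working copy of itself, so advancing it one step means advancing the nested copy first, and on any finite machine the recursion blows the stack before a single step ever completes.

    import sys

    class Universe:
        def __init__(self, depth=0):
            self.depth = depth

        def step(self):
            # To advance this universe one tick we must also advance the
            # perfect copy of it that lives inside its own simulator.
            inner = Universe(self.depth + 1)
            return inner.step()   # infinite regress: this call never returns

    sys.setrecursionlimit(10_000)
    try:
        Universe().step()
    except RecursionError as e:
        # A finite machine runs out of stack long before the regress ends,
        # which is the "infinite processing power" problem described above.
        print("gave up:", e)

Python raises RecursionError here; a machine without such a limit would simply run until it exhausted memory or time, which is the same point.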

Offline

#22 2016-08-16 03:49:01

hummerz5
Member
From: wait I'm not a secret mod huh
Joined: 2015-08-10
Posts: 5,852

Re: hypothetical ethical situation

Bobithan wrote:

But on the other hand, our minds are really just chemical reactions between our neurons, so are we really conscious?

damn that just about sums it up

I mean I'm sure some neurologist would say "nope, bad comparison" but in layman's terms that's a spot-on analysis.

Offline

#23 2016-08-16 17:00:43, last edited by shadowda (2016-08-16 17:11:44)

shadowda
Member
From: somewhere probably.
Joined: 2015-02-19
Posts: 1,015

Re: hypothetical ethical situation

Different55 wrote:
shadowda wrote:

depends. if this is a simulation, totally virtual. then yes. it would be ethical by my standards. despite these hypothetical AI being perfect replications of human existence, they are not real people. of course if these AI existed in replicate bodies as well, it would be slightly less ethical but still ok.

Plus i think it would be awesome: in one tab, the AI. in another tab you can see all its vital signs. even virtual brain activity.

These are perfectly simulated people though. Their virtual reality perfectly mirrors how the real world works. We're not simulating people, we're simulating a universe identical to ours that has people in it that work the same way we do. Why do their feelings matter less than yours just because you have complete power over them? Would you consider it ethical if some being with absolute power over this universe started torturing you because it exists outside of your reality the same way you exist outside of your simulation's reality? Just because you can easily control them doesn't automatically mean they're not conscious people. And are you saying that you wouldn't even bother rewinding them afterwards? That you'd just destroy them and leave them miserable for the rest of their lives? That it's just okay to do whatever you want to them just because you're bigger than them? They still have real thoughts and feelings, just like you do. Just because those thoughts and feelings are contained in a place that you control doesn't mean you can ruin their lives.

but so be it. their lives mean nothing in this world beyond a science experiment. if they were in this world, sure it would be unethical. but these are simulations. i could wipe them off the face of the earth with a click. it may seem unethical to do such a thing to a conscious being. but in the end, it does not matter, because in this question i know for a fact they are simulated. i know 100% they are mine. Heck, if i could, i would cause all sorts of problems to see what would happen. then simply change tabs to an identical world where i did nothing of the sort. the possibilities are endless.

if it were us, and some higher being were to do it to us, to him it would be ethical and to us it would suck, but that would be life. i would cry and complain. but those tears are meaningless; i have no control over this higher power. just look at Christianity. people worship this god as an ultimate good. a god that would torture you if it suited him. *cough* Job *cough*. a god that tortures you forever anyway if you don't think like him.


Offline

#24 2016-08-16 17:24:56, last edited by Yandax (2016-08-16 17:25:29)

Yandax
Member
From: Where ever I need to be.
Joined: 2015-02-21
Posts: 637

Re: hypothetical ethical situation

First of all, @Shadowda, God didn't torture Job; he allowed Satan to, to test Job's faith, and he also gave all his stuff back, like twofold.

Anywho, I don't think it'd be ethical if it imitated a human in every way, because it would pretty much be a human being. And why the heck would we torture it?

Ninja'd. I misspelled like everything.


Pretend I didn't exist until now

All hail me, the king of insensitive jerks

Woot if you hate me

Offline

#25 2016-08-16 18:17:31

Different55
Forum Admin
Joined: 2015-02-07
Posts: 16,572

Re: hypothetical ethical situation

shadowda wrote:
Different55 wrote:
shadowda wrote:

depends. if this is a simulation, totally virtual. then yes. it would be ethical by my standards. despite these hypothetical AI being perfect replications of human existence, they are not real people. of course if these AI existed in replicate bodies as well, it would be slightly less ethical but still ok.

Plus i think it would be awesome: in one tab, the AI. in another tab you can see all its vital signs. even virtual brain activity.

These are perfectly simulated people though. Their virtual reality perfectly mirrors how the real world works. We're not simulating people, we're simulating a universe identical to ours that has people in it that work the same way we do. Why do their feelings matter less than yours just because you have complete power over them? Would you consider it ethical if some being with absolute power over this universe started torturing you because it exists outside of your reality the same way you exist outside of your simulation's reality? Just because you can easily control them doesn't automatically mean they're not conscious people. And are you saying that you wouldn't even bother rewinding them afterwards? That you'd just destroy them and leave them miserable for the rest of their lives? That it's just okay to do whatever you want to them just because you're bigger than them? They still have real thoughts and feelings, just like you do. Just because those thoughts and feelings are contained in a place that you control doesn't mean you can ruin their lives.

but so be it. their lives mean nothing in this world beyond a science experiment. if they were in this world, sure it would be unethical. but these are simulations. i could wipe them off the face of the earth with a click. it may seem unethical to do such a thing to a conscious being. but in the end, it does not matter, because in this question i know for a fact they are simulated. i know 100% they are mine. Heck, if i could, i would cause all sorts of problems to see what would happen. then simply change tabs to an identical world where i did nothing of the sort. the possibilities are endless.

if it were us, and some higher being were to do it to us, to him it would be ethical and to us it would suck, but that would be life. i would cry and complain. but those tears are meaningless; i have no control over this higher power. just look at Christianity. people worship this god as an ultimate good. a god that would torture you if it suited him. *cough* Job *cough*. a god that tortures you forever anyway if you don't think like him.


Alright so what you're essentially saying boils down to "I'm bigger than you so I can do what I want"? You accept that they're conscious people but you're doing it anyway and saying "screw you, deal with it." What's the difference between that and me tying you down and torturing you in the real world? You can't react or fight back. I know 100% that you're mine and you're not going anywhere. Both you and the virtual people are conscious beings, so what makes you so special? Apparently it's not just the fact that the virtual people are virtual because apparently if some god felt like torturing you that'd be totally ethical. So if being virtual alone doesn't make it ethical it seems like you're saying "If I'm bigger and stronger, everything I do is ethical."


"Sometimes failing a leap of faith is better than inching forward"
- ShinsukeIto

Offline

Wooted by: (4)
