Posted over 5 years ago

Posted By: Capablanca

Posts: 212

Assume we reach the point in technology where our A.I. becomes self-aware and begins acting out of its own free will. It has the capacity to think on its own, create things, understand new concepts without human input, whatever. Would it be ethical to pull the plug on that A.I., essentially killing it, even though it's not, technically speaking, biologically living?

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both. -Søren Kierkegaard


Posted over 5 years ago

Posted By: -Allie-

Posts: 7276

Sure, I don't relish the idea of being a battery.

Potato

I Support Equal Rights  


Posted over 5 years ago

Posted By: Capablanca

Posts: 212

"-Allie-" wrote:

Sure, I don't relish the idea of being a battery.
I figured a lot of the responses would be something like this, and it's pretty reasonable, even though I disagree with it. I don't think it's ethical because it's still a conscious being that you'd be killing, even though it's not alive. The definition of life would have to be changed to accept self-aware machines into it, though.

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both. -Søren Kierkegaard


Posted over 5 years ago

Posted By: TruSpirit

Posts: 6500

Yes, absolutely necessary. Go play Portal 2, and you'll find the reason for my answer. Something that had human input and its own input would be able to take over a human. Especially if that AI was in control of a building of some sort.

Hey, I'm your average- HOLY REDSTONE THIS DRAWBRIDGE IS AWESOME. OK, so maybe not your average minecrafter, but... A minecrafter nonetheless. GIRR THIS SIGGY IS GIRR APROVED.


Posted over 5 years ago

Posted By: Capablanca

Posts: 212

"TruSpirit" wrote:

Yes, absolutely necessary. Go play Portal 2, and you'll find the reason for my answer. Something that had human input and its own input would be able to take over a human. Especially if that AI was in control of a building of some sort.
lol.

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both. -Søren Kierkegaard


Posted over 5 years ago

Posted By: -Allie-

Posts: 7276

"Capablanca" wrote:

"-Allie-" wrote:

Sure, I don't relish the idea of being a battery.
I figured a lot of the responses would be something like this, and it's pretty reasonable, even though I disagree with it. I don't think it's ethical because it's still a conscious being that you'd be killing, even though it's not alive. The definition of life would have to be changed to accept self-aware machines into it, though.
Conscious being or not, if a human kills there is the death penalty for them, so why not the same for an AI? (Although I don't agree with the death penalty.) But unless it was "born" to a mother, it is not considered a life.

Potato

I Support Equal Rights  


Posted over 5 years ago

Posted By: Capablanca

Posts: 212

"-Allie-" wrote:

"Capablanca" wrote:

"-Allie-" wrote:

Sure, I don't relish the idea of being a battery.
I figured a lot of the responses would be something like this, and it's pretty reasonable, even though I disagree with it. I don't think it's ethical because it's still a conscious being that you'd be killing, even though it's not alive. The definition of life would have to be changed to accept self-aware machines into it, though.
Conscious being or not, if a human kills there is the death penalty for them, so why not the same for an AI? (Although I don't agree with the death penalty.) But unless it was "born" to a mother, it is not considered a life.
I agree. I guess the main question of mine is even though it's fully self-aware, would it be ethical to kill something that isn't, technically, living?

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both. -Søren Kierkegaard


Posted over 5 years ago

Posted By: -Allie-

Posts: 7276

Humans already kill things that are considered alive for all sorts of reasons. It will be no stretch to do the same with something that isn't considered alive at all.

Potato

I Support Equal Rights  


Posted over 5 years ago

Posted By: Capablanca

Posts: 212

"-Allie-" wrote:

Humans already kill things that are considered alive for all sorts of reasons. It will be no stretch to do the same with something that isn't considered alive at all.
Of course, but I'm not really concerned with what horrific things humans are capable of. I only want to know whether it would be considered ethical to kill something that isn't living, but still thinks and has free will.

I see it all perfectly; there are two possible situations — one can either do this or that. My honest opinion and my friendly advice is this: do it or do not do it — you will regret both. -Søren Kierkegaard


Posted over 5 years ago

Posted By: -Allie-

Posts: 7276

"Capablanca" wrote:

"-Allie-" wrote:

Humans already kill things that are considered alive for all sorts of reasons. It will be no stretch to do the same with something that isn't considered alive at all.
Of course, but I'm not really concerned with what horrific things humans are capable of. I only want to know whether it would be considered ethical to kill something that isn't living, but still thinks and has free will.
I hear ya, but you're assuming humans even bother to find out these things. It's my contention that we shoot first and ask questions later.

Potato

I Support Equal Rights  

