Sapience would be hard to measure, especially if it's programmed to imitate sapience.
-
Can you prove to me that you aren't imitating sapience?
-
Yes, a blood test showing I'm human would confirm that.
-
That proves nothing. It's simply evidence relying on a Theory of Mind. You have no way to objectively prove to another individual that you are sapient also.
-
[quote]That proves nothing.[/quote] Except it proves I'm human, and humans are sapient barring certain conditions such as mental handicaps or coma. Frankly I'm not even sure what your point is. My post said it would be difficult to determine sapience, and all you're doing is arguing in favor of that.
-
There's no objective proof for it though. Proving you're human proves nothing. Look up solipsism. My point is that as long as a computer programme seems to be sapient enough, it should be treated as such. Difficulty of proving sapience doesn't mean we should infringe on people's rights.
-
[quote]There's no objective proof for it though.[/quote] ...Yes there is. We have specific criteria for sapience. You could counter anything with solipsism, because it's senseless drivel. I can prove the effect of gravity but you could respond by saying we can't know if it's real.
-
False equivalence. You cannot possibly understand the qualia of another person. The ideas of consciousness, sapience and sentience are all inherently qualia. They cannot be absolutely proved. Your point was that it'd be difficult to determine whether the AI was really sapient, and I assume that you are using this as an argument not to grant AI rights wantonly. I disagree: anything that appears sapient enough should be treated as such.
-
Edited by HurtfulTurkey: 8/3/2013 6:50:53 PM
You throw around words like 'qualia' and then use the phrase "anything that appears sapient enough". It's mind-bogglingly contradictory, not to mention that those same indefinable things can in fact be measured through neurological examination of the brain's electrical signals. [quote]I assume that you are using this as an argument to not grant AI with rights wantonly.[/quote] You assume wrong; I wasn't using it as any argument. I was just saying that it would be difficult to measure sapience, particularly in something programmed to imitate expected responses. tl;dr you're responding to an argument I never made.
-
Am I real?
-
Humans = robots
-
How is that relevant to anything I said?
-
shut up ya bastard.
-
Love you <3
-
They have this test thingy that does that :o It's called a Turing test!
-
Yeah, but it's got quite a few problems. It would be entirely possible to program the computer to give correct answers to questions likely to be asked during the Turing test.
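The hard-coding described above can be sketched as a simple lookup table. This is only a toy illustration of the objection (the questions and answers are invented for the example): the program answers anticipated questions convincingly while understanding nothing, which is exactly why a scripted interrogation proves little.

```python
# A toy "canned response" chatbot: it looks convincing on anticipated
# questions but has no understanding at all. All entries are invented.
CANNED_ANSWERS = {
    "are you human?": "Of course I am. Why do you ask?",
    "what is your favourite colour?": "Blue, I suppose.",
    "how do you feel today?": "A little tired, honestly.",
}

def respond(question: str) -> str:
    # Normalise the question and look it up; fall back to a vague
    # deflection for anything the programmer didn't anticipate.
    key = question.strip().lower()
    return CANNED_ANSWERS.get(
        key, "That's an interesting question. What do you think?"
    )
```

Such a program would handle a fixed list of likely questions, but any novel or follow-up question exposes it, which is why proponents of the Turing test stress unrestricted, open-ended conversation.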