Except you are. You've quite clearly claimed that sentience is not the result of complex neural/computational networks
where
Quote: "If you accept that our brains are nothing more than complex networks of information processing"
i don't
i said we can't imitate it
and besides, that's still a negative statement either way
Quote from: Verbatim on June 21, 2015, 04:45:16 PM
"where"
Here:
Quote: "If you accept that our brains are nothing more than complex networks of information processing"
"i don't"
As Pendulate and I have already said, imitation has nothing to do with it.
Okay, computers have the capacity to be not not sentient. Am I off the hook because I worded it negatively?
I mean, if I were to say "our best hypotheses in economics support the benefits of free trade" and you were to claim that isn't the case; yeah, you're making a negative statement, but you're not off the hook for justifying why you claim it.
i wouldn't be able to show you how our sentience came to be, but it involved over four billion years of evolution
making it so an AI can feel would be an imitation of our sentience
saying "not not" is a double negative, which would make it a positive
you can't just play word games with logic
but that's not gonna convince you of anything, is it
Quote from: Pendulate on June 21, 2015, 04:30:37 PM
"We have very little evidence as to how other animals feel and perceive things"
I'm not entirely sure this is correct.
We have very little evidence as to how other animals feel and perceive things
ad infinitum
there would never be a point where you couldn't logically ask the "what if" at the end of the experiment
I doubt we will ever have the capacity to grant something the ability to feel anyway; if it arises, it will arise spontaneously.
And neither can you. Saying "sentience is not the result of complex neural/computational networks" is still a claim that needs justifying.
Quote from: Verbatim on June 21, 2015, 05:01:23 PM
"but that's not gonna convince you of anything, is it"
No, because we've already established numerous times that my claim has nothing to do with humans deliberately programming a machine to be sentient.
why must i justify something that you already believe
just because sentience is the result of our specific neural network
that just seems a little bit too silly
Because humans being unable to deliberately programme a machine with sentience =/= a machine being able to be sentient. Pendulate and I have stated several times that if sentience arises in machines, it will be spontaneous and not the result of some on-the-spot programming, yet you still appear to have an issue with this proposition.
like, i can't definitively answer "yes" or "no" to that question without logically contradicting the very nature of spontaneity, making it a pointless question.
BIOLOGY: (of movement or activity in an organism) instinctive or involuntary.
but i have no basis for it whatsoever
What I mean is that the initial development of sentience in the machine will be an involuntary process. Let's do away with the word spontaneous and phrase it like this: upon reaching a certain degree of neural complexity, the machine will develop sentience as a result. However, the establishment of this complexity will not be as a result of deliberate, on-the-spot human programming.
You mean other than the statistical improbability of being hit by a meteor.
i think our brain chemistry is too complex to be replicated
Quote from: Meta Cognition on June 21, 2015, 05:24:25 PM
"You mean other than the statistical improbability of being hit by a meteor."
and that's really not much of a basis anyway
If it was just a passing "Oh, I don't really think it'll happen"
what's odd is that you're only taking issue with it now on this particular subject
They should be treated equally, but given how we treat the next closest thing to a sentient species on this planet, I'm not holding out much hope for meatbags to be respectful to AI.
They shouldn't be slaves, but no doubt they will be.