Quote from: Pendulate on June 21, 2015, 03:18:38 PM
If sentience in AI were to happen, it would almost certainly be by accident.

we can't give things by accident?
come on
don't be pedantic, you obviously know what i mean
whether we have control is irrelevant to whether it's actually possible in the first place, though, which was my main point
You know, I think the closest you can come to is an intelligence that's allowed to modify itself. Which we already build rudimentary versions of: little processors that can change their own coding to better suit certain aspects of things that they know.

If you create a particularly powerful intelligence that has the capability to alter its own programming, then I don't think it would be too hard to eventually, essentially, stumble into the realm of being considered a sentient entity.
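The "change their own coding" idea above can be caricatured in a few lines. This is a purely illustrative toy, not any real system: all names are invented, and the "self-modification" is just a program rewriting one of its own parameters in response to feedback.

```python
# Toy sketch of a self-modifying agent: its behavior is stored in a
# parameter it is free to rewrite based on feedback from its
# environment. Real self-tuning systems are far more elaborate.

TARGET = 42  # the environment's hidden "right answer"

class SelfTuningAgent:
    def __init__(self):
        self.guess = 0  # the piece of its own "coding" it may alter

    def act(self):
        return self.guess

    def modify_self(self, error):
        # Nudge its own parameter in the direction of lower error.
        self.guess += 1 if error > 0 else -1

agent = SelfTuningAgent()
for _ in range(100):
    error = TARGET - agent.act()
    if error == 0:
        break
    agent.modify_self(error)

print(agent.act())  # -> 42
```

Whether iterating this kind of loop at scale ever amounts to sentience rather than mere self-adjustment is exactly what the thread goes on to argue about.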
If you accept that our brains are nothing more than complex networks of information processing, then there's no principled reason a sufficiently complex machine couldn't do the same.
i mean, i would just call that agency, not sentience
Quote from: Verbatim on June 21, 2015, 03:45:56 PM
i mean, i would just call that agency, not sentience

You don't think agency is a vital component of sentience?
If you think an AI needs to think and feel in the same ways as we do in order to be deemed sentient, then no, we probably won't ever have sentient AI.
Quote from: Sandtrap on June 21, 2015, 03:38:39 PM
You know, I think the closest you can come to is an intelligence that's allowed to modify itself. Which we already build rudimentary versions of: little processors that can change their own coding to better suit certain aspects of things that they know.

If you create a particularly powerful intelligence that has the capability to alter its own programming, then I don't think it would be too hard to eventually, essentially, stumble into the realm of being considered a sentient entity.

i mean, i would just call that agency, not sentience

for me, the defining factors of sentience involve a lot more than just agency
can it think? can it feel?
can it communicate its ideas?
can it get offended? can it get hurt?
can it suffer?
does it have interests/disinterests?
does it have a personality?

that's what makes sentience to me, and if you disagree with that, then i'm not talking about the same thing as you
That's a poor definition of sentience, though.
Quote from: Meta Cognition on June 21, 2015, 04:05:34 PM
That's a poor definition of sentience, though.

it's actually the best definition ever
present me with a better one
I never said it wasn't a "vital component"--I'm saying that's all it is: a component. It's not the only thing that defines sentience.
Unless you want to coin a new term for computer sentience (intelience? Swidt), I don't think this is misusing it.
I'd also take this to mean that you'd support sentient AI in this case? Assuming they can't suffer in ways that we can, and they can perform complex tasks in short amounts of time?
I would say developing a true artificial intelligence would be the single most important discovery in human history; all others would be meaningless by comparison except in their contribution to AI. It would mean we understand how our minds work and what actually leads to sapience. Beyond mere information processing, we'd be able to turn the existential into the scientific: questions of existence and origin would leave the realm of pseudoscience and become experimental. Artificial intelligence isn't achieved by building the most efficient computer, but by unraveling the mystery of consciousness and reason. A true AI would mean we've finally prioritized knowledge over utility, and that would be a turning point for our species.
Quote from: Verbatim on June 21, 2015, 04:07:26 PM
present me with a better one

Well we could just use the actual definition, which is the ability to experience subjectively. Which doesn't at all require any similarity to human experience besides some aspect of subjectivity and, presumably, self-awareness.
because that would contradict you, and help my definition instead
no one's strictly talking about human experience
No, it wouldn't.
That seemed to be what Pendulate was implying when you used his consideration to demonstrate your point.
it's my exact definition of sentience, though
i'm of course referring to all sentient life
We have very little evidence as to how other animals feel and perceive things
And you've yet to demonstrate why artificial intelligences would be excluded from the ability to perceive, feel, or be self-aware.
Quote from: Meta Cognition on June 21, 2015, 04:28:31 PM
And you're yet to demonstrate why artificial intelligences would be excluded from the ability to perceive, feel or be self-aware.

it's not my job to
that which can be presented w/o evidence can be dismissed without evidence
the onus is on the person who's making the positive assertion--and i'm making no positive assertions
i know you don't agree with that logic, because it's inconvenient, but
Quote from: Pendulate on June 21, 2015, 04:30:37 PM
We have very little evidence as to how other animals feel and perceive things

I'm not entirely sure this is correct.