AI technology has developed a lot over the years, and we're at a point where systems can think on their own (although in an extremely limited way).
even though it is a man-made creation?
Never--I mean, we've had this discussion before, but there would be no purpose whatsoever in even giving AI sentience, beyond "because we can" (which is, of course, the dumbest possible reason to do anything). Ignoring for a moment the fact that it would be unethical as hell.
but there would be no purpose whatsoever in even giving AI sentience
Oh yes, and this. By this logic, all children are man- or woman-made creations, since they were technically engineered biologically.
Quote from: Verbatim on June 21, 2015, 01:51:06 PM
but there would be no purpose whatsoever in even giving AI sentience

You mean apart from creating the single most intelligent agent we're likely to ever witness?
Ignoring for a moment the fact that it would be unethical as hell.
we already have information--why do we need to give information a brain
Quote from: Verbatim on June 21, 2015, 01:51:06 PM
Ignoring for a moment the fact that it would be unethical as hell.

Is this along the same lines of how you think it's unethical to reproduce? (Not that I disagree)
Quote from: Verbatim on June 21, 2015, 02:01:36 PM
we already have information--why do we need to give information a brain

So it can be processed, of course. Humanity's problems stem from a deficit in collating and processing information.
Quote from: eggsalad on June 21, 2015, 02:09:37 PM
Quote from: Verbatim on June 21, 2015, 01:51:06 PM
Ignoring for a moment the fact that it would be unethical as hell.

Is this along the same lines of how you think it's unethical to reproduce? (Not that I disagree)

yeah

in my opinion, AI should only be used for protocol

they should efficiently perform a given set of tasks

without question, without fear of pain, and without loss of energy

sentience adds nothing but nonsense

emotions and other tripe that would just get in the way
Do you suppose there are ways we could make them that eliminates chances of them suffering? Are inner turmoil and conflicts necessary to a conscious mind?
well, you can't make a ball bounce higher than it falls
Quote from: Verbatim on June 21, 2015, 02:16:20 PM
well, you can't make a ball bounce higher than it falls

Of course you can, with self-recursive improvement.
Quote from: Meta Cognition on June 21, 2015, 02:54:38 PM
Quote from: Verbatim on June 21, 2015, 02:16:20 PM
well, you can't make a ball bounce higher than it falls

Of course you can, with self-recursive improvement.

this also requires no sentience
It's not a case of "giving" sentience to an AGI; presumably, sentience arrives once some threshold of computational complexity is reached.
i just don't see that happening, ever
It has happened numerous times. Unless you think that the sentience of humans (and a hell of a lot of animal species) arises from something other than neural complexity?
Quote from: Meta Cognition on June 21, 2015, 03:16:15 PM
It has happened numerous times. Unless you think that the sentience of humans (and a hell of a lot of animal species) arises from something other than neural complexity?

nothing we could ever feasibly imitate
There's no basis for that assertion, though.
If sentience in AI were to happen, it would almost certainly be by accident.