Should A.I.s be treated as living beings?

 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
don't be pedantic, you obviously know what i mean
It's a point worth making, though.

You're implying we have control over whether or not a certain intelligence--regardless of neural/computational complexity--develops sentience.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
whether we have control is irrelevant to whether it's actually possible in the first place, though, which was my main point

you can argue that i lack a basis for my hypothesis, but i don't think there's a basis for my opposition, either
and the onus is on them to demonstrate to me the possibility

until then, i'll remain skeptical
Last Edit: June 21, 2015, 03:36:39 PM by Verbatim


Pendulate | Ascended Posting Frenzy
 
more |
XBL:
PSN:
Steam:
ID: Pendulate
IP: Logged

460 posts
 
If sentience in AI were to happen, it would almost certainly be by accident.
we can't give things by accident?

come on

don't be pedantic, you obviously know what i mean
No, I took you to mean that we'd intentionally endow a machine with sentience.


 
Sandtrap
| Mythic Sage
 
more |
XBL:
PSN:
Steam:
ID: Sandtrap
IP: Logged

11,704 posts
Rockets on my X
whether we have control is irrelevant to whether it's actually possible in the first place, though, which was my main point

You know, I think the closest you can come to that is an intelligence that's allowed to modify itself, which we already build rudimentary versions of: little processors that can change their own coding to better suit certain aspects of the things they know.

If you create a particularly powerful intelligence that has the capability to alter its own programming, then I don't think it would be too hard for it to eventually stumble into the realm of being considered a sentient entity.
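
Just to be concrete about the kind of "rudimentary" thing I mean, here's a toy Python sketch (purely hypothetical, not modelled on any real system): a program whose only rewritable piece of "programming" is a single decision rule, which it nudges whenever that rule leads it astray.

Code:
# Toy sketch only: a program that adjusts its own decision rule from feedback.
# It's a stand-in for the "rudimentary self-modifying" systems described above,
# not an example of any real product or library.
import random

class AdaptiveResponder:
    def __init__(self):
        # The one piece of its own "programming" it is allowed to rewrite.
        self.threshold = 0.5

    def respond(self, signal: float) -> str:
        return "engage" if signal > self.threshold else "ignore"

    def learn(self, signal: float, outcome_was_good: bool) -> None:
        # Only learn from mistakes: shift the rule so the same signal
        # gets handled differently next time.
        if outcome_was_good:
            return
        step = 0.05
        if signal > self.threshold:
            # Engaged when it shouldn't have: raise its bar.
            self.threshold = min(1.0, self.threshold + step)
        else:
            # Ignored something it should have engaged with: lower its bar.
            self.threshold = max(0.0, self.threshold - step)

if __name__ == "__main__":
    bot = AdaptiveResponder()
    for _ in range(300):
        s = random.random()
        action = bot.respond(s)
        # Pretend the environment only rewards engaging with strong signals (> 0.7).
        bot.learn(s, outcome_was_good=((action == "engage") == (s > 0.7)))
    print(f"learned threshold: {bot.threshold:.2f}")  # typically settles near 0.7

Thermostat-level stuff, obviously, but the point is that the rule it ends up with isn't the rule it started with.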


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
You know, I think the closest you can come to that is an intelligence that's allowed to modify itself, which we already build rudimentary versions of: little processors that can change their own coding to better suit certain aspects of the things they know.

If you create a particularly powerful intelligence that has the capability to alter its own programming, then I don't think it would be too hard for it to eventually stumble into the realm of being considered a sentient entity.
i mean, i would just call that agency, not sentience

for me, the defining factors of sentience involve a lot more than just agency

can it think? can it feel?
can it communicate its ideas?
can it get offended? can it get hurt?
can it suffer?

does it have interests/disinterests?
does it have a personality?

that's what makes sentience to me, and if you disagree with that, then i'm not talking about the same thing as you
Last Edit: June 21, 2015, 03:48:45 PM by Verbatim


Pendulate | Ascended Posting Frenzy
 
more |
XBL:
PSN:
Steam:
ID: Pendulate
IP: Logged

460 posts
 
If you accept that our brains are nothing more than complex networks of information processing, then you pretty much have to admit that sentience could emerge from other complex networks as well.


 
TB
| Hero of the Wild
 
more |
XBL:
PSN:
Steam:
ID: TBlocks
IP: Logged

17,216 posts
#13
Perhaps, although it depends on the situation.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
If you accept that our brains are nothing more than complex networks of information processing
i don't


 
challengerX
| custom title
 
more |
XBL:
PSN:
Steam:
ID: challengerX
IP: Logged

41,949 posts
I DONT GIVE A SINGLE -blam!- MOTHER -blam!-ER ITS A MOTHER -blam!-ING FORUM, OH WOW, YOU HAVE THE WORD NINJA BELOW YOUR NAME, HOW MOTHER -blam!-ING COOL, NOT, YOUR ARE NOTHING TO ME BUT A BRAINWASHED PIECE OF SHIT BLOGGER, PEOPLE ONLY LIKE YOU BECAUSE YOU HAVE NINJA BELOW YOUR NAME, SO PLEASE PUNCH YOURAELF IN THE FACE AND STAB YOUR EYE BECAUSE YOU ARE NOTHING BUT A PIECE OF SHIT OF SOCIETY
This user has been blacklisted from posting on the forums. Until the blacklist is lifted, all posts made by this user have been hidden and require a Sep7agon® SecondClass Premium Membership to view.


 
Naru
| The Tide Caller
 
more |
XBL: Naru No Baka
PSN:
Steam: The Tide Caller
ID: GasaiYuno
IP: Logged

18,501 posts
The Rage....
God no. Let's not have sentient AI be a thing, please.


Pendulate | Ascended Posting Frenzy
 
more |
XBL:
PSN:
Steam:
ID: Pendulate
IP: Logged

460 posts
 
i mean, i would just call that agency, not sentience
You don't think agency is a vital component of sentience?

If you think an AI needs to think and feel in the same ways as we do in order to be deemed sentient, then no, we probably won't ever have sentient AI.

If it just needs to be able to think for itself, and move beyond its man-made programming, then it's well within the realm of possibility.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
i mean, i would just call that agency, not sentience
You don't think agency is a vital component of sentience?
I never said it wasn't a "vital component"--I'm saying that's all it is: a component. It's not the only thing that defines sentience.

Quote
If you think an AI needs to think and feel in the same ways as we do in order to be deemed sentient, then no, we probably won't ever have sentient AI.
well, there you go


 
Sandtrap
| Mythic Sage
 
more |
XBL:
PSN:
Steam:
ID: Sandtrap
IP: Logged

11,704 posts
Rockets on my X
You know, I think the closest you can come to that is an intelligence that's allowed to modify itself, which we already build rudimentary versions of: little processors that can change their own coding to better suit certain aspects of the things they know.

If you create a particularly powerful intelligence that has the capability to alter its own programming, then I don't think it would be too hard for it to eventually stumble into the realm of being considered a sentient entity.
i mean, i would just call that agency, not sentience

for me, the defining factors of sentience involve a lot more than just agency

can it think? can it feel?
can it communicate its ideas?
can it get offended? can it get hurt?
can it suffer?

does it have interests/disinterests?
does it have a personality?

that's what makes sentience to me, and if you disagree with that, then i'm not talking about the same thing as you

I certainly don't disagree with that.

And there's no real reason why an intelligence that was able to modify itself to adapt to new encounters couldn't reach that. Let's say you build an intelligence whose core function revolves around public interactions with people.

That's a lot of raw data and a lot of conflicting encounters, because people and their personalities are all varied. Eventually, the intelligence could learn how to filter through all of that data, and a distinct, unique personality might emerge.

Now, that's not going so far as to say it would be sentient. But then again, who is to say that, with exposure to people and their emotions, it couldn't adopt the use or simulation of those emotions to better interact with people?

Knowing that a smile isn't hostile, and thus choosing to smile at a certain time for the benefit of the person it is interacting with?

You might say that would be parroting.

But kids parrot their parents as well. And some people do the things that they do without really knowing why, without ever questioning it, having inherited their attitudes and actions from exposure to their parents.

I certainly think that an intelligence that was allowed to modify itself could easily fall into the realm of sentience.
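
And just as a purely hypothetical illustration of what I mean by a personality emerging from exposure (nothing here models real emotion; it's only a toy, and the situations, reactions, and the simulated_person stand-in are all invented), here's a Python sketch of an agent that starts with no fixed social behaviour, tallies how people react when it smiles in different situations, and ends up with acquired tendencies of its own:

Code:
# Toy sketch only: "personality" as acquired tendencies, not built-in rules.
import random
from collections import defaultdict

class SocialAgent:
    def __init__(self):
        # Per-situation tally: [times a smile went over well, times it smiled].
        self.experience = defaultdict(lambda: [0, 0])

    def should_smile(self, situation: str) -> bool:
        went_well, tries = self.experience[situation]
        if tries < 5:
            return random.random() < 0.5   # still experimenting
        return went_well / tries > 0.5     # acquired tendency

    def observe_reaction(self, situation: str, smiled: bool, positive: bool) -> None:
        if smiled:
            self.experience[situation][1] += 1
            if positive:
                self.experience[situation][0] += 1

def simulated_person(situation: str, smiled: bool) -> bool:
    # Stand-in for the varied people it meets: smiles go over well at
    # greetings and badly at funerals.
    return smiled and situation == "greeting"

if __name__ == "__main__":
    agent = SocialAgent()
    for _ in range(500):
        situation = random.choice(["greeting", "funeral"])
        smiled = agent.should_smile(situation)
        agent.observe_reaction(situation, smiled, simulated_person(situation, smiled))
    for situation in ("greeting", "funeral"):
        print(situation, "->", "smiles" if agent.should_smile(situation) else "stays neutral")

You could call what it ends up doing parroting, and you'd be right; that's exactly the comparison with kids I'm making above.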


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
If you think an AI needs to think and feel in the same ways as we do in order to be deemed sentient, then no, we probably won't ever have sentient AI.
well, there you go
That's a poor definition of sentience, though.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
That's a poor definition of sentience, though.
it's actually the best definition ever

present me with a better one


 
Sandtrap
| Mythic Sage
 
more |
XBL:
PSN:
Steam:
ID: Sandtrap
IP: Logged

11,704 posts
Rockets on my X
That's a poor definition of sentience, though.
it's actually the best definition ever

present me with a better one

We can't actually define what true sentience is, because of our perspective. We can't experience any other form of experience; we're locked into our own perspective of things, which is a physical barrier.

So, because our perspective is the only one we experience, the manner in which we experience it seems, to us, like the only form of sentience that exists.

If you want a good example of that, albeit a far-out one, take a look at Ender's Game. The alien race, the Formics, attacked humanity because they didn't realize humans were sapient, given how far apart the two species' methods of interaction and intelligence were.

There could certainly be other forms of intelligence and sapience out there that are unrecognizable to us.
Last Edit: June 21, 2015, 04:15:26 PM by Sandtrap


Pendulate | Ascended Posting Frenzy
 
more |
XBL:
PSN:
Steam:
ID: Pendulate
IP: Logged

460 posts
 
I never said it wasn't a "vital component"--I'm saying that all it is is a component. It's not the only thing that defines sentience.
Well, I'd say it's enough of a factor to define it for the purpose of this discussion.

Unless you want to coin a new term for computer sentience (intelience? Swidt), then I don't think this is a misuse.

I'd also take this to mean that you'd support sentient AI, in this case? Assuming they can't suffer in ways that we can, and they can perform complex tasks in short amounts of time?
Last Edit: June 21, 2015, 04:13:41 PM by Pendulate


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
Unless you want to coin a new term for computer sentience (intelience? Swidt), then I don't think this is misusing it.
i think "artificial intelligence" is good enough
Quote
I'd also take this to mean that you'd support sentient AI in this case? Assuming they can't suffer in ways that we can, and they can perform complex tasks in short amounts of time?
of course

to call that sentience, though, i feel undermines the definition of the word

under this definition of sentience, you would have to concede that insects are sentient
you realize


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
present me with a better one
Well we could just use the actual definition, which is the ability to experience subjectively.

Which doesn't at all require any similarity to human experience besides some aspect of subjectivity and, presumably, self-awareness.


Turkey | Mythic Inconceivable!
 
more |
XBL: Viva Redemption
PSN: HurtfulTurkey
Steam: HurtfulTurkey
ID: HurtfulTurkey
IP: Logged

8,077 posts
 
I would say developing a true artificial intelligence would be the single most important discovery in human history; all others would be meaningless by comparison, except in their contribution to AI. It would mean we understand how our minds work and what actually leads to sapience. Beyond a mere function of processing, we'd be able to turn the existential into the scientific. Questions of existence and origin would leave the realm of pseudoscience and become experimental.

Artificial intelligence isn't reached by building the most efficient computer; it's reached by unraveling the mystery of consciousness and reason. A true AI would mean we've finally prioritized knowledge over utility, and that would be a turning point for our species.
Last Edit: June 21, 2015, 04:16:44 PM by HurtfulTurkey


 
Sandtrap
| Mythic Sage
 
more |
XBL:
PSN:
Steam:
ID: Sandtrap
IP: Logged

11,704 posts
Rockets on my X
I would say developing a true artificial intelligence would be the single most important discovery in human history; all others would be meaningless by comparison, except in their contribution to AI. It would mean we understand how our minds work and what actually leads to sapience. Beyond a mere function of processing, we'd be able to turn the existential into the scientific. Questions of existence and origin would leave the realm of pseudoscience and become experimental.

Artificial intelligence isn't reached by building the most efficient computer; it's reached by unraveling the mystery of consciousness and reason. A true AI would mean we've finally prioritized knowledge over utility, and that would be a turning point for our species.

Thank you.

Improving the species is, and should always be, the goal.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
present me with a better one
Well we could just use the actual definition, which is the ability to experience subjectively.

Which doesn't at all require any similarity to human experience besides some aspect of subjectivity and, presumably, self-awareness.
if you're going to copy the wikipedia definition of sentience, you should probably not leave out the two words "feel" and "perceive"
because that would contradict you, and help my definition instead

no one's strictly talking about human experience


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
because that would contradict you, and help my definition instead
No, it wouldn't.

Quote
no one's strictly talking about human experience
That seemed to be what Pendulate was implying when you used his consideration to demonstrate your point.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
No, it wouldn't.
it's my exact definition of sentience, though
the very first thing it mentions is that sentience is the ability to feel
Quote
That seemed to be what Pendulate was implying when you used his consideration to demonstrate your point.
then that's pendulate's fault for not wording his post properly--i'm of course referring to all sentient life

i mean, as i usually do in these types of discussions <_<


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
it's my exact definition of sentience, though
And you're yet to demonstrate why artificial intelligences would be excluded from the ability to perceive, feel or be self-aware.

Quote
i'm of course referring to all sentient life
See above.


Pendulate | Ascended Posting Frenzy
 
more |
XBL:
PSN:
Steam:
ID: Pendulate
IP: Logged

460 posts
 
Well, if we're not using human experience as the yardstick, the concept of sentience breaks down pretty quickly for anything beyond the mere capacity to experience.

We have very little evidence as to how other animals feel and perceive things, though it's quite obvious that these differ depending on the species.

All we concretely have to go by is the fact that they can experience.
Last Edit: June 21, 2015, 04:31:21 PM by Pendulate


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
We have very little evidence as to how other animals feel and perceive things
I'm not entirely sure this is correct.


 
Verbatim
| Komm, süßer Tod
 
more |
XBL:
PSN: Verbatim-1
Steam: Jaco230
ID: Verbatim
IP: Logged

48,034 posts
And you're yet to demonstrate why artificial intelligences would be excluded from the ability to perceive, feel or be self-aware.
it's not my job to

that which can be presented w/o evidence can be dismissed without evidence

the onus is on the person who's making the positive assertion--and i'm making no positive assertions
i know you don't agree with that logic, because it's inconvenient, but


 
More Than Mortal
| d-d-d-DANK ✡ 🔥🔥🔥 🌈
 
more |
XBL:
PSN:
Steam: MetaCognition
ID: Meta Cognition
IP: Logged

15,062 posts
This is the way the world ends. Not with a bang but a whimper.
And you're yet to demonstrate why artificial intelligences would be excluded from the ability to perceive, feel or be self-aware.
it's not my job to

that which can be presented w/o evidence can be dismissed without evidence

the onus is on the person who's making the positive assertion--and i'm making no positive assertions
i know you don't agree with that logic, because it's inconvenient, but
Except you are.

You've quite clearly claimed that sentience is not the result of complex neural/computational networks, which is currently our best hypothesis for the existence of sentience/consciousness.

You've made a positive claim which leads to the distinct conclusion that machines are excluded from such sentience, and I'm waiting for you to justify it.


 
Sandtrap
| Mythic Sage
 
more |
XBL:
PSN:
Steam:
ID: Sandtrap
IP: Logged

11,704 posts
Rockets on my X
We have very little evidence as to how other animals feel and perceive things
I'm not entirely sure this is correct.

We have data on critters that are more closely linked to us in terms of intelligence. Like whales and elephants. But the line starts to break down when you consider more varied types of lifeforms.

Like insects. Or fish.

We have absolutely no way to tell what an individual ant perceives or how it perceives things. We can only make guesses based on how they act and, maybe, on examination of their bodily functions.

But that's why sentience is a bitch to define: we're stuck with our own perception of things. And because of that, our point of view seems like the only one that can define it, when it might not be.