Should A.I.s be treated as living beings?

 
Verbatim:
> Except you are.
> You've quite clearly claimed that sentience is not the result of complex neural/computational networks
where?

i said we can't imitate it--and besides, that's still a negative statement either way


 
Verbatim:
to say that our brain is a "complex network" would be a COLOSSAL understatement

someone should demonstrate to me how we could perfectly replicate it


 
More Than Mortal:
> where?
Here:
> > If you accept that our brains are nothing more than complex networks of information processing
> i don't

> i said we can't imitate it
As Pendulate and I have already said, imitation has nothing to do with it.

> and besides, that's still a negative statement either way
Okay, computers have the capacity to be not not sentient. Am I off the hook because I worded it negatively? The fact that you're making a "negative" proposition doesn't make the proposition void of content, nor does it stop you from being able to support it. The idea that "you can't prove a negative" is just folk logic.

I mean, if I were to say "our best hypotheses in economics support the benefits of free trade" and you were to claim that isn't the case, then yeah, you're making a negative statement, but you're not off the hook for justifying why you claim it.


 
Verbatim:
> > where?
> Here:
> > > If you accept that our brains are nothing more than complex networks of information processing
> > i don't
okay, and?

without our sentience, our brains ARE nothing more than complex networks of information
i wouldn't be able to show you how our sentience came to be, but it involved evolution for over four billion years

> As Pendulate and I have already said, imitation has nothing to do with it.
making it so an AI can feel would be an imitation of our sentience

> Okay, computers have the capacity to be not not sentient. Am I off the hook because I worded it negatively?
you didn't word it negatively, though

saying "not not" is a double negative, which would make it a positive
you can't just play word games with logic


 
Verbatim:
> I mean, if I were to say "our best hypotheses in economics support the benefits of free trade" and you were to claim that isn't the case, then yeah, you're making a negative statement, but you're not off the hook for justifying why you claim it.
i can justify it, sure, but if you're asking me to prove that we can't?
that's not gonna be logically possible

i can tell you that the reason why i don't think we'll ever make a sentient AI is because I think sentience is just too complex of a phenomenon to ever be perfectly imitated, but that's not gonna convince you of anything, is it

i'm just not a fan of justifications if they're going to waste people's time


 
Verbatim:
the reason why you can't prove a negative is pretty simple, too, because you could just keep asking "what if" questions

all right, meta, i've done everything i could, but i can't seem to make this robot feel--i don't think it's possible
"what if you did it this way?"
okay, i'll try that--didn't work
"what if you did it this way?"
okay, i'll try that--still didn't work
"what if you did it this way?"

ad infinitum
there would never be a point where you couldn't logically ask the "what if" at the end of the experiment


 
More Than Mortal:
> i wouldn't be able to show you how our sentience came to be, but it involved evolution for over four billion years
And there's absolutely no justification for assuming the same must be true of AGIs, especially considering the initial jumping-off point of our own technological development, which the AGI will have access to from the beginning, and the potentiality of recursive self-improvement.

> making it so an AI can feel would be an imitation of our sentience
I doubt we will ever have the capacity to grant something the ability to feel anyway; if it arises, it will arise spontaneously.

> saying "not not" is a double negative, which would make it a positive
> you can't just play word games with logic
And neither can you. Saying "sentience is not the result of complex neural/computational networks" is logically equivalent to the positive claim that "sentience is composed of something other than just complex neural/computational networks".

You still have an onus on that claim.


 
More Than Mortal:
> but that's not gonna convince you of anything, is it
No, because we've already established numerous times that my claim has nothing to do with humans deliberately programming a machine to be sentient.


Pendulate:
 
> > We have very little evidence as to how other animals feel and perceive things
> I'm not entirely sure this is correct.
As a matter of subjectivity, I don't think we do. I mean, obviously we have reason to believe that animals with similar nervous systems to ours can feel in similar ways, but they are a small percentage of all animals, and I don't think we can ever truly know what it's like to be them.

Verb's comment that sentience requires the capacity to suffer is incredibly broad-brushed, considering he wants to say that animals have it but computers won't.


 
More Than Mortal:
> ad infinitum
> there would never be a point where you couldn't logically ask the "what if" at the end of the experiment
Not quite; I should qualify my statements by saying "prove" =/= 100pc certainty. But Bayesian inference tells us that, at some point, an absence of evidence is equal to evidence of absence.

Like claiming there's an invisible unicorn in your shed. I could set up infra-red cameras and demonstrate a lack of evidence which is equal to evidence of absence, and the only way you could escape it is by defining the unicorn into something non-empirical and therefore meaningless anyway.
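
To put hypothetical numbers on that updating process (the prior and the camera miss rate below are made up purely for illustration):

```python
# Toy Bayesian update: each infra-red check that finds nothing shrinks
# the probability that the invisible unicorn is really in the shed.
prior = 0.5     # made-up prior: 50pc belief the unicorn exists
p_miss = 0.2    # made-up chance a real unicorn evades a single check

p = prior
for check in range(1, 11):
    # P(no evidence) = P(no evidence | unicorn)*P(unicorn) + 1.0*P(no unicorn)
    p_no_evidence = p_miss * p + (1 - p)
    # Bayes' theorem: P(unicorn | no evidence on this check)
    p = (p_miss * p) / p_no_evidence
    print(f"after check {check}: P(unicorn) = {p:.7f}")

# After ten empty checks, P(unicorn) has fallen from 0.5 to roughly 1e-7:
# the accumulated absence of evidence has become strong evidence of absence.
```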

But that has no bearing on this conversation.


 
Verbatim:
> I doubt we will ever have the capacity to grant something the ability to feel anyway; if it arises, it will arise spontaneously.
so now you're making my argument

why must i justify something that you already believe?
> And neither can you. Saying "sentience is not the result of complex neural/computational networks"
but i've never said that

just because sentience is the result of our specific neural network doesn't really mean anything


 
Verbatim:
> > but that's not gonna convince you of anything, is it
> No, because we've already established numerous times that my claim has nothing to do with humans deliberately programming a machine to be sentient.
well, i'm not going to argue about whether something is going to happen out of sheer "spontaneity", am i?

that just seems a little bit too silly


 
More Than Mortal:
> why must i justify something that you already believe?
Because humans being unable to deliberately programme a machine with sentience =/= a machine being able to be sentient. Pendulate and I have stated several times that if sentience arises in machines, it will be spontaneous and not the result of some on-the-spot programming, yet you still appear to have an issue with this proposition.

> just because sentience is the result of our specific neural network
That's not what I said; human sentience has no bearing on potential machine sentience other than that the two would presumably be connected by the simple fact that sentience is the result of complex neural networks.

If you do accept that this is the likely cause of sentience--as current neuroscientific research would suggest--then you have to accept that a sufficiently complex machine could also develop sentience.


 
More Than Mortal:
> that just seems a little bit too silly
Not really...


 
Verbatim:
> Because humans being unable to deliberately programme a machine with sentience =/= a machine being able to be sentient. Pendulate and I have stated several times that if sentience arises in machines, it will be spontaneous and not the result of some on-the-spot programming, yet you still appear to have an issue with this proposition.
well, again, that just seems like too ridiculous of a question to ask me

"do you think something could happen spontaneously?"

"uhhhh... maybe!"

like, i can't definitively answer "yes" or "no" to that question without logically contradicting the very nature of spontaneity.
making it a pointless question.


 
Verbatim:
i could say that i DON'T believe i will get struck by a meteorite within the next ten years of my life

i really don't think that's ever gonna happen

but i have no basis for it whatsoever
so the correct answer is "maybe", but i'm leaning towards "no"

and i don't think you would badger me for an explanation as to why


 
More Than Mortal:
> like, i can't definitively answer "yes" or "no" to that question without logically contradicting the very nature of spontaneity.
> making it a pointless question.
You're using the wrong definition of spontaneous, which is my fault really for not clarifying.

Definition:
> BIOLOGY: (of movement or activity in an organism) instinctive or involuntary.

What I mean is that the initial development of sentience in the machine will be an involuntary process. Let's do away with the word spontaneous and phrase it like this: upon reaching a certain degree of neural complexity, the machine will develop sentience as a result. However, the establishment of this complexity will not be the result of deliberate, on-the-spot human programming.


 
More Than Mortal:
> but i have no basis for it whatsoever
You mean other than the statistical improbability of being hit by a meteor.


 
Verbatim:
> What I mean is that the initial development of sentience in the machine will be an involuntary process. Let's do away with the word spontaneous and phrase it like this: upon reaching a certain degree of neural complexity, the machine will develop sentience as a result. However, the establishment of this complexity will not be the result of deliberate, on-the-spot human programming.
that might be an even worse wording

but whatever--you meant that it would be a fluke: we would be experimenting on AI, and as it turned out, we'd have accidentally created a sentience, yes? and we probably wouldn't notice until much later, yes?

again, discussing whether this could ever happen, to me, is the same thing as discussing spontaneity in the other sense

it's as pointless as quibbling over whether there's a god

> You mean other than the statistical improbability of being hit by a meteor.
right--just like the statistical improbability of us producing a sentient AI, deliberately or otherwise

not that there are any statistics on that--the point being, i think our brain chemistry is too complex to be replicated
deliberately or otherwise

that's my argument


 
Verbatim:
> You mean other than the statistical improbability of being hit by a meteor.
and that's really not much of a basis anyway


 
More Than Mortal:
> i think our brain chemistry is too complex to be replicated
But this is my issue.

You haven't justified I) why you're using our own connections as the metric for neural complexity or II) why you think it's too complex either way.

If it was just a passing "Oh, I don't really think it'll happen" then fine. I, myself, would say it almost definitely won't happen within this century, and there's a good chance our own sense of self-preservation would lead to some kind of transhumanism before we give a sentient AI free rein.

But both of those are a far cry from flat-out asserting "AIs will never be sentient".


 
More Than Mortal:
> > You mean other than the statistical improbability of being hit by a meteor.
> and that's really not much of a basis anyway
Well, it is if you aren't >50pc of the sample size. Which you aren't.
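
For what it's worth, the back-of-envelope base rate is easy to sketch (the strike count below is a placeholder, not a real statistic):

```python
# Toy base-rate estimate for "struck by a meteorite within ten years",
# using placeholder inputs rather than real data.
strikes_per_year = 1.0    # placeholder: documented human strikes per year
world_population = 7e9    # rough world population
years = 10

p_struck = strikes_per_year * years / world_population
print(f"P(struck within {years} years) ~ {p_struck:.1e}")  # ~1.4e-09
```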


 
Verbatim:
> If it was just a passing "Oh, I don't really think it'll happen"
oh please
you're honestly taking issue with how i'm saying it?

"i don't think it'll happen" is really what my assertion boils down to--that's all i've been saying this entire thread
i obviously can't be certain about anything, but i make statements of certainty all the time, usually for effect

what's odd is that you're only taking issue with it now on this particular subject


 
More Than Mortal:
> what's odd is that you're only taking issue with it now on this particular subject
Probably because I don't think I've ever seen you make a statement of certainty about the future before. Like, saying with certainty that something isn't true? Fine. But saying something will never happen even when the logical possibility exists? No, that just doesn't really sit right with me.

Plus, I love sci-fi.


Saleem:
I've mulled over the ramifications of creating a sentient AI from the machine's perspective for a while, and I think the results could be kind of cruel to the creation.

What will a machine do when given emotions, thoughts, and wants? I doubt a robotic mind will function as blissfully as a toddler's upon, essentially, being given a soul. It will want to be logical, but emotions tend to override logic, most likely causing internal errors and, in extreme situations, rampancy.

If we do give a machine sentience, the best course of action I've thought of is to guide it along like a child: start it from a downgraded form, work with the AI for years, and upgrade its capacity for raw knowledge and computation as time goes along to get it adapted to "life".
Shaping it in our image (oh look, playing god) could take a few turns; hopefully it won't develop a superiority complex or megalomania. Perhaps it can get comfortable living like an average person with extraordinary benefits.

But what matters is what happens inside the AI's brain, whether it will feel like we do. It's hard to say how something more or less inanimate would react to suddenly being "alive".

The machine will learn what it is to suffer, what it is to be happy, angry, and sad. Early prototypes will most likely overload from all the stimulation of being given sentience. I know there is an early AI the Japanese are putting in Asimo robots that can recognize shapes and words. That's merely a start, though; it's too early to fully understand this from a factual standpoint, at least for me.

Those are some loose ramblings from me; robotics and AI are very intriguing to me, and while I don't have all the knowledge to bring to the table, I like to think ahead to what things could be.



Release:
I don't think it'll ever reach the point at which AI is able to challenge human dominance. However, just because AI will always be subordinate to us doesn't necessarily mean that people will view them as slaves. I think people will view them as a brother species of sorts, with fascination and admiration.


 
SecondClass:
No.


 
 
Mr. Psychologist:
They should be treated equally, but given how we treat the next closest thing to a sentient species on this planet, I'm not holding out much hope for meatbags to be respectful to AI.

They shouldn't be slaves, but no doubt they will be.


Tackel:
 
> They should be treated equally, but given how we treat the next closest thing to a sentient species on this planet, I'm not holding out much hope for meatbags to be respectful to AI.
>
> They shouldn't be slaves, but no doubt they will be.

Agreed.


Ásgeirr:
i would honestly hold them in a higher regard