Messages - More Than Mortal
Pages: 1 ... 422 423 424 425 426 ... 502
12691
« on: September 24, 2014, 01:49:31 AM »
There is no bubble. We will see years of growth slowing relative to the recent trend, merely because technology seems to develop in an S shape when represented graphically.
It'll only become obsolete when automation can replace them, but that isn't a crash.
Now, if you want to see a bubble, look at higher education.
12692
« on: September 24, 2014, 01:43:58 AM »
I haven't watched it but I will say this. Scientists are there to tell you what's wrong, how severe it is and what needs fixing.
Economists should be the ones to tell you how to fix it.
12693
« on: September 23, 2014, 05:33:09 PM »
"This video is not available in your country."
Commit suicide you fucking britbong terrorist.
How about now?
12694
« on: September 23, 2014, 05:32:10 PM »
Where it is does not matter. How our brains process information matters.
Automation doesn't need to change that. Logic, itself, dictates we shouldn't go without emotions.
12695
« on: September 23, 2014, 05:31:30 PM »
But there's programming behind that. Why would any programmer give a robot the function to defend itself? Self-defense is the only real reason why it would bother to evolve itself. I can understand a robot that's programmed to adapt to the ever-changing stock market, but that's different from evolving to where it'll protect itself from a human trying to deactivate it.
Don't get me wrong, the gulf between our current state and artificial intelligence is huge; we're off on a tangent here. The problem, however, isn't that we would "disallow" it to improve itself; that simply might not be an option once you reach a threshold of machine intelligence. That said, there are proposals that, if we ever do create an artificial general intelligence, we ensure it is "human friendly" and then put it in charge of the development of all other intelligences. However, the only situation where I could ever, ever see such a limitation working is one where humans are part machine instead of just inefficient meatbags. You can have as many programmers as you want, but the more intelligent the machine is, the smaller the ability of humans to restrain it.
12696
« on: September 23, 2014, 05:22:51 PM »
Given the recent topics of conversation, it seems appropriate. Really good watch.
12697
« on: September 23, 2014, 05:18:59 PM »
That isn't my point. My point is we can prosper without compromising our humanity. Becoming cyborgs is not what we need to do. Nothing is inevitable. I, personally, would fight until my dying breath to stop something like this from happening.
Machines don't deserve rights. I don't care how sentient it is. They're tools.
I'd argue it isn't at all compromising one's humanity.
Directed, self-imposed evolution? It's practically godhood.
The only problem I see is that robots would never do more than what they're programmed to do, even if they're fully capable. Unless they're programmed to self-defend, like humans are programmed to, they're not going to evolve themselves of their own accord.
Part of the reason that automated mental labour will outpace human mental labour is the capacity, not only for them to teach themselves how to accomplish something (as with the stock market), but to actually re-write themselves.
12698
« on: September 23, 2014, 05:17:34 PM »
You have to look at the long run. Doing something now that may not seem like it changes anything could result in a huge change over the years.
I suppose it depends on your perception of it. It seems like a worthwhile trade-off to mine eyes.
Some things aren't.
What's the point of anything if we aren't us anymore?
"Us" is such an abstract concept to me as to be inconsequential. I am me, and you are you. Whether or not my perception/mind/intelligence is in my skull or in a metal case or somewhere between the two, it makes no difference to me.
12699
« on: September 23, 2014, 05:16:18 PM »
Actually, another thing that's incredibly odd, granted the civilization that relayed the signal does exist, is how it's not that far off from our own technological capability/evolution. Hundreds or thousands of years might seem like a big difference, but there have been 14 billion years since the Big Bang. The odds that a species developed at just about the same time as us are incredibly slim. It makes you wonder if technologically advanced species really do die off, and that's why you don't see any around, just those species that are still developing.
I thought scientists concluded that if the Wow! signal was legit it'd be from quite an advanced civilisation? Nonetheless, it doesn't seem unreasonable to think that once a society produces bona fide artificial general intelligence, if given the opportunity, their creation would destroy them.
12700
« on: September 23, 2014, 05:10:45 PM »
You have to look at the long run. Doing something now that may not seem like it changes anything could result in a huge change over the years.
I suppose it depends on your perception of it. It seems like a worthwhile trade-off to mine eyes.
12701
« on: September 23, 2014, 05:09:07 PM »
Chances are, some race has found us and put a big old NOPE sign on our planet. A planet run by a bunch of hairless apes running around killing each other seems scary if you ask me.
Any enterprising alien race should realise the fantastic opportunity to indoctrinate and breed an army of this bunch of hairless apes. Oh God. . . What if we're all currently in training?
12702
« on: September 23, 2014, 05:07:43 PM »
Learning about Plato at the moment.
*sigh*
Classical philosophers are fucking overrated. Especially Aristotle.
12703
« on: September 23, 2014, 05:06:54 PM »
If privacy is maintained in communal housing just as if it were private, then it probably couldn't be too bad. But I'm really not all that interested in having people walk in on me taking a shit.
And free love... *shivers in fear*
Private property =/= personal property. The latter is shit like your car, your house, your possessions. Private property is things like businesses, capital, et cetera. You can't have a functioning society without the former.
12704
« on: September 23, 2014, 05:05:22 PM »
That isn't my point. My point is we can prosper without compromising our humanity. Becoming cyborgs is not what we need to do. Nothing is inevitable. I, personally, would fight until my dying breath to stop something like this from happening.
Machines don't deserve rights. I don't care how sentient it is. They're tools.
I'd argue it isn't at all compromising one's humanity. Directed, self-imposed evolution? It's practically godhood.
12705
« on: September 23, 2014, 05:01:01 PM »
I'd rather live in a techno-socialist society than our current one
>can't tell if real meta or not
Nobody seriously thinks capitalism would work in an economy where there is general abundance and no need for human labour. Nobody. I'm just saying don't let the government fuck it up now so it perhaps isn't as bad when the really interesting shit starts happening. Although, I only use the word socialist because I imagine a completely automated workforce could only operate when owned socially. What's the point of private property if nobody is earning wages to compensate you for it?
12706
« on: September 23, 2014, 04:59:05 PM »
We got to the top of the food chain throwing spears around, god damn it. Technology apt for its age. You're fighting an inevitability, to be honest, one which I personally welcome. All of that knowledge to be accessed, all of that information we could be in possession of, not for some fucking future hippies to protest rights for. We make robots like you see in the movies, and we're screwed.
Well, if we create a machine that is genuinely sentient then I see no reason to not afford it rights. Although, any machine capable of sentience and improving itself would probably just destroy humanity.
12707
« on: September 23, 2014, 04:56:08 PM »
Oh yeah I was wrong earlier. I was thinking of a different intercepted signal. The Wow! signal hasn't been disproved.
But do you personally think it's legit?
If they haven't been able to disprove it after 35 years then I would say it seems like a good sign. I'm probably way too much of an optimist when it comes to the extraterrestrial life discussion.
Is it even optimistic to think that there's intelligent life out there? Is it not pessimistic?
Only if you're religious.
12708
« on: September 23, 2014, 04:55:10 PM »
Removing the need for relaxation/stress, though, I wouldn't. You can give up parts of your humanity without losing it entirely; the things that aren't broken don't necessarily need replacing, like love/happiness/loyalty. I would be strongly opposed to any hyper edgy twat that was willing to remove those things from their personality in favour of clinical logic.
This is true. But why would the automation of humanity need to exclude such a thing? The greatest moral philosophers and psychologists in history (Hume, Russell and Haidt) have almost conclusively shown that morality is known via intuition and logic is only post hoc. Automation needn't remove that, or any other emotion.
12709
« on: September 23, 2014, 04:53:19 PM »
I mean couldn't the government just... ban robots?
Well, no. History is full of examples of unions lobbying the State to expel automation from the economy, and they've failed every single time. The business-owners, if forced to lobby the State, would ultimately win out. Besides, it's too far gone at the moment to even contemplate that, and the government, if it tried, would undoubtedly fuck it up beyond belief. Nonetheless, I don't want to live in a world where automation doesn't go ahead. I'd rather live in a techno-socialist society than our current one, even if the ride is incredibly bumpy.
12710
« on: September 23, 2014, 04:36:14 PM »
if we meld our brains with robots or become robotic in nature? Removing the necessity to eat and sleep? The need to even relax in the first place because we've found a way to mess with the chemicals/whatever that cause stress?
I'd do it in a heartbeat.
12711
« on: September 23, 2014, 04:35:35 PM »
Just look at nearly everybody glued to their smartphones to the point they're willing to risk their lives and other lives to send a text while driving or take a damn self-shot
That's down to a bias of immediacy, not some fault with technology.
12712
« on: September 23, 2014, 04:24:06 PM »
Society wouldn't have to work, the government would supply credits that would go toward the various activities they could do with their time. Only a small few of the most intelligent would be given a job—the reward for them being knowledge.
My concerns lie chiefly in the transitory stage between society now and society following.
Also, intelligence would be of no economic importance - nothing human would be. When the mental capacity of machines outpaces that of humans, the economy will be totally devoid of people. We will, I imagine, end up with some sort of post-scarcity gift economy.
Well, I don't know why I'm speculating. I don't know. I hope I'll get to see the results, though.
By transition you mean mass lay offs and joblessness?
Yes. Alongside the inevitable widening gulf between rich and poor and the probable creation of an incredibly small class of uber-rich people. I have no idea how the government is going to smoothly cope with that transition.
12713
« on: September 23, 2014, 03:58:54 PM »
Naw, I done fixed it.
12714
« on: September 23, 2014, 03:47:00 PM »
The taskbar has disappeared from the bottom of my screen.
Ctrl+Alt+Del also does nothing.
12715
« on: September 23, 2014, 03:42:21 PM »
12716
« on: September 23, 2014, 03:30:22 PM »
inb4 someone shows me a video of a robot writing poetry, composing music, or painting a pretty picture
I was literally going to do that, but you seem to have beaten me to the punch.
12717
« on: September 23, 2014, 03:29:32 PM »
12718
« on: September 23, 2014, 03:23:59 PM »
Well, that's a good enough reason to say technology isn't all that great
I don't understand. Are you saying technology making capitalism obsolete means technology isn't all that great?
12719
« on: September 23, 2014, 02:59:23 PM »
Society wouldn't have to work, the government would supply credits that would go toward the various activities they could do with their time. Only a small few of the most intelligent would be given a job—the reward for them being knowledge.
My concerns lie chiefly in the transitory stage between society now and society following. Also, intelligence would be of no economic importance - nothing human would be. When the mental capacity of machines outpaces that of humans, the economy will be totally devoid of people. We will, I imagine, end up with some sort of post-scarcity gift economy. Well, I don't know why I'm speculating. I don't know. I hope I'll get to see the results, though.
12720
« on: September 23, 2014, 02:37:21 PM »
There's nothing we can do short of destroying the robots and factories.
Fuck that. I want my super-abundance.