Hawking warns of the dangers of AI

Technology, computers, sciences, mysteries and phenomena of all kinds, etc., etc. all here at The Loresraat!!

Moderator: Vraith

User avatar
Vraith
The Gap Into Spam
Posts: 10621
Joined: Fri Nov 21, 2008 8:03 pm
Location: everywhere, all the time

Post by Vraith »

Zarathustra wrote:I think that people fear AI because they imagine it will do things we don't want it to do, and take over. But as smart as these programs will be, they won't actually want things. They won't have their own goals. They won't be able to decide that they're better off without these pesky humans.

We're already using AI with absolutely no negative consequences. Your texting app uses AI to anticipate what words you're trying to type. Anyone frightened of their texting app suddenly deciding to text people without your permission? Much less take over the world? Of course not. But if AI is so unpredictable and uncontrollable, why do we all use it daily without worrying about it?

It's like the people who worry about the government putting chips in us and tracking our every move (I know people like this personally) but don't worry about the chip they carry around with them everywhere they go. Cell phones already present all the dangers they allegedly fear, but they don't worry about them. AI will be like that. It already is.

First note: I'm not really one of those worried much about AI run amok on its own...though I'm pretty worried about the uses a smart bad actor could put even limited AI to. I'm SURE someone out there is working on an AI hacker...and just like the Go-playing AI did things humans never thought of [and don't know exactly how/why the AI "thought" of it], just like a couple of medical AIs have made discoveries humans didn't, even though they had the same data---things could get interesting. And I don't even consider these machines to be intelligent. [[I'm not sure anyone does, really---do they?]]

But to play Devil's advocate---they don't have to actually "want" anything. They just need an instruction or set of instructions that is badly written---a "goal" or "purpose" that isn't constrained appropriately, either on purpose or because of unintended/unexpected process/path branching.
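To make the "badly written instruction" point concrete, here's a toy sketch (everything in it is invented for illustration: the cleaning-robot setup, the scoring, all of it). The written objective rewards logged cleaning actions; the intended objective rewards a clean room. A system that only sees the written one doesn't need to "want" anything to game it:

[code]
# Toy illustration (entirely hypothetical): a "goal" that isn't constrained
# appropriately. Intended goal: a clean room. Written goal: maximize the
# number of cleaning actions logged. An optimizer that only sees the written
# goal "wants" nothing -- it just follows the instruction it was given.

def written_goal(actions):
    # What actually got coded: reward every logged "clean" action.
    return sum(1 for a in actions if a == "clean")

def intended_goal(room_is_clean, messes_created):
    # What was meant: a clean room, without creating new messes to clean.
    return (1 if room_is_clean else 0) - messes_created

# A run that games the written goal: make a mess, clean it, repeat.
gamed_run = ["make_mess", "clean"] * 50
honest_run = ["clean"]

print(written_goal(gamed_run), written_goal(honest_run))  # 50 vs 1 -> gaming "wins"
print(intended_goal(True, 50), intended_goal(True, 0))    # -49 vs 1 -> but it's worse
[/code]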

Also, a general AI is a far different beast than the "idiot savant" things we're building now. If/when a general system becomes possible---well, it STILL won't need ACTUAL consciousness, or ACTUAL wants, desires, "divine" or other purposes. It just needs a couple lines of code to make it "believe" it has those things.

On the last---I'm not paranoid, but I DO worry about the cell capabilities. And those of all my devices. Anyone who isn't a little worried about those things is a fool. But also, there is mostly fuck-all you can do about it. I do a few things to make it HARDER, but my efforts are almost surely nothing more than trivial/annoying to the powers that be. The "targeted ads" I get are much more off-target than the ones most of my friends get---but they aren't ALL off-target. And it isn't the abilities of the devices/AI directly I worry about. It's the fact that all of that info is available to fuckheads and sociopaths. Human ones. Human ones that don't in any way have my needs, desires, or best interests in mind. The exact opposite.

Pretty sure I've seen you say something like "companies use that data to provide you the things/services you want more efficiently, what's wrong with that?"
There are two problems I see with that:
The first is, the information they gather doesn't BELONG to them...also, they don't just keep it, and they don't keep it secure. They both share/sell it AND totally SUCK at preventing it from being stolen. They barely try.
[[living with the fact of data theft is cheaper and easier and relatively harmless---for THEM---than protecting it.]]
The second is: I think you underestimate the extent to which they can flip/manipulate "provide you things you want" into "make you want what they provide." And I mean "make" in the strong sense. Creating/causing a want. There's a continuum from mere advertising to propaganda to brainwashing/thought-control, and drawing any hard lines is a tricky proposition. But current techniques are quite a bit closer to the second and last than the first in many ways. And a decent AI [even a limited "expert system"] makes it easier. And those are already up and running.
[spoiler]Sig-man, Libtard, Stupid piece of shit. change your text color to brown. Mr. Reliable, bullshit-slinging liarFucker-user.[/spoiler]
the difference between evidence and sources: whether they come from the horse's mouth or a horse's ass.
"Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation."
the hyperbole is a beauty...for we are then allowed to say a little more than the truth...and language is more efficient when it goes beyond reality than when it stops short of it.
User avatar
wayfriend
.
Posts: 20957
Joined: Wed Apr 21, 2004 12:34 am
Has thanked: 2 times
Been thanked: 4 times

Post by wayfriend »

Let's not prey upon the confusion between "artificial intelligence" when it means performing tasks using cognitive algorithms, and "artificial intelligence" when it means a self-directing automaton. AI, like many things, comes in many degrees. Disparaging someone's fear of an AI on the big end of the scale by suggesting they actually fear an AI on the small end is just mockery without a valid point. Like saying a fear of tigers is dumb because people keep cats with no problem.
User avatar
Zarathustra
The Gap Into Spam
Posts: 19629
Joined: Tue Jan 04, 2005 12:23 am

Post by Zarathustra »

WF, if I've said anything to offend you, please accept my apology. I'd hate for an interesting discussion to go off the rails. I was thinking of my dad, not you. He is very paranoid about the whole chip thing, and it drives me nuts because of the cell phone argument. I'm not disparaging anyone, I'm skeptical of "future fear" in all its forms, whether global warming or Bible prophecy or technophobia. I apply this criticism toward CEOs, Stephen Hawking or whomever. Please don't take it personally. I'm not preying upon anything.

V, you're wrong. There is something we can do about it. You don't have to have a cell phone. People got along fine for 1000s of years without them. You could get along fine in modern society with a pager and a land line. You might actually get more done, lol.
Joe Biden … putting the Dem in dementia since (at least) 2020.
User avatar
Vraith
The Gap Into Spam
Posts: 10621
Joined: Fri Nov 21, 2008 8:03 pm
Location: everywhere, all the time

Post by Vraith »

Zarathustra wrote: You could get along fine in modern society with a pager and a land line. You might actually get more done, lol.
Heh...that is kinda funny. But it's becoming less true all the time.
I could do it...but it would be much harder and less efficient.
Even now, even "simple" small farmers and fisherfolk [among others] can't operate efficiently without some serious hardware and data.
Ten or 20 years from now, people who aren't always on will be basically the New Amish as far as the vast majority of the world is concerned.
And, while I think there are some horrifyingly bad risks/possibilities, if we just get a LITTLE bit lucky, act a LITTLE bit smarter and more accepting than not... that future will be something between cool as shit and totally fucking AWESOME.
[spoiler]Sig-man, Libtard, Stupid piece of shit. change your text color to brown. Mr. Reliable, bullshit-slinging liarFucker-user.[/spoiler]
the difference between evidence and sources: whether they come from the horse's mouth or a horse's ass.
"Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation."
the hyperbole is a beauty...for we are then allowed to say a little more than the truth...and language is more efficient when it goes beyond reality than when it stops short of it.
User avatar
Avatar
Immanentizing The Eschaton
Posts: 61711
Joined: Mon Aug 02, 2004 9:17 am
Location: Johannesburg, South Africa
Has thanked: 15 times
Been thanked: 21 times

Post by Avatar »

I don't have a smart phone. :D

--A
User avatar
Zarathustra
The Gap Into Spam
Posts: 19629
Joined: Tue Jan 04, 2005 12:23 am

Post by Zarathustra »

Re: V's point a few posts back ... the job I'm currently doing is basically sales. Trust me, if there were a way to MAKE people want what you sell, no company would ever go bankrupt, and every salesman would be rich. I'm actually reading a book on sales now, LITTLE RED BOOK OF SELLING. It says most sales people ask 'how do I sell?' instead of the more informative question 'why do people buy?' That's what Big Data is all about. Not making you do something, but figuring out what you'll do and why. It might feel like someone with this knowledge/skill is making you do something you don't want to do, but they just understand what you want more explicitly and mindfully than you do. Just like a magician understands your perception/focus more than you do.

Selling isn't force any more than illusion is magic.
Joe Biden … putting the Dem in dementia since (at least) 2020.
User avatar
wayfriend
.
Posts: 20957
Joined: Wed Apr 21, 2004 12:34 am
Has thanked: 2 times
Been thanked: 4 times

Post by wayfriend »

Many techies and marketers are tapping, sometimes unintentionally, into decades of neuroscience research to make their products as addictive and profitable as possible. [link]
The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market. [link]
What could possibly motivate users to spend $30 or $50 at a time on a free smartphone game? Game developers have learned to make their apps as appealing as possible by directly exploiting the mechanism of addiction. Habit formation is inseparable from the workings of dopamine in the brain, a neurotransmitter that's tied to learning, exploring, seeking out novelty, and feelings of being rewarded. [link]
etc.
etc.
etc.
User avatar
Zarathustra
The Gap Into Spam
Posts: 19629
Joined: Tue Jan 04, 2005 12:23 am

Post by Zarathustra »

Well, let's not prey upon the confusion between using data to target ads to individuals and the so-called addictiveness of a video game. :P video games have always been difficult to quit, and when you went to the arcade, they were designed to get you to pop quarters into the machine. But this isn't what V and I were talking about. Presumably if you're playing a game, you want to play it. No one is making you want to play it some more with slick ads or propaganda, as V alleged. Casinos are full of games that work on the same principle of reward and dopamine, etc. This has nothing to do with AI or Big Data.
Joe Biden … putting the Dem in dementia since (at least) 2020.
User avatar
wayfriend
.
Posts: 20957
Joined: Wed Apr 21, 2004 12:34 am
Has thanked: 2 times
Been thanked: 4 times

Post by wayfriend »

Zarathustra wrote:Well, let's not prey upon the confusion between using data to target ads to individuals and the so-called addictiveness of a video game.
The topic was "making" people buy things. That's what video games now do - they influence you to want and to spend money for [virtual] things. (The links were pretty clear on that.) They're not spending billions of dollars on the science of inducing people to spend money if it doesn't work. So no confusion on my part at all. It may not be what Vraith was discussing but it's certainly related and on-topic. And it certainly puts to rest the very incorrect notion that there's no way to MAKE people want what you sell. Within certain markets, it's not only possible, it's the business model.
User avatar
Vraith
The Gap Into Spam
Posts: 10621
Joined: Fri Nov 21, 2008 8:03 pm
Location: everywhere, all the time

Post by Vraith »

I was actually talking about BOTH, that's why I mentioned the continuum, though the whole contains more lines/subsectors than just that one.

Z, you're making a separation that used to be broader/clearer and more "real" than it is now, and far more than what is coming.

What is coming is in the realm W was going for.

As in, in 1984 the bad guys took advantage of the fact that everyone is afraid of something to make the "hero" turn...or, more accurately dissolve.
EVERYONE can be addicted.
EVERYONE can be manipulated.
There are some defenses---information, education, critical thinking...
But the creators are working hard to eliminate, overwhelm and repurpose them.
And THEY have the money, they have the know-how, they have the hardware and software.
And, UNLIKE 1984, they don't have to work on you consciously, they don't have to play traditional evil/supervillain and TELL you it's being done, tell you why...that's just a certain sadism and storytelling device. And/or only has to be used on the insightful belligerents.
They INTEND to be as un/sub-conscious as possible.
The current relatively primitive algorithms are quite good at teasing out what you want and looping it to both make it available and make it even more appealing.
They're slightly good at twisting the desire/appeal to make other things ALSO appealing.
If you aren't concerned about it, you need to look deeper and further.
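To spell out that "tease it out and loop it" effect: here's a minimal sketch, with categories, click rates, and numbers invented purely for illustration (not from any real recommender). A simple epsilon-greedy loop learns which content a simulated user responds to and then serves mostly that, so the feed and the appetite reinforce each other:

[code]
import random

# Minimal, hypothetical sketch of an engagement loop: learn what gets clicked,
# then show more of it. Categories and click rates are made up.

categories = ["news", "games", "outrage", "cats"]
shows = {c: 0 for c in categories}
clicks = {c: 0 for c in categories}

def user_clicks(category):
    # Stand-in for a real person; this one is a bit more drawn to "outrage".
    rates = {"news": 0.10, "games": 0.12, "outrage": 0.25, "cats": 0.15}
    return random.random() < rates[category]

def pick(epsilon=0.1):
    # Epsilon-greedy: mostly exploit the best-clicking category, sometimes explore.
    if random.random() < epsilon or min(shows.values()) == 0:
        return random.choice(categories)
    return max(categories, key=lambda c: clicks[c] / shows[c])

for _ in range(5000):
    c = pick()
    shows[c] += 1
    clicks[c] += user_clicks(c)

print(shows)  # the feed ends up dominated by whatever the user responds to most
[/code]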
[spoiler]Sig-man, Libtard, Stupid piece of shit. change your text color to brown. Mr. Reliable, bullshit-slinging liarFucker-user.[/spoiler]
the difference between evidence and sources: whether they come from the horse's mouth or a horse's ass.
"Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation."
the hyperbole is a beauty...for we are then allowed to say a little more than the truth...and language is more efficient when it goes beyond reality than when it stops short of it.
User avatar
Zarathustra
The Gap Into Spam
Posts: 19629
Joined: Tue Jan 04, 2005 12:23 am

Post by Zarathustra »

wayfriend wrote: The topic was "making" people buy things. That's what video games now do - they influence you to want and to spend money for [virtual] things. (The links were pretty clear on that.) They're not spending billions of dollars on the science of inducing people to spend money if it doesn't work. So no confusion on my part at all. It may not be what Vraith was discussing but it's certainly related and on-topic. And it certainly puts to rest the very incorrect notion that there's no way to MAKE people want what you sell. Within certain markets, it's not only possible, it's the business model.
Likewise, I thought the topic here was AI. That includes the kind on my phone, not just the kind you want to emphasize. If you cut me some slack about extending the argument, I don't mind returning the favor. No one likes to be described as a predator just for speaking their mind.

I read your links. The first one starts out by noting that it's not actually addiction, then goes on to say it's addiction. Whatever. No one forces you to play video games. If you play, you already want it before they have used any reward response to get you to keep wanting it. If someone designs a game that you want to keep playing, all they've done is designed a good game. What's the alternative? Should designers intentionally make games that you don't want to keep playing? This isn't anything new. It's the same principle behind casinos. I went to a casino. I put a dollar in a slot machine, pulled the handle, nothing happened, I walked away. No one made me want to keep playing. You are able to resist. Well, I am. I refuse to admit defeat or submit to others' control. And I vehemently resist victimhood mindsets in all their forms. Others seem to revel in them. I honestly don't get it.

V, your continuum didn't include dopamine and reward response. If that's what you meant, it didn't come through in your post. Making 'addictive' (i.e. fun) video games isn't the same as propaganda.

Let's not conflate fun with addictive. Fun is fun because of dopamine. There's nothing nefarious about it. That's literally the reason we all like fun things. You get a dopamine hit for lots of reasons.

You guys seem to think that AI will only benefit the nefarious. Maybe Norton will have AI packages they offer to counteract the very things you fear. There is plenty of money to be made in providing useful services. But you will forget that if your starting point is assuming that money and profit are something to be viewed with suspicion.

That's what bugs me the most about all this, the suspicion. Ok, let's be mindful of the risks. But let's not lose sight of the fact that this might be one of the best things that has ever happened to mankind. People are scared of lots of things that have been mostly amazing, whether it's airplanes, cell phones or fluoride. Our fear is a greater thing to fear than all the other things we fear.
Joe Biden … putting the Dem in dementia since (at least) 2020.
User avatar
peter
The Gap Into Spam
Posts: 11542
Joined: Tue Aug 25, 2009 10:08 am
Location: Another time. Another place.
Been thanked: 6 times

Post by peter »

wayfriend wrote:
Many techies and marketers are tapping, sometimes unintentionally, into decades of neuroscience research to make their products as addictive and profitable as possible. [link]
The leaders of Internet companies face an interesting, if also morally questionable, imperative: either they hijack neuroscience to gain market share and make large profits, or they let competitors do that and run away with the market. [link]
What could possibly motivate users to spend $30 or $50 at a time on a free smartphone game? Game developers have learned to make their apps as appealing as possible by directly exploiting the mechanism of addiction. Habit formation is inseparable from the workings of dopamine in the brain, a neurotransmitter that's tied to learning, exploring, seeking out novelty, and feelings of being rewarded. [link]
etc.
etc.
etc.
I read a fascinating article on the latter quote that used the 'like' button on Facebook as an example. This tiny little addition pumped up the usage of the site (and made the lady who developed it a fortune) by introducing a tiny little, but highly addictive, adrenaline (or dopamine - whatever) hit into the process of posting or reading posts. Companies like Cambridge Analytica are already using algorithms to crunch Big Data to the point where the systems understand us better than we understand ourselves. Based on the data of millions of users they know what nudges and prompts any individual needs to make them act in a given way - and can tailor bespoke targeted ads or whatever to do so. Remember, these plots do not have to work every time: if they just work heuristically to get enough 'hits' to get the end result they want that's good enough.
The truth is a Lion and does not need protection. Once free it will look after itself.

....and the glory of the world becomes less than it was....
'Have we not served you well'
'Of course - you know you have.'
'Then let it end.'

We are the Bloodguard
User avatar
Zarathustra
The Gap Into Spam
Posts: 19629
Joined: Tue Jan 04, 2005 12:23 am

Post by Zarathustra »

peter wrote:Remember, these plots do not have to work every time: if they just work heuristically to get enough 'hits' to get the end result they want that's good enough.
True. From WF's last link:
Overall, 50% of mobile gaming revenue came from the top 10% of mobile gamers making purchases. These heavy spenders, termed "whales," have been directly compared to the "big fish" courted by casinos. To generate vast profits, freemium games don't have to hook everyone; instead, they only need to attract a small fraction of diehard fans.

...

The makers of the free game Candy Crush Saga made $1.88 billion in revenue in 2013, and the company has stated that only 4% of its users have made purchases through the game. These users, on average, have each paid over $150 while playing.
Most of the money is generated by a relatively small number of people. Probably people with addiction problems. For the vast majority of people--96% of Candy Crush players, for instance--no amount of "neuroscience research" is going to make them want something they don't want. Perhaps game designers have figured out how to manipulate a fraction of people who are already easily manipulated. The rest of us don't have anything to worry about. I have never, ever paid money for a game app. In fact, I've never played one. There is absolutely nothing anyone could do to get me to do so. I'm not that bored. I like to read.
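As a back-of-envelope check on those quoted figures (an illustration only, using the article's rounded numbers, not any additional data): if roughly 4% of players pay and payers average a bit over $150 each, the implied player base is enormous, and the paying slice really is a small minority carrying the revenue:

[code]
# Back-of-envelope using the quoted, rounded figures -- illustration only.
revenue = 1.88e9             # reported 2013 Candy Crush Saga revenue
avg_spend_per_payer = 150.0  # "over $150" each; use 150 as a rough floor
payer_share = 0.04           # "only 4% of its users have made purchases"

payers = revenue / avg_spend_per_payer   # at most ~12.5 million paying players
total_players = payers / payer_share     # implies roughly 300+ million players

print(f"paying players ~ {payers / 1e6:.1f} million")
print(f"total players  ~ {total_players / 1e6:.0f} million")
[/code]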

I don't have a Facebook account, either. The "like" system is completely wasted on me.
Joe Biden … putting the Dem in dementia since (at least) 2020.
User avatar
Vraith
The Gap Into Spam
Posts: 10621
Joined: Fri Nov 21, 2008 8:03 pm
Location: everywhere, all the time

Post by Vraith »

Zarathustra wrote: You guys seem to think that AI will only benefit the nefarious.

Maybe Norton will have AI packages they offer to counteract the very things you fear.
On the first, I only think it COULD, and current constraints and power loci offer and encourage too many avenues that go that direction. But it needn't, and I'm reasonably optimistic that it won't.

On the second...funny, and perhaps true. I recommend a novel "Mother of Storms." Not for the main plot, that might bug you. But it has other cool stuff in it, some related to this thread. One of the things it has is viruses in the networks. Self-improving, and network-maintaining/improving. They work just like "bad" viruses, only on the side of "good."
[spoiler]Sig-man, Libtard, Stupid piece of shit. change your text color to brown. Mr. Reliable, bullshit-slinging liarFucker-user.[/spoiler]
the difference between evidence and sources: whether they come from the horse's mouth or a horse's ass.
"Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation."
the hyperbole is a beauty...for we are then allowed to say a little more than the truth...and language is more efficient when it goes beyond reality than when it stops short of it.
User avatar
peter
The Gap Into Spam
Posts: 11542
Joined: Tue Aug 25, 2009 10:08 am
Location: Another time. Another place.
Been thanked: 6 times

Post by peter »

Say the self-teaching algorithms develop algorithms of their own, massively subtle, beyond our ability to recognise, to manipulate our behaviour in the direction of their own coldly rational designs! We'd never know....??...... Hell, they could be doing it already!!!!

They're out there I tell you!!!!

:hide:
The truth is a Lion and does not need protection. Once free it will look after itself.

....and the glory of the world becomes less than it was....
'Have we not served you well'
'Of course - you know you have.'
'Then let it end.'

We are the Bloodguard
User avatar
Vraith
The Gap Into Spam
Posts: 10621
Joined: Fri Nov 21, 2008 8:03 pm
Location: everywhere, all the time

Post by Vraith »

peter wrote:Say the self-teaching algorithms develop algorithms of their own, massively subtle, beyond our ability to recognise, to manipulate our behaviour in the direction of their own coldly rational designs! We'd never know....??...... Hell, they could be doing it already!!!!

They're out there I tell you!!!!

:hide:

HAH---to make you [maybe] feel better...and spoiler for the novel I recommended that no one will read who hasn't already:

The "good" viruses in the net meet people attached to the net...SOOO, they do their job, and make the peoples BRAINS work better!
I'm telling you, we gotta go cyborg, and that's one reason why! Best of both worlds, totally human and hot wired at the same time.
[spoiler]Sig-man, Libtard, Stupid piece of shit. change your text color to brown. Mr. Reliable, bullshit-slinging liarFucker-user.[/spoiler]
the difference between evidence and sources: whether they come from the horse's mouth or a horse's ass.
"Most people are other people. Their thoughts are someone else's opinions, their lives a mimicry, their passions a quotation."
the hyperbole is a beauty...for we are then allowed to say a little more than the truth...and language is more efficient when it goes beyond reality than when it stops short of it.
User avatar
peter
The Gap Into Spam
Posts: 11542
Joined: Tue Aug 25, 2009 10:08 am
Location: Another time. Another place.
Been thanked: 6 times

Post by peter »

Absolutely V! Hot-wired is the way to go! :lol:
The truth is a Lion and does not need protection. Once free it will look after itself.

....and the glory of the world becomes less than it was....
'Have we not served you well'
'Of course - you know you have.'
'Then let it end.'

We are the Bloodguard
User avatar
wayfriend
.
Posts: 20957
Joined: Wed Apr 21, 2004 12:34 am
Has thanked: 2 times
Been thanked: 4 times

Post by wayfriend »

(Read Mother of Storms. It was interesting.)
User avatar
peter
The Gap Into Spam
Posts: 11542
Joined: Tue Aug 25, 2009 10:08 am
Location: Another time. Another place.
Been thanked: 6 times

Post by peter »

Will check it out WF. :)
The truth is a Lion and does not need protection. Once free it will look after itself.

....and the glory of the world becomes less than it was....
'Have we not served you well'
'Of course - you know you have.'
'Then let it end.'

We are the Bloodguard
User avatar
wayfriend
.
Posts: 20957
Joined: Wed Apr 21, 2004 12:34 am
Has thanked: 2 times
Been thanked: 4 times

Post by wayfriend »

A good reason to consider the danger of AI is this: when 1% of the people in the world control all the resources, all the money, and all the armed forces, and when both production and services are automated, and when both innovation and problem resolution are provided by AI, then the other 99% of the people in the world will be unnecessary, expensive, and (from one perspective) unsightly.

Should this be a concern? It depends on the compassion you expect from the 1%. So you might ask: How are they treating unemployed people looking for food, clothing, and shelter so far? Have they been very gentle and caring when they put people out of work with automation and AI?