
AI will destroy the world, we are all fucked.




When I get famous, I would want to be credited for the phrase: "Technology only seems like magic to those who are ignorant of how it works."

 

The only thing in my way is that I need to design the next equivalent of 'Flappy Bird'.

Edited by LetswaveaBook
by the way: I asked the AI to create such a game for me.

1 hour ago, LetswaveaBook said:

The only thing in my way is that I need to design the next equivalent of 'Flappy Bird'.

Ideas and inspiration are harder to come by than the programming itself. Writing a game like Flappy Bird is easy in any language you are good at, but the ingenuity is precious. However, I think 0 A.D. is a better example of ingenuity and programming... it continues to improve over the years, and clever players always invent new strategies.

AI will not destroy the world unless the programmer decides to. AIs are absolutely obedient to the commands of their programmers, so it's the programmers we need to keep an eye on, not the end product. Elon Musk has some ambitious ideas, but I am not sure about their feasibility, practicality, or potential consequences.

 

4 minutes ago, Lion.Kanzen said:

Or Science.

Technology is applied science and science is applied maths. 

There is no witchcraft in science; it's all about mathematical deduction and proof. Everything can be proved from first principles and known axioms. Sometimes the result of a calculation may be beyond human intuition or imagination (e.g. 4D symmetry, quantum mechanical behaviour), but there is absolutely no magic or supernatural power involved, which is what makes it credible.


4 minutes ago, Sevda said:

Technology is applied science and science is applied maths. 

I am generalizing. There are times when people forget that there are humans behind it, and that science and technology are subject to human error.


Science is a tool. It can always be tested and used to discover new things.


19 hours ago, Sevda said:

AI will not destroy the world unless the programmer decides to.

Usually, large neural nets are black boxes whose mechanisms are not entirely understood by the programmers who make them. Big data tunes the weights, and said data cannot be completely understood by humans due to its scale. See the axed neural nets from various large projects. Most famously, the recruitment engine at Amazon turned out to be misogynistic and was subsequently pulled.

Then again, Skynet isn't a thing. But there is potential here for negative impacts unintended by the authors. We don't make AI with if statements anymore; we feed it enormous quantities of data in the hope that it finds a pattern in there. The pattern might not always be ideal, because bias exists and manifests as artifacts in data sets.
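The "bias as artifacts in data" point can be sketched with a toy example. This is entirely made-up data and a deliberately crude learner, not the actual Amazon system: a scorer that learns word weights from biased historical hiring decisions ends up penalizing one word, even though gender never appears as an explicit feature.

```python
# Toy sketch of bias leaking from data into a model. All data here is
# hypothetical; the point is only that the learner picks up the pattern.
from collections import defaultdict

# Historical (resume words, hired?) pairs, where the past decisions were
# biased against resumes mentioning "women's" (e.g. "women's chess club").
history = [
    ({"python", "chess"}, True),
    ({"python", "women's", "chess"}, False),
    ({"java", "captain"}, True),
    ({"java", "women's", "captain"}, False),
]

# Learn a weight per word: +1 for every hire it appears in, -1 otherwise.
weights = defaultdict(float)
for words, hired in history:
    for w in words:
        weights[w] += 1.0 if hired else -1.0

def score(words):
    """Rank a resume by the sum of its learned word weights."""
    return sum(weights[w] for w in words)

# Two otherwise identical resumes now diverge on a single word.
a = score({"python", "captain"})
b = score({"python", "captain", "women's"})
```

Nobody wrote `if "women's": reject` anywhere; the penalty is an artifact of the training data, which is the sense in which the bias hides in the data set rather than in the code.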


33 minutes ago, smiley said:

Usually, large neural nets are black boxes whose mechanisms are not entirely understood by the programmers who make them. Big data tunes the weights, and said data cannot be completely understood by humans due to its scale. See the axed neural nets from various large projects. Most famously, the recruitment engine at Amazon turned out to be misogynistic and was subsequently pulled.

People do have a very overhyped image of AI. It doesn't help that every other corporation says their product is "powered by AI" :mellow:


1 hour ago, real_tabasco_sauce said:

People do have a very overhyped image of AI. It doesn't help that every other corporation says their product is "powered by AI" :mellow:

Of course AIs are not capable of everything; you also need to add a blockchain, or even better, five blockchains.


You are way behind on the trends, my dude. It's all about Web3 right now: we take the web and sprinkle blockchains on it, and the result is the solution to every conceivable problem of the current web.


@real_tabasco_sauce Yeah, it's amazing seeing all these baby strollers "powered by AI".

Not to mention all the bingbongs who got caught in cryptocurrency scams just because they heard people got rich off of Dogecoin. 

Much of the hype over emerging technology comes from people who don't understand it, and it is fueled by media outlets that want to get views on "amazing," "flashy," or "end of the world" type stories.


11 hours ago, smiley said:

entirely understood by the programmers who make them.

Even though the programmers are not deliberately trying to cause havoc, as professionals they should at least know what they are doing and take the necessary precautions before releasing the product to the world. If they fail to observe lab safety protocols, they should be fired and replaced with less reckless programmers.

11 hours ago, smiley said:

recruitment engine at Amazon turned out to be misogynistic and was subsequently pulled.

That's because misogynistic data was fed into the system. As hard as we try to deny it, many countries (especially the ones researching AIs) have misogynistic societies and gender inequality, and this is reflected in the customer information of shopping websites. The AI subsequently learns male superiority and becomes misogynistic. If we fed in data from matriarchal societies, we would see the exact opposite.

 


24 minutes ago, Sevda said:

Even though the programmers are not deliberately trying to cause havoc, as professionals they should at least know what they are doing and take the necessary precautions before releasing the product to the world. If they fail to observe lab safety protocols, they should be fired and replaced with less reckless programmers.

The programmers are working at the behest and for the benefit of their employers (whether it's the academic or corporate world). Why would their employers fire them? lol


6 hours ago, Sevda said:

That's because misogynistic data was fed into the system. As hard as we try to deny it, many countries (especially the ones researching AIs) have misogynistic societies and gender inequality, and this is reflected in the customer information of shopping websites. The AI subsequently learns male superiority and becomes misogynistic.

I might not be entirely up to date on the woke movement, but calling data misogynistic seems on the woke side of things, IMHO. Women make different decisions than men, and that is reflected in some data. I would say it is a difficult philosophical question whether such a recruitment engine should be considered misogynistic.


@LetswaveaBook I'm not up to date with most things and I don't know anything about that recruitment engine. But if it was fed data where 99% of the people in leading positions were male, and it resulted in women not being considered for leading positions just because of their gender, not their capabilities, I guess one could call it misogynistic.


Nobody knows why or how the engine got those weights. That's the point. Artifacts unseen by humans manifest across billions of nodes. The engineers at Amazon aren't stupid enough to miss the obvious sources of bias; it's the non-obvious ones that remain hidden in data sets. In fact, I would venture a decent guess that menstruation, pregnancies, general agreeableness, and overall aggressiveness influenced said output more than most corporations being led by men. In which case, the data isn't necessarily wrong. It's just biology and body chemistry. The average hours worked also agrees with the sentiment.

AI ethics exists not to vet data but to ensure the mathematical approach is also socially fair. A military recruitment model would suggest that soldiers ideally be > 6 ft straight males. Easy to see why. Easy to see it's the most optimal. Not so easy to see whether it's the ethically right choice. The standard deviation would also suggest that people without those qualities might perform better.

In statistics, stereotyping isn't exactly wrong. It's just probability.

Then there is the more general question of whether or not reducing people to numbers is ethical.
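The "statistically optimal but ethically questionable" point can be illustrated with a minimal sketch using hypothetical base rates: a rule that always picks from the group with the higher probability of a good fit maximizes expected hits per pick, yet it routes every single pick to one group.

```python
# Hypothetical base rates: P(candidate is a good fit | group).
# The numbers are invented purely for illustration.
p_fit = {"A": 0.55, "B": 0.45}

def pick(groups):
    # Always choosing the group with the higher base rate is
    # "just probability": it maximizes expected good hires per pick.
    return max(groups, key=lambda g: p_fit[g])

best = pick(["A", "B"])
# Yet this rule sends 100% of picks to group A, even though 45% of
# group B candidates are good fits and many of them outperform
# individual group A candidates.
```

That gap between "optimal in expectation" and "fair to individuals" is exactly where the ethics question lives.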


1 hour ago, smiley said:

general agreeableness, and overall aggressiveness influenced said output more

hmmm...

1 hour ago, smiley said:

A military recruitment model would suggest that soldiers ideally be > 6 ft straight males. Easy to see why.

Oh dear. To pick up just the 'male' aspect: AFAIK, the US military opened up to women when the average male cannon fodder was just too dumb to be of any use in the field.

