vinme Posted May 3, 2022 (edited)
wowgetoffyourcellphone Posted May 4, 2022
Elon Musk is a parasite. That is all.
LetswaveaBook Posted May 5, 2022 (edited)
When I get famous, I would want to be credited for the phrase: "Technology only seems like magic to those who are ignorant of how technology works." The only thing in my way is that I need to design the next equivalent of 'Flappy Bird'.
Edit: by the way, I'll ask the AI to create such a game for me.
Lion.Kanzen Posted May 5, 2022
1 hour ago, LetswaveaBook said: When I get famous, I would want to be credited for the phrase: "Technology only seems like magic to those who are ignorant of how technology works."
Or Science.
Yekaterina Posted May 5, 2022
1 hour ago, LetswaveaBook said: The only thing in my way is that I need to design the next equivalent of 'Flappy Bird'.
Ideas and inspiration are harder to come by than the programming. Writing a game like Flappy Bird is easy in any language you are good at, but the ingenuity is precious. However, I think 0AD is a better example of ingenuity and programming: it continues to improve over the ages, and clever players always invent new strategies.
AI will not destroy the world unless the programmer decides it should. AIs are absolutely obedient to the commands of their programmer, so it's the programmers whom we need to keep an eye on rather than the end product. Elon Musk has some ambitious ideas, but I am not sure about their feasibility, practicality, and potential consequences.
4 minutes ago, Lion.Kanzen said: Or Science.
Technology is applied science, and science is applied maths. There is no witchcraft in science; it's all mathematical deduction and proof. Everything can be proved from first principles and known axioms. Sometimes the result of a calculation may be beyond human intuition or imagination (e.g. 4D symmetry, quantum mechanical behaviour), but there is absolutely no magic or supernatural power involved, which is what makes it credible.
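To back up the claim that the programming side is the easy part: the entire "physics" of a Flappy Bird clone fits in a few lines. A minimal headless sketch of the core loop (all constants are made-up tuning values; no graphics, no pipes):

```python
# Minimal Flappy Bird core loop: gravity, flap impulse, ground collision.
# All constants are arbitrary tuning values chosen for illustration.
GRAVITY = 0.5       # downward acceleration per tick (y grows downward)
FLAP_IMPULSE = -8   # instant upward velocity when the player taps
GROUND_Y = 400      # y coordinate of the ground

class Bird:
    def __init__(self, y=200):
        self.y = float(y)
        self.vy = 0.0
        self.alive = True

    def flap(self):
        """Player input: replace current velocity with an upward impulse."""
        self.vy = FLAP_IMPULSE

    def tick(self):
        """Advance one frame: apply gravity, move, check ground collision."""
        if not self.alive:
            return
        self.vy += GRAVITY
        self.y += self.vy
        if self.y >= GROUND_Y:
            self.y = GROUND_Y
            self.alive = False

bird = Bird()
for frame in range(120):      # two seconds at 60 fps
    if frame % 30 == 0:       # the "player" taps every half second
        bird.flap()
    bird.tick()
print(bird.alive, bird.y)
```

Collision with pipes, scoring, and rendering are more of the same arithmetic; the hard part, as said above, is having had the idea first.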
Lion.Kanzen Posted May 5, 2022
4 minutes ago, Sevda said: Technology is applied science and science is applied maths.
I am generalizing. There are times when people forget that there are humans behind it, and that science and technology are subject to human error. Science is a tool: it can always be tested, and it can always discover new things.
Loki1950 Posted May 6, 2022
The left brain says one thing, the right side of the same brain says the contrary. Which side is correct in its thinking? Neither; the correct conclusion is a synthesis of both sides.
Enjoy the Choice
smiley Posted May 6, 2022
19 hours ago, Sevda said: AI will not destroy the world unless the programmer decides to.
Usually, large neural nets are black boxes whose mechanisms are not entirely understood by the programmers who make them. Big data tunes the weights, and said data cannot be completely understood by humans due to its scale. See the axed neural nets from various large projects. Most famously, the recruitment engine at Amazon turned out misogynistic and was subsequently pulled.
Then again, Skynet isn't a thing. But there is potential for negative impacts here which are unintended by the authors. We don't make AI with if statements anymore. We feed it enormous quantities of data in the hope that it finds a pattern in there. The pattern might not always be ideal, because bias exists, and it manifests as artifacts in data sets.
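To make the "data tunes the weights" point concrete, here is a toy sketch (all feature names and numbers are invented, and the scoring rule is deliberately naive): the model never sees gender directly, yet a correlated proxy feature inherits whatever bias was baked into the historical labels.

```python
# Toy illustration of bias entering a model through a proxy feature.
# All data and feature names are invented; no real recruiting data is used.
# The "model" never sees gender, only a correlated proxy, yet the learned
# weight for that proxy turns negative because the historical hiring
# decisions used as labels were themselves biased.

# (senior_experience, womens_college_on_resume) -> hired? (historical labels)
history = [
    ((1, 0), 1), ((1, 0), 1), ((0, 0), 0), ((1, 0), 1),
    ((1, 1), 0), ((1, 1), 0), ((0, 1), 0), ((1, 1), 1),
]

def feature_weight(i):
    """Naive weight: hire rate when feature i is 1, minus when it is 0."""
    on  = [label for feats, label in history if feats[i] == 1]
    off = [label for feats, label in history if feats[i] == 0]
    return sum(on) / len(on) - sum(off) / len(off)

w_exp = feature_weight(0)    # experience genuinely predicts hiring: positive
w_proxy = feature_weight(1)  # proxy correlated with gender: negative
print(w_exp, w_proxy)
```

No one wrote "penalize this keyword" anywhere; the negative weight falls out of the labels alone, which is the sense in which the programmers can be blameless in intent and still ship a biased system.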
real_tabasco_sauce Posted May 6, 2022
33 minutes ago, smiley said: Usually large neural nets are black boxes whose mechanisms are not entirely understood by the programmers who make them. Big data tunes the weights. Said data cannot be completely understood by humans due to its scale. See the axed neural nets from various large projects. Most famously, the recruitment engine at Amazon turned out misogynistic and was subsequently pulled.
People do have a very overhyped image of AI. It doesn't help that every other corporation says their product is "powered by AI".
Gurken Khan Posted May 6, 2022
1 hour ago, real_tabasco_sauce said: People do have a very overhyped image of AI. It doesn't help that every other corporation says their product is "powered by AI".
Of course AIs are not capable of everything; you also need to add a blockchain, or even better, five blockchains.
smiley Posted May 6, 2022
You are too behind on trends, my dude. It's all about Web3 right now. We take the web and sprinkle blockchains on it. The result is the solution to every conceivable problem of the current web.
Gurken Khan Posted May 7, 2022
1 hour ago, m7600 said: At the end of the day, a blockchain is just a glorified spreadsheet.
Thinking (and talking) like that won't get you the lucrative contracts.
BreakfastBurrito_007 Posted May 7, 2022
@real_tabasco_sauce Yeah, it's amazing seeing all these baby strollers "powered by AI". Not to mention all the bingbongs who got caught in cryptocurrency scams just because they heard people got rich off of Dogecoin. Much of the hype over emerging technology comes from people who don't understand it, and it is fueled by media that want to get views on "amazing", "flashy", or "end of the world" type stories.
Yekaterina Posted May 7, 2022
11 hours ago, smiley said: entirely understood by the programmers who make it.
Even though the programmers are not deliberately trying to cause havoc, as professionals they should at least know what they are doing and take the necessary precautions before releasing the product to the world. If they are failing to observe lab safety protocols, then you must fire them and replace them with less reckless programmers.
11 hours ago, smiley said: recruitment engine at Amazon turned out misogynistic and was subsequently pulled.
That's because misogynistic data was fed into the system. As hard as we try to deny it, many countries (especially the ones researching AIs) have misogynistic societies and gender inequality, and this will be reflected in the customer information of shopping websites. The AI will subsequently learn male superiority and become misogynistic. If we feed in data from matriarchal societies, then you will see the exact opposite.
wowgetoffyourcellphone Posted May 7, 2022
24 minutes ago, Sevda said: Even though the programmers are not deliberately trying to cause havoc, as a professional, they should at least know what they are doing and take the necessary precautions before releasing this product to the world. If they are failing to observe lab safety protocols then you must fire them and replace them with less reckless programmers.
The programmers are working at the behest and for the benefit of their employers (whether in the academic or corporate world). Why would their employers fire them? lol
LetswaveaBook Posted May 7, 2022
6 hours ago, Sevda said: That's because misogynistic data was fed into the system. As hard as we try to deny it, many countries (especially the ones researching AIs) have misogynistic societies and gender inequality, and this will be reflected in the customer information of shopping websites. The AI will subsequently learn male superiority and become misogynistic.
I might not be entirely up to date on the woke movement, but calling data misogynistic seems on the woke side of things, IMHO. Women make different decisions than men, and that is reflected in some data. I would say it is a difficult philosophical question whether such a recruitment engine should be considered misogynistic.
Gurken Khan Posted May 7, 2022
@LetswaveaBook I'm not up to date with most things, and I don't know anything about that recruitment engine. But if it was fed data where 99% of people in leading positions were male, and it resulted in women not being considered for leading positions just because of their gender, not their capabilities, I guess one could call it misogynistic.
smiley Posted May 7, 2022
Nobody knows why or how the engine got those weights. That's the point. Artifacts unseen by humans manifest across billions of nodes. The engineers at Amazon aren't stupid enough to miss the obvious sources of bias; it's the non-obvious ones that remain hidden in data sets. In fact, I would venture a decent guess that menstruation, pregnancies, general agreeableness, and overall aggressiveness influence said output more than the fact that most corporations are led by men. In which case, the data isn't necessarily wrong; it's just biology and body chemistry. The average hours worked also agrees with that sentiment.
AI ethics exists not just to vet data, but to ensure the mathematical approach is socially fair as well. A military recruitment model would suggest soldiers ideally be > 6 ft straight males. Easy to see why. Easy to see it's the most optimal. Not so easy to see whether it's the ethically right choice. Standard deviation would suggest that people without those qualities might perform better as well. In statistics, stereotyping isn't exactly wrong; it's just probability.
Then there is the more general question of whether or not reducing people to numbers is ethical.
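The standard-deviation point can be put in numbers. Assuming normally distributed scores (all figures below are made up), the chance that a random member of the lower-mean group still outperforms a random member of the higher-mean group comes straight out of a closed-form expression:

```python
# For independent normals X_a ~ N(mu_a, sd_a^2) and X_b ~ N(mu_b, sd_b^2),
# the difference X_b - X_a is normal, so
#   P(X_b > X_a) = Phi((mu_b - mu_a) / sqrt(sd_a^2 + sd_b^2))
# where Phi is the standard normal CDF. All example numbers are invented.
import math

def prob_b_beats_a(mu_a, sd_a, mu_b, sd_b):
    """P(X_b > X_a) for independent normal variables."""
    z = (mu_b - mu_a) / math.sqrt(sd_a**2 + sd_b**2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Group A scores mean 60, group B mean 55, both with sd 10.
p = prob_b_beats_a(60, 10, 55, 10)
print(round(p, 3))
```

With these made-up numbers the "worse" group wins roughly a third of individual comparisons, which is the statistical content of "stereotyping isn't exactly wrong, it's just probability": the group mean says little about any one candidate.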
Gurken Khan Posted May 7, 2022
1 hour ago, smiley said: general agreeableness, and overall aggressiveness influencing said output more
Hmmm...
1 hour ago, smiley said: A military recruitment model would suggest soldiers to ideally be > 6ft straight males. Easy to see why.
Oh dear. To pick up just the 'male' aspect: AFAIK the US military opened up to women when the average male cannon fodder was just too dumb to be of any use in the field.
Akira Kurosawa Posted August 5, 2022 (edited)
I have only one wish: to see the 27th alpha before the "Mushroom War".
Spoiler: (Just kidding!) (Or not...)