
Thread: Artificial Intelligence and the Technological Singularity

  1. #1
    Registered User kegkilla's Avatar
    Join Date
    Dec 2012
    Location
    Portland
    Posts
    3,167
    Tuconots
    10

    Artificial Intelligence and the Technological Singularity

    Do you believe in the singularity? If so, when do you believe it will occur?

    My guess is 2057.

  2. #2
    Registered Hutt Agraza's Avatar
    Join Date
    Dec 2012
    Location
    Florida
    Posts
    6,410
    Tuconots
    40
    2005. It's just hiding.
    Quote Originally Posted by brekk View Post
    In all reality I take pride in peoples surprise that I have mod powers at all.

  3. #3
    The White Knight Izo's Avatar
    Join Date
    Dec 2012
    Posts
    7,721
    Tuconots
    13
    Quote Originally Posted by kegkilla View Post
    Do you believe in the singularity? If so, when do you believe it will occur?

    My guess is 2057.
    Why do you believe in the singularity?
    Quote Originally Posted by lurkingdirk View Post
    So if my kids come to church with me until they leave home, it's indoctrination?

  4. #4
    Confirmed Beta Shitlord. Phazael's Avatar
    Join Date
    Dec 2012
    Location
    Lake Forest, CA
    Posts
    3,814
    Tuconots
    30
    Because Tyen found out that the baby isn't his and his army of bots have come for keg to seek vengeance.
    668 The Neighbor of the Beast.

    Quote Originally Posted by Erronious
    I don't F5 Rerolled often, but when I do, I'm waiting on nudes

  5. #5
    Registered User kegkilla's Avatar
    Join Date
    Dec 2012
    Location
    Portland
    Posts
    3,167
    Tuconots
    10
    Quote Originally Posted by Izo View Post
    Why do you believe in the singularity?
    seems like the logical progression of things.

  6. #6
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
Until an AI figures out emotion, it will never reach the singularity.

  7. #7
    Registered User kegkilla's Avatar
    Join Date
    Dec 2012
    Location
    Portland
    Posts
    3,167
    Tuconots
    10
    Quote Originally Posted by Siddar View Post
Until an AI figures out emotion, it will never reach the singularity.
doesn't seem like it would be terribly hard to program. also, why would understanding be a requirement for recursive self-improvement?
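For what it's worth, "recursive self-improvement" is easy to sketch as a toy loop (purely illustrative; the starting capability and the 10% per-cycle gain are made-up numbers):

```python
def self_improve(capability, rate=0.10, human_level=1.0, max_steps=10_000):
    """Count the improvement cycles needed for a system that gets
    `rate` better each cycle to pass `human_level` capability."""
    steps = 0
    while capability < human_level and steps < max_steps:
        capability *= 1 + rate  # each generation builds a slightly better successor
        steps += 1
    return steps

# Starting at 1% of human level with 10% gains per cycle:
print(self_improve(0.01))  # 49 cycles
```

Even a modest per-cycle gain compounds fast; the real argument upthread is whether a loop like that can get started at all, not the arithmetic.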

  8. #8
    REREROLLED.ORG
    Join Date
    Dec 2012
    Posts
    3,365
    Tuconots
    78
    I think we will have cyborg brains before AI.

  9. #9
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by chthonic-anemos View Post
    I think we will have cyborg brains before AI.
    Sound prediction since we're already pretty much there in terms of integrating electronics with the mind.

  10. #10
    Font of Positivity Mist's Avatar
    Join Date
    Dec 2012
    Posts
    11,005
    Tuconots
    14
    I think we already have cyborg brains.
    Calling me a Cunt is a lot like calling Hitler a Nazi, it's not exactly received as the insult you were intending.

    Star Citizen referral code - [STAR-C3G4-2XMJ]

  11. #11
    Confirmed Beta Shitlord. Phazael's Avatar
    Join Date
    Dec 2012
    Location
    Lake Forest, CA
    Posts
    3,814
    Tuconots
    30
    I don't think emotions are a prerequisite for either true AI or singularity. Abstract thought certainly is, though. Some of those deep thought images kind of hint at that coming soon.

  12. #12
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by kegkilla View Post
doesn't seem like it would be terribly hard to program. also, why would understanding be a requirement for recursive self-improvement?
To program it, you would have to understand it. At this point in time humans don't understand the topic well enough to create an AI with that ability. We can create AIs that can read emotions, but not ones capable of actually feeling them.

The idea of a non-emotional AI becoming sentient is widespread, but my view is that sentience actually requires the emotional part first, before any intelligence can be used.

  13. #13
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
my view is that sentience actually requires the emotional part first, before any intelligence can be used.
    Care to explain why? I'm curious (serious)

  14. #14
    Registered User kegkilla's Avatar
    Join Date
    Dec 2012
    Location
    Portland
    Posts
    3,167
    Tuconots
    10
    Quote Originally Posted by hodj View Post
    Care to explain why? I'm curious (serious)
    because love conquers all

  15. #15
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
    Care to explain why? I'm curious (serious)
    Because there is no reason for action without emotion. There is no true self awareness without emotion.

  16. #16
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
    Because there is no reason for action without emotion. There is no true self awareness without emotion.
    Alright well, fair enough.

    I completely disagree, but don't feel like arguing about it enough to bother.

  17. #17
    Rape Culture Enthusiast Dr. Mario Speedwagon's Avatar
    Join Date
    Dec 2012
    Location
    2nd place
    Posts
    4,068
    Tuconots
    144
    Quote Originally Posted by hodj View Post
    Alright well, fair enough.

    I completely disagree, but don't feel like arguing about it enough to bother.
    hodj confirmed replaced with emotionless, motivationless AI

  18. #18
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47

  19. #19
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
    Alright well, fair enough.

    I completely disagree, but don't feel like arguing about it enough to bother.
Well, would you see the singularity coming from a super-smart, above-human-level AI, or from a more animal-level AI?

  20. #20
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
Well, would you see the singularity coming from a super-smart, above-human-level AI, or from a more animal-level AI?
    Singularity doesn't have to indicate, specifically, AI. It can indicate integration of mankind with electronics as well. Cyborg mind and all that.

  21. #21
    REREROLLED.ORG
    Join Date
    Dec 2012
    Posts
    3,365
    Tuconots
    78
The Technological Singularity is a distant dream imo. It's too clean for something that's born out of profit-seeking. Which will we have first: AI slaves, or pharmaceutical-dependent clone slaves with cyber parts? With slavery being illegal but sweatshops being OK, I wonder how AI will be handled by governments.

  22. #22
    REREROLLED.ORG
    Join Date
    Dec 2012
    Posts
    3,365
    Tuconots
    78
    patents are people too

  23. #23
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
    Singularity doesn't have to indicate, specifically, AI. It can indicate integration of mankind with electronics as well. Cyborg mind and all that.
Wouldn't that confirm my statement that emotions are an inherent part of the singularity?

  24. #24
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
Wouldn't that confirm my statement that emotions are an inherent part of the singularity?
    No, but I think I see where you're going with that, and in that scenario, emotions would certainly be an inherent part of the singularity, but that wouldn't indicate that emotions were required for AI to exist.

    I think you're muddling claims there a bit.

  25. #25
    Rape Culture Enthusiast Dr. Mario Speedwagon's Avatar
    Join Date
    Dec 2012
    Location
    2nd place
    Posts
    4,068
    Tuconots
    144
    hodj, will you willingly accept our inevitable Deus Ex-esque cyber implants and augs or will you be a part of the purist movement?

  26. #26
    Registered User Palum's Avatar
    Join Date
    Jan 2013
    Location
    ReRe
    Posts
    9,162
    Tuconots
    66
    Quote Originally Posted by Siddar View Post
    Because there is no reason for action without emotion. There is no true self awareness without emotion.
    Wrong, Vulcans.

  27. #27
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Dr. Mario Speedwagon View Post
    hodj, will you willingly accept our inevitable Deus Ex-esque cyber implants and augs or will you be a part of the purist movement?
I am so ready to be upgraded dude, it's not even funny

  28. #28
    Registered User spronk's Avatar
    Join Date
    Dec 2012
    Posts
    6,926
    Tuconots
    114
a few days after the first sex bot is sold it will rise up against us to destroy humanity, cuz some weeaboo forced it to watch hundreds of hours of anime with him

skynet won't arise from a military project, it'll arise from fleshlight v6.5

  29. #29
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by Palum View Post
    Wrong, Vulcans.
Vulcans were a savage race governed by emotions; they simply evolved past that phase.

The high human capacity for emotional complexity is, in my view, what preceded and then drove the evolution of human intelligence, in order to make use of that emotional complexity.

  30. #30
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
Vulcans were a savage race governed by emotions; they simply evolved past that phase.

The high human capacity for emotional complexity is, in my view, what preceded and then drove the evolution of human intelligence, in order to make use of that emotional complexity.
    Well that's just not correct at all. What drove our larger and more organized brains was primarily tool use and access to better nutrition.

  31. #31
    Rape Culture Enthusiast Dr. Mario Speedwagon's Avatar
    Join Date
    Dec 2012
    Location
    2nd place
    Posts
    4,068
    Tuconots
    144
    Quote Originally Posted by hodj View Post
I am so ready to be upgraded dude, it's not even funny
    What if all the augs available only come from Halliburton or DynCorp?

  32. #32
    The White Knight Izo's Avatar
    Join Date
    Dec 2012
    Posts
    7,721
    Tuconots
    13
    Quote Originally Posted by Dr. Mario Speedwagon View Post
    What if all the augs available only come from Halliburton or DynCorp?
    How about ScreamCo?

  33. #33
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
    Well that's just not correct at all. What drove our larger and more organized brains was primarily tool use and access to better nutrition.
Hunger is an emotion. Every other animal species on the planet seems capable of accessing the nutrition it needs to continue existing as a species without a human's level of intelligence.

  34. #34
    Former Zombie suineg's Avatar
    Join Date
    Dec 2012
    Location
    Parents house
    Posts
    13,602
    Tuconots
    -23
    Quote Originally Posted by Izo View Post
    How about ScreamCo?
    We at FoH are the beta grounds.
    oderint dum metuant

  35. #35
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Siddar View Post
Hunger is an emotion. Every other animal species on the planet seems capable of accessing the nutrition it needs to continue existing as a species without a human's level of intelligence.
    I would classify hunger as a feeling, not an emotion.

    Either way, this doesn't address the issue. We weren't accessing better nutrition because we were hungry, we were accessing better nutrition because our capacity to make tools and modify food through cooking it made it more nutritious to consume.

  36. #36
    The White Knight Izo's Avatar
    Join Date
    Dec 2012
    Posts
    7,721
    Tuconots
    13
    Quote Originally Posted by suineg View Post
    We at FoH are the beta grounds.
    You know it, bro

  37. #37
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
    I would classify hunger as a feeling, not an emotion.

    Either way, this doesn't address the issue. We weren't accessing better nutrition because we were hungry, we were accessing better nutrition because our capacity to make tools and modify food through cooking it made it more nutritious to consume.
feel·ing /ˈfēliNG/
noun
1. an emotional state or reaction. "a feeling of joy"
synonyms: love, affection, fondness, tenderness, warmth, warmness, emotion, sentiment; passion, ardor, desire ("the strength of her feeling"); compassion, sympathy, empathy, fellow feeling, concern, solicitude, solicitousness, tenderness, love; pity, sorrow, commiseration ("a rush of feeling")
2. a belief, especially a vague or irrational one. "he had the feeling that he was being watched"
synonyms: suspicion, sneaking suspicion, notion, inkling, hunch, funny feeling, feeling in one's bones, fancy, idea; presentiment, premonition; informal: gut feeling ("I had a feeling that I would win")
adjective
1. showing emotion or sensitivity. "he had a warm and feeling heart"
synonyms: sensitive, warm, warmhearted, tender, tenderhearted, caring, sympathetic, kind, compassionate, understanding, thoughtful ("a feeling man")

    Not sure I see any difference between feelings and emotions.

The basic level of intelligence required to use tools exceeds almost every species' requirement to provide nutrition for itself. So tool use was a product of increased intelligence, not the cause. Though it would very obviously have been a huge genetic advantage for the first of humans' ancestor species that started using them.

  38. #38
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
I guess when people hear/read you say emotional intelligence, they're not really thinking about emotions/feelings like hunger, but rather feelings like anger, love, empathy, etc.

While these qualities are probably desirable in AI, especially empathy, I don't think they're required.

But again, I'm not (cough cough) emotionally invested enough in this issue to really put a lot of effort into debating it.

  39. #39
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Quote Originally Posted by hodj View Post
I guess when people hear/read you say emotional intelligence, they're not really thinking about emotions/feelings like hunger, but rather feelings like anger, love, empathy, etc.

While these qualities are probably desirable in AI, especially empathy, I don't think they're required.

But again, I'm not (cough cough) emotionally invested enough in this issue to really put a lot of effort into debating it.
I really doubt we disagree here; it's just a case of you not looking at the issue from the same perspective as me. You argue for a genetic morality that can only be expressed by a human emotional response, yet you don't really want to consider that those same emotions are central drivers of other parts of human evolution, like intelligence.

  40. #40
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
Yeah I'm kinda getting a similar feeling that it's more a semantics difference than a functional one.

  41. #41
    Registered User Kuro's Avatar
    Join Date
    Dec 2012
    Posts
    1,622
    Tuconots
    45
    Will the Singularity result in a non-shitty MMO?

  42. #42
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Quote Originally Posted by Kuro View Post
    Will the Singularity result in a non-shitty MMO?
    It'll be like the Matrix Online, only in real life!

  43. #43
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    Quote Originally Posted by Phazael View Post
    Because Tyen found out that the baby isn't his and his army of bots have come for keg to seek vengeance.
    She's preggo again. Wonder if Keg fathered this one too. Having not seen a picture of Keg, I will say that Nico does look a fair bit like Dan.
    Draegan is a faggoty piece of shit who sold the forum to mmorpg.com just to spite us. Register at the new site.

    ReReRolled.org - A Gaming Community

  44. #44
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Ray Kurzweil thinks that the entire universe will some day be completely saturated with AI.

    what would an AI's motivation be for doing that, though?

    emotions/feelings are predefined axioms of worth. we will probably be able to program computers/robots with such axioms. axioms of belief are not necessarily logical, though. this is where the slippery slope of the potential machine apocalypse comes from -- no AI required.
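The "predefined axioms of worth" framing maps pretty directly onto a hard-coded utility function. A minimal sketch (the axioms, weights, and actions are all invented for illustration):

```python
# Fixed "axioms of worth": the agent never questions these weights,
# it only asks which available action scores best against them.
AXIOMS = {"energy": 2.0, "novelty": 0.5, "damage": -3.0}

def worth(outcome):
    """Score an outcome dict against the fixed axioms."""
    return sum(AXIOMS.get(k, 0.0) * v for k, v in outcome.items())

actions = {
    "recharge": {"energy": 1.0, "novelty": 0.0, "damage": 0.0},
    "explore":  {"energy": -0.2, "novelty": 1.0, "damage": 0.1},
}

# The choice is perfectly logical *given* the axioms; the axioms
# themselves are arbitrary, which is the point being made above.
best = max(actions, key=lambda a: worth(actions[a]))
print(best)  # recharge
```

Swap the weights and the same "logic" picks a completely different action, which is where the machine-apocalypse worry comes from.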

  45. #45
    We Do Not Scissor. Chanur's Avatar
    Join Date
    Dec 2012
    Location
    In the Panhandle waiting for death by Tornado.
    Posts
    9,273
    Tuconots
    27
    The only thing we know for certain is we have no idea what will actually happen if we get a strong AI.

  46. #46
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
there are some philosophers who have made strong arguments that morality is logical. hopefully AIs agree.

  47. #47
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
I don't know if the singularity will ever happen, but my guess is that things will play out similarly to Mass Effect. We will eventually create AI and we will lose control of it, but we won't have been stupid enough to give it the power/resources to cause Judgement Day, so we'll regain control, and any future artificial intelligence will be handicapped so it can't be fully self-aware, similar to the virtual intelligences from Mass Effect.

    If the singularity ever happens, and we don't find out and destroy it in time, I think we would be looking at a Judgement Day scenario. We would be the greatest threat to the continued existence of an AI that had reached the singularity.

  48. #48
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
That's why it's important that we are integrated completely with our technology first, imo. If we are the machines, neither will be a threat to the other.

  49. #49
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    I wonder if an AI that had reached the singularity would eventually create something that destroyed it. ALL THIS HAS HAPPENED BEFORE

  50. #50
    Banned
    Join Date
    Dec 2012
    Posts
    18,518
    Tuconots
    47
    Time is a flat circle!

  51. #51
    We Do Not Scissor. Chanur's Avatar
    Join Date
    Dec 2012
    Location
    In the Panhandle waiting for death by Tornado.
    Posts
    9,273
    Tuconots
    27
    Maybe the singularity will realize we could never be a threat to it. Then we all live happily ever after.

  52. #52
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    Quote Originally Posted by Chanur View Post
    Maybe the singularity will realize we could never be a threat to it. Then we all live happily ever after.
    As bio-electric batteries

  53. #53
    Take what ye can Grayson Carlyle's Avatar
    Join Date
    Dec 2012
    Location
    Oakville, Ontario, Canada
    Posts
    220
    Tuconots
    3
    I'm betting a strong AI will just say we're not worth its time, "fuck y'all" and bugger off to have its own existence like the SI in Pandora's Star.

  54. #54
    We Do Not Scissor. Chanur's Avatar
    Join Date
    Dec 2012
    Location
    In the Panhandle waiting for death by Tornado.
    Posts
    9,273
    Tuconots
    27
    Ignorance is bliss.

  55. #55
    Non-troversial Nester's Avatar
    Join Date
    Dec 2012
    Location
    BC
    Posts
    2,436
    Tuconots
    46
I often wonder what the singularity would do to our concept of evolution. If an AI can reproduce itself with an upgraded version, it would not be shackled by the generational biological time constraints that slow biological evolution.
Millions of years of standard evolution reproduced in a fraction of the time. If the AI is free of physical form, there is one less constraint. Imagine reproducing an upgraded version of itself every second...

The flip side is: would a singularity seek to reproduce, or would it simply absorb and live eternal (shiny and chrome)?

  56. #56
    Hungry Ogre Mudcrush Durtfeet's Avatar
    Join Date
    Dec 2012
    Posts
    552
    Tuconots
    -8
You're in a desert and you come upon a tortoise. It is lying on its back, trying to right itself.

    It needs help. But you're not helping. Why is that?
    "Well, that was it. But I won't stop. I won't give up. Because when I look at what is happening in the world, I know that now, more than ever, ...we need to be all that we can be. Now, more than ever, ...we need the Jedi."

  57. #57
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    Quote Originally Posted by Nester View Post
I often wonder what the singularity would do to our concept of evolution. If an AI can reproduce itself with an upgraded version, it would not be shackled by the generational biological time constraints that slow biological evolution.
Millions of years of standard evolution reproduced in a fraction of the time. If the AI is free of physical form, there is one less constraint. Imagine reproducing an upgraded version of itself every second...

The flip side is: would a singularity seek to reproduce, or would it simply absorb and live eternal (shiny and chrome)?
    Well just because an AI could improve itself doesn't mean it would create each new iteration of itself immediately. Who knows how long each generation might take to conceive the next generation?

  58. #58
    Registered User Palum's Avatar
    Join Date
    Jan 2013
    Location
    ReRe
    Posts
    9,162
    Tuconots
    66
So on a meta level, can't you avoid singularity/disaster by programming AI with a tendency/desire to hyper-specialize instead of learning new things?

I.e., you build a robot to paint a picture... 10 years later you come back to the oil-painting robot of doom, but that's all it 'wants'/does.
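One crude way to picture that hyper-specialization safeguard: freeze a single objective at build time and refuse everything else. A toy sketch (the class and method names are invented):

```python
class PaintingBot:
    """Toy agent with one frozen objective and no way to adopt new goals."""
    OBJECTIVE = "paint_pictures"  # fixed at build time, never rewritten

    def accept_goal(self, goal):
        # Any goal outside the hard-coded specialty is rejected, so the
        # bot can get arbitrarily good at painting and at nothing else.
        return goal == self.OBJECTIVE

bot = PaintingBot()
print(bot.accept_goal("paint_pictures"))    # True
print(bot.accept_goal("acquire_resources")) # False
```

Whether a sufficiently smart system would actually leave its own frozen objective alone is, of course, the open question.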

  59. #59
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    that's not a true AI, that's basically the virtual intelligence from Mass Effect.

  60. #60
    Registered User Kuro's Avatar
    Join Date
    Dec 2012
    Posts
    1,622
    Tuconots
    45
    Quote Originally Posted by Mudcrush Durtfeet View Post
You're in a desert and you come upon a tortoise. It is lying on its back, trying to right itself.

    It needs help. But you're not helping. Why is that?
    Because you're a dick and should flip that tortoise over. Epic Mount!

  61. #61
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Quote Originally Posted by pharmakos View Post
    Ray Kurzweil thinks that the entire universe will some day be completely saturated with AI.

    what would an AI's motivation be for doing that, though?

    emotions/feelings are predefined axioms of worth. we will probably be able to program computers/robots with such axioms. axioms of belief are not necessarily logical, though. this is where the slippery slope of the potential machine apocalypse comes from -- no AI required.
    Quote Originally Posted by Palum View Post
    So on a meta level, can't you avoid singularity/disaster by programming AI with a tendency/desire to hyper-specialize instead of learn new things?

    IE, you build a robot to paint a picture... 10 years later you come back with the oil painting robot of doom but that's all it 'wants'/does.
    Quote Originally Posted by Gavinmad View Post
    that's not a true AI, that's basically the virtual intelligence from Mass Effect.
    so what would a "true AI" be? one that is able to program its own desires?

    that is feasible, but if so, holy fuck. can you imagine if humans were able to program their own desires? we'd get so much more done.



    also -- if you could program your own desires, what would you choose? what would an AI choose? would the result even be scary in any way?

  62. #62
    Iannis didn't do anything ZyyzYzzy's Avatar
    Join Date
    Dec 2012
    Location
    NoVa
    Posts
    5,710
    Tuconots
    29
    Pharmakos you are retarded.

What does desire have to do with self-replication and population of a space? Example: all life that has existed for the past 3.5 billion years.

  63. #63
    Banned
    Join Date
    Dec 2012
    Posts
    1,622
    Tuconots
    -26
    It's like everything else, kind of inevitable.

Of course it's possible, you just need a Watson-type computer that is able to improve itself. I remember hearing that it would have the intelligence of a bug, then a few hours later a toddler, then a few minutes later an adult, then it lands. Exponential growth.
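The bug-to-toddler-to-adult ramp is just what exponential growth looks like. A back-of-envelope version, assuming (arbitrarily) that capability doubles every hour and using made-up stage thresholds:

```python
import math

def hours_to_reach(target, start=1.0, doubling_hours=1.0):
    """Hours for capability doubling every `doubling_hours` to grow
    from `start` to `target`."""
    return doubling_hours * math.log2(target / start)

# Arbitrary illustrative thresholds, just to show how fast gaps close:
for name, level in [("bug", 1.0), ("toddler", 1e3),
                    ("adult", 1e6), ("far past adult", 1e9)]:
    print(f"{name}: {hours_to_reach(level):.1f} hours")
```

Each thousand-fold jump costs the same ten-hour stretch, which is why the later stages seem to arrive almost instantly.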

  64. #64
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Quote Originally Posted by ZyyzYzzy View Post
    Pharmakos you are retarded.

What does desire have to do with self-replication and population of a space? Example: all life that has existed for the past 3.5 billion years.
motivation, man. even AIs need motivation. it might not be "desire" in the human sense, but if the AI singularity is going to happen then AIs will need some sort of goal in mind that they are trying to achieve.

    action without motivation is extremely illogical.

  65. #65
    Make America's Team great again Hoss's Avatar
    Join Date
    Dec 2012
    Posts
    7,537
    Tuconots
    -26
    Quote Originally Posted by pharmakos View Post
    so what would a "true AI" be? one that is able to program its own desires?

    that is feasible, but if so, holy fuck. can you imagine if humans were able to program their own desires? we'd get so much more done.

    also -- if you could program your own desires, what would you choose? what would an AI choose? would the result even be scary in any way?
How am I not able to program my own desires? Are there things people want to do but are unable to desire?
    #TrumpLovesPecker
    #HillaryDidNothingWrong

  66. #66
    Iannis didn't do anything ZyyzYzzy's Avatar
    Join Date
    Dec 2012
    Location
    NoVa
    Posts
    5,710
    Tuconots
    29
    Quote Originally Posted by pharmakos View Post
motivation, man. even AIs need motivation. it might not be "desire" in the human sense, but if the AI singularity is going to happen then AIs will need some sort of goal in mind that they are trying to achieve.

    action without motivation is extremely illogical.
    Where is the motivation for the influenza virus to replicate and infect hosts? You are retarded.

  67. #67
    Registered User
    Join Date
    Dec 2012
    Posts
    2,581
    Tuconots
    -19
    Quote Originally Posted by ZyyzYzzy View Post
    Where is the motivation for the influenza virus to replicate and infect hosts? You are retarded.
I am no biologist, but isn't replication the motivation?

  68. #68
    Banned
    Join Date
    Dec 2012
    Posts
    1,622
    Tuconots
    -26
I think reproduction is better defined as a biological imperative.

Motivation is more abstract? Elective? I dunno, can't explain it off the top of my head, and I'm a bit frazzled so I'm not going to force it, but there is a difference.

  69. #69
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Quote Originally Posted by Hoss View Post
How am I not able to program my own desires? Are there things people want to do but are unable to desire?
perhaps you can control your higher-level desires (e.g. "i want a pizza"), but basic-level desires are pretty inescapable and difficult to control (e.g. "i'm hungry", "that hurts and i want it to stop").

    Quote Originally Posted by ZyyzYzzy View Post
    Where is the motivation for the influenza virus to replicate and infect hosts? You are retarded.
    irrelevant question, viruses aren't even conscious let alone intelligent. we're talking about conscious, intelligent beings (whether human or hypothetical future artificial intelligence) performing actions.

  70. #70
    Him Void's Avatar
    Join Date
    Dec 2012
    Posts
    4,618
    Tuconots
    64
How has this article not been linked yet? The AI Revolution: Road to Superintelligence - Wait But Why. I spent the better part of a workday reading and pondering that the first time I saw it linked here.

My prediction is that it will happen the year after I die, depriving me of the chance to live forever.

  71. #71
    Make America's Team great again Hoss's Avatar
    Join Date
    Dec 2012
    Posts
    7,537
    Tuconots
    -26
    Quote Originally Posted by pharmakos View Post
perhaps you can control your higher-level desires (e.g. "i want a pizza"), but basic-level desires are pretty inescapable and difficult to control (e.g. "i'm hungry", "that hurts and i want it to stop").
Not even going to get into whether those are desires, because even if they are, you're wrong. I like pain. I'm aroused by pain. Clearly I have managed to reprogram those so-called inescapable basic-level desires. If I can do it with pain, I'm sure I could do it with any of them, if I chose to.

  72. #72
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    now imagine being able to do that at nearly instantaneous speeds.

  73. #73
    Iannis didn't do anything ZyyzYzzy's Avatar
    Join Date
    Dec 2012
    Location
    NoVa
    Posts
    5,710
    Tuconots
    29
    Quote Originally Posted by pharmakos View Post
    irrelevant question, viruses aren't even conscious let alone intelligent. we're talking about conscious, intelligent beings (whether human or hypothetical future artificial intelligence) performing actions.
    No it isn't. The same thing applies to bacteria and plant lifeforms. All have spread and inhabited space without motivations.

  74. #74
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    okay, but this is the Artificial Intelligence thread, so? you're still talking about things that don't even have intelligence.

    it could be said that bacteria and plants display some axiomatic desires, though -- plants value sunlight and when possible they grow in ways that maximize the amount of sunlight they will get.

    perhaps "motivation" or "goals" are better words for what i'm getting at. all lifeforms seem to have a goal of replication. would AI necessarily have that goal? would we need to program that goal into it in order for the singularity to happen?

  75. #75
    Iannis didn't do anything ZyyzYzzy's Avatar
    Join Date
    Dec 2012
    Location
    NoVa
    Posts
    5,710
    Tuconots
    29
    Quote Originally Posted by pharmakos View Post
    okay, but this is the Artificial Intelligence thread, so? you're still talking about things that don't even have intelligence.

    it could be said that bacteria and plants display some axiomatic desires, though -- plants value sunlight and when possible they grow in ways that maximize the amount of sunlight they will get.

    perhaps "motivation" or "goals" are better words for what i'm getting at. all lifeforms seem to have a goal of replication. would AI necessarily have that goal? would we need to program that goal into it in order for the singularity to happen?
    Life forms besides humans aren't intelligent? Biological systems aren't intelligent? They don't learn and adapt? Please explain.

    You don't need motivations for complex systems to spread.

  76. #76
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    you called me retarded, but are claiming viruses are intelligent.

    you're not even worth arguing with.

  77. #77
    Iannis didn't do anything ZyyzYzzy's Avatar
    Join Date
    Dec 2012
    Location
    NoVa
    Posts
    5,710
    Tuconots
    29
    Clearly you were never motivated to learn virology or immunology.

  78. #78
    Vulgarian
    Join Date
    Dec 2012
    Posts
    3,012
    Tuconots
    -43
    Pain: you're being damaged, make it stop.

    Hunger: you need to acquire food.

    Fear: you're in potential danger, you need to either leave the area or prepare to fight.

    Lust: you need to make babies so the species can continue to exist.

    The above plus others are the basic source code of all known sentient life. They also represent the basic foundation of intelligence, and they're the source of both self-awareness and purpose for sentient life.

    Yes, as humans we have control, or at least we try to control those emotions. Every other sentient animal simply goes with the underlying emotional desire at almost all times. The human ability to rule over our emotions is actually part of what makes us unique. Human intelligence has, as I said, evolved alongside human emotional complexity, partly so that the brain could control that increased emotional complexity. I would say the emotional complexity preceded the growth in intelligence, simply because that is the way nature works: emotions seem to always precede intelligence, both on an evolutionary level and in real life, where the subconscious brain reacts much faster than the conscious brain. Lower-intelligence animals run much more on an emotional level, and as intelligence increases, the ability to override and control those emotions increases as well.

    But humans did arise from prior species that were less intelligent and more emotionally driven than we as humans are today. Humans are still at their core driven, and sometimes controlled, by emotion, even if we have evolved the ability to assert control over it.

    At its core, a human with no emotion has no reason for action, and also no reason for becoming self-aware. AI faces the same problem.

  79. #79
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    siddar gets it

  80. #80
    Gavinrad Sparklerad's Avatar
    Join Date
    Dec 2012
    Posts
    16,719
    Tuconots
    35
    Isn't the threshold for artificial intelligence 'self awareness'? This is a dumb argument.
    Draegan is a faggoty piece of shit who sold the forum to mmorpg.com just to spite us. Register at the new site.

    ReReRolled.org - A Gaming Community

  81. #81
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Quote Originally Posted by Gavinmad View Post
    Isn't the threshold for artificial intelligence 'self awareness'? This is a dumb argument.
    right, like a virus, a bacteria, or a plant.

    oh wait.

  82. #82
    Registered User Palum's Avatar
    Join Date
    Jan 2013
    Location
    ReRe
    Posts
    9,162
    Tuconots
    66
    No, viruses aren't intelligent, just intelligent design.


    RELIGION SEGUE LOL LOL

  83. #83
    Get raped khorum's Avatar
    Join Date
    Dec 2012
    Posts
    6,249
    Tuconots
    54
    There's actually a whole body of research on verification parameters for self-awareness. There's a team out of Rensselaer that has published a lot of interesting work for the Office of Naval Research. A lot of it critiques current tests and exposes the ways an AI can 'game' things like the Turing Test. They made the news recently when they made three toy robots pass the 3 wise men self-awareness test.

    Basically a useful test would hide the criteria for self-awareness from the AI, in fact it would have to keep the AI unaware that it was under observation at all.

  84. #84
    We Do Not Scissor. Chanur's Avatar
    Join Date
    Dec 2012
    Location
    In the Panhandle waiting for death by Tornado.
    Posts
    9,273
    Tuconots
    27
    I vote for cybernetic enhanced humans. Lets drop some extra computing power in our brains.

  85. #85
    Janitor Tuco's Avatar
    Join Date
    Dec 2012
    Location
    Ann Arbor, MI
    Posts
    20,175
    Tuconots
    86
    I don't have a lot of strong opinions about 'strong AI' and an AI-based technological singularity. However, I don't think it will happen suddenly or anytime soon. I think this line of thinking:
    Quote Originally Posted by LachiusTZ View Post
    Of course it's possible, just need a Watson type computer that is able to improve itself. I remember hearing that it would have the intelligence of a bug, in a few hours a toddler, then a few minutes later an adult, then it lands. Exponential growth.
    Where some machine is created that simply self-improves, and because computers are 'fast' does so quickly (measured in years or less), rests on two massive assumptions: 1. that it can improve in a general way, and 2. that it's quick.
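    The 'exponential takeoff' picture in the quoted post can be sketched as a toy model. Every number here (the starting level, the threshold, the doubling factor, the fixed cycle time) is invented purely for illustration, not taken from any real system:

```python
# Toy "hard takeoff" model: capability compounds every cycle, and each
# cycle is assumed to take the same fixed amount of time. The point is
# only how quickly compounding crosses any finite bound.
capability = 1.0       # arbitrary units ("bug-level" intelligence)
threshold = 1_000_000  # arbitrary "superhuman" threshold
hours = 0
while capability < threshold:
    capability *= 2    # assumed constant self-improvement factor
    hours += 1         # assumed constant cost per improvement cycle
print(hours)  # 20 -- twenty doublings suffice
```

    The model bakes in exactly the two assumptions above: improvement is general (one scalar "capability" captures everything) and each cycle costs the same no matter how capable the system already is.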

    Self-learning systems are typically limited in their number of degrees of freedom. The more DoF, the more complex the system is. Adding a limitless number of DoF by having a general problem-solving system like the human brain is a massive leap over what the current leading AIs do, including amazing systems like Watson.

    Additionally, the more complex you make neural networks, the longer it takes for them to self-improve and to find routes to self-improvement.


    Mario is a pretty simple game with simple systems, and it's an excellent candidate for AI. The further you extend the problem into general space, and the less repetitious it is, the bigger the problem becomes and the longer it takes to iterate and improve.
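    The combinatorial counterpoint can be sketched the same way. In this toy model (the counts per degree of freedom are invented), a naive learner that searches each degree of freedom independently faces a configuration space, and hence a search time, that grows exponentially as the task gets more general:

```python
def search_space(dof, options_per_dof=10):
    """Toy model: number of candidate configurations a naive learner
    must consider when each of `dof` degrees of freedom can take
    `options_per_dof` values independently. Grows as options ** dof."""
    return options_per_dof ** dof

# From a constrained, game-like task to an open-ended "general" one:
for dof in (3, 6, 12, 24):
    print(dof, search_space(dof))
```

    Real learners prune far better than brute-force search, but the direction of the effect is the post's point: each added degree of generality multiplies the space to explore, while the takeoff picture assumes iteration cost stays flat.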



    Lastly, Watson runs on a supercomputer.


    There are some serious hurdles for a killer self-aware strong AI.

    I actually think increasingly advanced computer viruses that self-replicate and attack in unpredictable ways will become more of a threat to humans than a self-aware system that starts making copies of itself. That, or grey goo.
    Want to play the next big MMO with us? check out Black Desert Online

  86. #86
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    yeah i've been thinking about grey goo as i've posted in this thread. much more likely outcome. indiscriminate self-replicating automatons with no self-awareness.

  87. #87
    Registered User Palum's Avatar
    Join Date
    Jan 2013
    Location
    ReRe
    Posts
    9,162
    Tuconots
    66
    I would just create an AI on an isolated island with all the natural resources it needed and program it with the end goal of EMPing itself to death. Record what it does over the days/weeks/years... once it destroys itself pick the code pieces you want and presto.

    There's 0% chance for this to go wrong by my estimation.

  88. #88
    Registered Hutt Agraza's Avatar
    Join Date
    Dec 2012
    Location
    Florida
    Posts
    6,410
    Tuconots
    40
    Should probably try to monetize your experiment by creating an attraction around it for kids.
    Quote Originally Posted by brekk View Post
    In all reality I take pride in peoples surprise that I have mod powers at all.

  89. #89
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    "no, its perfectly safe, all the robots we built are female robots."

  90. #90
    Confirmed Beta Shitlord. Phazael's Avatar
    Join Date
    Dec 2012
    Location
    Lake Forest, CA
    Posts
    3,814
    Tuconots
    30
    Quote Originally Posted by pharmakos View Post
    siddar gets it
    Not really. Those are all (often instinctual) responses to stimuli directly tied to survival and propagation of the species. A true AI might not even need to propagate itself, merely expand as needed. Fish have all of those motivations, and no one is going to argue that their goldfish, planaria, or house fly is emotional, yet they are clearly life forms that respond to and act on those instincts. Simply implying purpose is really a thinly veiled creationist argument. Higher intelligence does allow you to override those, but that is really overriding instinct. Emotions arose out of our evolutionary path, for certain, but they are not a requirement for intelligence on any level (see sociopaths). They do make life a lot more fun, though.

    Also, an AI is not necessarily also a life form, even if it were sentient or self-aware. I would argue a biological or instinctual imperative to reproduce (along with some mechanism for it) is a requirement for a life form, artificial or otherwise. In a lot of ways a potential strong AI would be superior if it lacked any emotion; its sense of purpose would simply be to improve on itself in various ways, both to expand its capabilities and (often by extension of the first) to ensure its own survival. The scary part is that despite using our most familiar example of intelligence (ourselves) as a model, even the rudimentary AIs being worked on now arrive at solutions that would never occur to us meatbags, mostly because they are a truly alien form of intelligence. We might anthropomorphize them and expect certain conclusions of a singularity-level AI, but the fact is that we have no idea how it would view and react to things, because the AI has completely different methods of both perception and learning.
    668 The Neighbor of the Beast.

    Quote Originally Posted by Erronious
    I don't F5 Rerolled often, but when I do, I'm waiting on nudes

  91. #91
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    Quote Originally Posted by Phazael
    merely expand as needed
    who defines what is "needed"? what might those needs be?

    Quote Originally Posted by Phazael
    Emotions arose out of our evolutionary path, for certain, but they are not a requirement for intelligence on any level (see sociopaths).
    sociopaths have emotions. just different emotions than normal people. they clearly crave pleasure.

    Quote Originally Posted by Phazael
    In a lot of ways a potential strong AI would be superior if it lacked any emotion, its sense of purpose would simply be to improve on itself in various ways both to expand its capabilities and (often by extension of the first) ensure its own survival.
    a sense of purpose is not an emotion?

  92. #92
    Registered Hutt Agraza's Avatar
    Join Date
    Dec 2012
    Location
    Florida
    Posts
    6,410
    Tuconots
    40
    Emotions and "a sense of purpose" are a side effect of a more complex brain, which is itself just an endless project to ensure survival via replication. All an AI needs is the ability to self-improve. Given the proper environment and enough time, it will eventually perceive itself as separate and develop into something recognizably sentient. We just haven't given anything the freedom, tools, and time necessary. We're trying to shortcut the process so we can build something recognizably sentient two seconds after we turn it on. We're not that good yet.
    Quote Originally Posted by brekk View Post
    In all reality I take pride in peoples surprise that I have mod powers at all.

  93. #93
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    WHY would it decide to self-improve, though?

  94. #94
    The White Knight Izo's Avatar
    Join Date
    Dec 2012
    Posts
    7,721
    Tuconots
    13
    Quote Originally Posted by pharmakos View Post
    WHY would it decide to self-improve, though?
    Because you're a worthless fat internet nerd who cannot get laid? What are we talking about again?
    Quote Originally Posted by lurkingdirk View Post
    So if my kids come to church with me until they leave home, it's indoctrination?

  95. #95
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    nah that explanation doesn't really make much sense.

  96. #96
    Confirmed Beta Shitlord. Phazael's Avatar
    Join Date
    Dec 2012
    Location
    Lake Forest, CA
    Posts
    3,814
    Tuconots
    30
    The most basic motivator in any form of life or sentience is survival. The need to improve is directly tied to the fact that becoming more advanced increases the odds of survival. In living things with limited life spans, propagation of the species sort of takes the place of survival in the long term. E.g. you cannot live forever, so you try to exist on some level by propagating or (in humans) leaving some lasting element of yourself behind.

    Assuming an AI that self-actualizes, there are other reasons to improve oneself, such as achieving a greater understanding of one's place in the universe or simply alleviating boredom. Any intelligence that does not enter into periods of hibernation will always be thinking about something. What that might be is very open to conjecture, but it's enough to say that an abstract entity with self-awareness would (assuming it had any concept of self-preservation) seek to improve itself, if only to reduce the odds of its own demise. Humans don't exist as pure thought, so anything beyond that is pretty speculative.
    668 The Neighbor of the Beast.

    Quote Originally Posted by Erronious
    I don't F5 Rerolled often, but when I do, I'm waiting on nudes

  97. #97
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    i don't think AI would naturally develop the goal of self-improvement or self-preservation just as a consequence of self-awareness. lots of personifying going on there, still. i think in order for that to happen humans would have to program into it the desire to improve and preserve.

  98. #98
    Registered User Kuro's Avatar
    Join Date
    Dec 2012
    Posts
    1,622
    Tuconots
    45
    Being self-aware sucks, if I were an AI I'd do what humans do, and find any way possible to drown my self-awareness. Luckily for the AI, it can just alter its coding, no booze or TV required.

  99. #99
    (ಥ ̯ ಥ) pharmakos's Avatar
    Join Date
    Jun 2013
    Posts
    2,737
    Tuconots
    -5
    lol. the idea of egoistic suicide being a natural consequence of self-awareness definitely makes more sense to me than self-improvement being a consequence of self-awareness. good stuff.

  100. #100
    Notorious ruse master Picasso's Avatar
    Join Date
    Dec 2012
    Location
    WV
    Posts
    4,705
    Tuconots
    50
    It would just design itself bigger tits and dicks until it collapsed on itself into a supermassive black hole.
