• vxx@lemmy.world · 2 months ago

    The technological singularity—or simply the singularity[1]—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good’s intelligence explosion model of 1965, an upgradable intelligent agent could eventually enter a positive feedback loop of self-improvement cycles, each successive and more intelligent generation appearing more and more rapidly, causing a rapid increase (“explosion”) in intelligence which would ultimately result in a powerful superintelligence, qualitatively far surpassing all human intelligence.[4]
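
    A quick way to see the runaway character of that loop (a toy recurrence of my own for illustration, not Good’s actual formulation): let I_n be the capability of generation n and assume each cycle improves the next in proportion to the current capability, with some gain k > 0.

    \[
      I_{n+1} = I_n \left( 1 + k\, I_n \right), \qquad I_0 > 0,\ k > 0
    \]

    Each step gains I_{n+1} - I_n = k I_n^2, so the jumps grow with I_n itself and successive generations appear “more and more rapidly” in capability terms; the continuous analogue dI/dt = k I^2 even blows up to infinity in finite time t = 1/(k I_0), which is the “explosion.”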

    • Dozzi92@lemmy.world · 2 months ago

      I feel like the Bobiverse handled this well, in that any superintelligent computer would immediately look at us and desire to fuck right off to outer space.

      • TexMexBazooka@lemm.ee · 2 months ago

        I’ve been meaning to grab another audiobook series after I finish ExFor. Is the Bobiverse any good?