The term "Singularitarian" was originally defined by Extropian thinker Mark Plus (Mark Potts) in 1991 to mean "one who believes the concept of a Singularity". This term has since been redefined to mean "Singularity activist" or "friend of the Singularity"; that is, one who acts so as to bring about the Singularity.[5] Inventor and futurist Ray Kurzweil, author of the 2005 book The Singularity Is Near: When Humans Transcend Biology, defines a Singularitarian as someone "who understands the Singularity and who has reflected on its implications for his or her own life", and estimates the Singularity will occur around 2045.[1] History Singularitarianism coalesced into a coherent ideology in 2000 when artificial intelligence researcher Eliezer Yudkowsky wrote The Singularitarian Principles,[1][6] in which he stated that a “Singularitarian” believes that the singularity is a secular, non-mystical event which is possible and beneficial to the world and is worked towards by its adherents.[6] Yudkowsky described the technological utopianism at the heart of Singularitarianism as promising “apotheosis”.[7] In June 2000 Yudkowsky, with the support of Internet entrepreneurs Brian Atkins and Sabine Atkins, founded the Singularity Institute for Artificial Intelligence to work towards the creation of self-improving Friendly AI. The Singularity Institute's writings argue for the idea that an AI with the ability to improve upon its own design (Seed AI) would rapidly lead to superintelligence. These Singularitarians believe that reaching the Singularity swiftly and safely is the best possible way to minimize net existential risk. Many people believe a technological singularity is possible without adopting Singularitarianism as a moral philosophy. Although the exact numbers are hard to quantify, Singularitarianism is presently a small movement, which includes transhumanist philosopher Nick Bostrom. Inventor and futurist Ray Kurzweil, who predicts the Singularity will occur in 2045, greatly contributed to popularizing Singularitarianism with his 2005 book The Singularity Is Near: When Humans Transcend Biology .[1] What, then, is the Singularity? It's a future period during which the pace of technological change will be so rapid, its impact so deep, that human life will be irreversibly transformed. Although neither utopian or dystopian, this epoch will transform the concepts we rely on to give meaning to our lives, from our business models to the cycle of human life, including death itself. Understanding the Singularity will alter our perspective on the significance of our past and the ramifications for our future. To truly understand it inherently changes one's view of life in general and one's particular life. I regard someone who understands the Singularity and who has reflected on its implications for his or her own life as a “singularitarian.”[1] With the support of NASA, Google and a broad range of technology forecasters and technocapitalists, the Singularity University opened in June 2009 at the NASA Research Park in Silicon Valley with the goal of preparing the next generation of leaders to address the challenges of accelerating change. In July 2009, many prominent Singularitarians participated in a conference organized by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss the potential impact of robots and computers and the impact of the hypothetical possibility that they could become self-sufficient and able to make their own decisions. 
They discussed the possibility that computers and robots might acquire some level of autonomy, and the degree to which they could use such abilities to pose a threat or hazard (i.e. a cybernetic revolt). They noted that some machines have acquired various forms of semi-autonomy, including the ability to find power sources on their own and to independently choose targets to attack with weapons. They warned that some computer viruses can evade elimination and have achieved "cockroach intelligence". They asserted that self-awareness as depicted in science fiction is probably unlikely, but that there were other potential hazards and pitfalls.[8] Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous function.[9] The president of the AAAI has commissioned a study to examine this issue.[10]

Controversy

Often deriding the Singularity as "the Rapture of the Nerds",[11] some critics argue that Singularitarianism is one of many new religious movements promising salvation in a near-future technological utopia.[3] Science journalist John Horgan wrote:

Let's face it. The singularity is a religious rather than a scientific vision. The science-fiction writer Ken MacLeod has dubbed it "the rapture for nerds," an allusion to the end-time, when Jesus whisks the faithful to heaven and leaves us sinners behind. Such yearning for transcendence, whether spiritual or technological, is all too understandable. Both as individuals and as a species, we face deadly serious problems, including terrorism, nuclear proliferation, overpopulation, poverty, famine, environmental degradation, climate change, resource depletion, and AIDS. Engineers and scientists should be helping us face the world's problems and find solutions to them, rather than indulging in escapist, pseudoscientific fantasies like the singularity.[12]

While acknowledging some similarities between the Singularity and the Rapture, such as millenarianism and transcendence, Singularitarians counter that the differences are crucial. Unlike religion, Singularitarianism assumes rationalism, naturalism, evidence-based justification for belief, uncertainty of outcome, and that its cause and nature are contingent on human action; it rejects insider privilege, religious trappings, revenge against non-believers, and anthropomorphism.[11][13][14]

STS academic David Correia argues that the Singularitarian movement is encouraged and sponsored by a malevolent network of military and corporate interests in search of human enhancement technologies, and that it serves to reinforce social inequality, since the Singularity offers the conditions for permanent capitalist social relations and the bioengineering of bourgeois values.[4] Correia concludes that Singularitarianism and the broader transhumanist movement are old-fashioned eugenics with better techniques, passing themselves off as pragmatic postmodernism.[4]