TakeOnIt
Compare opinions of world-leading experts and influencers.

Is a technological singularity likely?

A technological singularity is the emergence of a super-human intelligence made possible by technological advances. Several technologies are potentially heading in this direction, most notably artificial intelligence, but also others such as brain-computer interfaces and genetic engineering. Such an intelligence could in turn create an even greater intelligence, leading to explosive, unpredictable, but undoubtedly radical change to society.

Experts and Influencers

Agree

Ray Kurzweil    Inventor, Artificial Intelligence Researcher
Agree
We are entering a new era. I call it "the Singularity." It's a merger between human intelligence and machine intelligence that is going to create something bigger than itself. It's the cutting edge of evolution on our planet. One can make a strong case that it's actually the cutting edge of the evolution of intelligence in general, because there's no indication that it's occurred anywhere else.
25 Mar 2001    Source


Eliezer Yudkowsky    Artificial Intelligence Researcher
Mostly Agree
What would humans with brain-computer interfaces do with their augmented intelligence? One good bet is that they’d design the next generation of brain-computer interfaces. Intelligence enhancement is a classic tipping point; the smarter you get, the more intelligence you can apply to making yourself even smarter. ... This positive feedback cycle ... rapidly surges upward and creates superintelligence (minds orders of magnitude more powerful than human) before it hits physical limits.
30 Sep 2007    Source


Experts In Cognition


Robin Hanson    Economics Professor
Mostly Agree
...the economy that awaits our grandchildren [I expect] to follow a societal discontinuity more dramatic than those brought on by the agricultural and industrial revolutions. ... [The arrival of machine intelligence on a human level] could produce a singularity--an overwhelming departure from prior trends, with uneven and dizzyingly rapid change thereafter. A future shock to end future shocks.
01 Jun 2008    Source

Sub-Arguments Of This Expert:
Could a computer ever be conscious?
   Agree

Disagree
Experts In Cognition


Steven Pinker    Psychology Professor
Disagree
There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles--all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.
01 Jun 2008    Source

Sub-Arguments Of This Expert:
Could a computer ever be conscious?
   Neutral

Experts In Science


Jeff Hawkins    Neuroscientist, Inventor of the Palm Pilot
Disagree
If you define the singularity as a point in time when intelligent machines are designing intelligent machines in such a way that machines get extremely intelligent in a short period of time--an exponential increase in intelligence--then it will never happen. Intelligence is largely defined by experience and training, not just by brain size or algorithms. It isn't a matter of writing software. Intelligent machines, like humans, will need to be trained in particular domains of expertise.
01 Jun 2008    Source

Sub-Arguments Of This Expert:
Could a computer ever be conscious?
   Mostly Agree

Neutral

Douglas Hofstadter    Professor of Cognitive Science
Neutral
It might happen someday, but I think life and intelligence are far more complex than the current singularitarians seem to believe, so I doubt it will happen in the next couple of centuries. [The ramifications] will be enormous, since the highest form of sentient beings on the planet will no longer be human. Perhaps these machines--our 'children'--will be vaguely like us and will have culture similar to ours, but most likely not. In that case, we humans may well go the way of the dinosaurs.
01 Jun 2008    Source


Experts In Cognition


David Chalmers    Philosophy Professor
Neutral
Will there be a singularity? I think that it is certainly not out of the question, and that the main obstacles are likely to be obstacles of motivation rather than obstacles of capacity.
02 Apr 2010    Source


Experts In Media


Time Magazine    Popular Magazine
Neutral
...even though it sounds like science fiction, it isn't, no more than a weather forecast is science fiction. It's not a fringe idea; it's a serious hypothesis about the future of life on Earth. There's an intellectual gag reflex that kicks in anytime you try to swallow an idea that involves super-intelligent immortal cyborgs, but suppress it if you can, because while the Singularity appears to be, on the face of it, preposterous, it's an idea that rewards sober, careful evaluation.
11 Feb 2011    Source



Comments

0 Points      MTC      23 Feb 2013      Stance on Question: Agree
The question makes no mention of time. Given enough time, one of the definitions of the singularity will happen; it is inevitable.


0 Points      Nashhinton      06 Oct 2011      Stance on Question: Mostly Agree
The technological singularity is likely to happen in the middle of the 21st century.


0 Points      the27th      27 May 2010      Stance on Question: Mostly Disagree
"Smarter than human" is a vague concept. We don't consider powerful calculating machines to be smarter than us. For the purpose of the Singularity, to get accelerating technological growth, we must mean inventing a an AI that is better at inventing new technologies than we are.

I think we are far away from inventing a machine that can invent its own machines. AI research today focuses on getting machines to accomplish the same tasks of perception and learning that humans (and animals) do. I think it's clear that we'll get much better at machine perception. In my lifetime we may well get a machine that comprehends language the way humans do. But I don't know if anyone's even working on machines that can innovate. That may be a severely hard problem.


1 Point      Adam Atlas      26 Apr 2010      Editorial Comment
Should this perhaps be separated into a few different questions, for the divergent ways people have used the term "singularity" as explained here and here? I was considering adding a quote from a post by PZ Myers where he harshly criticizes the idea of "the singularity", but he is focusing entirely on Kurzweil's idea of predictably accelerating progress, and he says nothing about the likelihood or desirability of a smarter-than-human AI undergoing an intelligence explosion (which appears to be the focus of this question's description).


1 Point      Benja      28 Apr 2010      Editorial Comment
Comments can now have links, using the markdown format. I edited your comment to make use of this feature.


0 Points      Adam Atlas      28 Apr 2010      Editorial Comment
Ah, cool. I was just about to submit that as a feature request, actually. :) Is the rest of Markdown also available?


0 Points      Benja      28 Apr 2010      Editorial Comment
Which markdown features do you want the most?


1 Point      Adam Atlas      28 Apr 2010      Editorial Comment
Mainly just bold and italic. Blockquotes would be useful too; might be better than graying/italicizing everything written between quotes.


1 Point      Benja      28 Apr 2010      Editorial Comment
"Mainly just bold and italic."
I'm sorry but that feature is just too hard.






0 Points      Benja      26 Apr 2010      Editorial Comment
Yes, the usage for the existing question should be for intelligence explosion only. Perhaps Kurzweil's quote should be removed if he's been misquoted.

Perhaps separate questions could be added for the other two kinds of singularity. The reason I didn't add them was that the "intelligence explosion" version seemed to be the one that was the most debated.



1 Point      Adam Atlas      05 Apr 2010      Stance on Question: Agree
I can see several paths to a Singularity (I'm working under the intelligence explosion interpretation). If we create an Artificial General Intelligence that is smarter than humans — smart enough that it has significant advantages over humans in skills like programming and engineering — then we get an intelligence explosion pretty much right away.

Or if we get to the point where we have enough processing power and brain-scanning precision to perform whole-brain emulation, we could create a million copies of the world's best programmers and have them work on improving their own software, a million copies of the world's best engineers and physicists and have them work on improving their own hardware, or a million copies of the world's best cognitive scientists and computer scientists and have them work on reverse-engineering the ingredients of general intelligence and then implementing it. And so forth. (Personally, if we're going to go that route, I think a million copies of Eliezer Yudkowsky would work best.)

Similar things could also happen if powerful Intelligence Amplification techniques are developed, though I think the end goal should remain Friendly AI, with IA/uploading as secondary means to that end.

Anyway, if we can keep ourselves from going extinct for long enough, a Singularity of some kind is just about inevitable, though not as a prophecy of a utopian future; indeed, its inevitability necessitates serious thought and research on how we can make sure we get it right the first time.


0 Points      Benja      05 Apr 2010      General Comment
"I think a million copies of Eliezer Yudkowsky would work best".

Definitely an upside here for the reproductive success of nerds.



0 Points      JGWeissman      06 Mar 2010      Stance on Question: Agree
Technology is advancing rapidly, and each advance feeds back into the rate of further advancement. Unless we destroy ourselves first, this feedback will lead to a singularity.