mindstalk: (thoughtful)
A Usenet thread of mine from two years ago, on Singularity confusion, the humanism of Bujold and Pratchett, and Scottish materialist revel.

Date: 2007-09-01 23:54 (UTC)From: [identity profile] pompe.livejournal.com
Our admiration for Vinge's centrality in popular singularitarianism shouldn't stop us from critically examining what he says. His logic isn't entirely consistent: his fourth path is much looser than the previous three, which I agree make some sense. But he takes this:

"Biological science may provide means to improve natural human intellect."

and extrapolates it to this:

"When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities -- on a still-shorter time scale. The best analogy that I see is with the evolutionary past: Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work -- the world acts as its own simulator in the case of natural selection. We humans have the ability to internalize the world and conduct "what if's" in our heads; we can solve many problems thousands of times faster than natural selection. Now, by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals.

From the human point of view this change will be a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control. Developments that before were thought might only happen in "a million years" (if ever) will likely happen in the next century. (In [5], Greg Bear paints a picture of the major changes happening in a matter of hours.)

I think it's fair to call this event a singularity ("the Singularity" for the purposes of this paper). It is a point where our old models must be discarded and a new reality rules."

...which, although nicely prophetic, does not necessarily follow from the first claim. There's no radical change there; it is just an improvement. He's notably fuzzy in the rest of his paper about what that fourth path is supposed to be; there's much more on AI, networks, and human-computer interfaces. That's problematic because it means, I think, that he's missing the critical issue: _how_ do you improve the "natural" human intellect through "biological science" to the same degree that the other three paths would potentially improve some sort of global reasoning capacity, enough to enable the Singularity? For that matter, what is "greater-than-human intelligence"? Einstein and Mozart were both just humans, and both certainly had aspects of their intellects we perhaps do not particularly envy.

Then there's a fifth path, which I actually think is the most interesting one, and which he barely mentions at all. If we take the Singularity to be "a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective," to use a fairly non-loaded version, then the critical question is whether we even need greater-than-human intellects, AIs, and computer-human interfaces to set it off at all.
