chibi_tektek

Something I don't get about Singularity theory

I'm reading Ray Kurzweil's The Singularity Is Near, and there's something that I don't get about the theory of exponential growth, or even exponentially growing exponential growth.  And that's physical or historical flukes.

This cycle of machine intelligence’s iteratively improving its own design will become faster and faster. This is in fact exactly what is predicted by the formula for continued acceleration of the rate of paradigm shift. One of the objections that has been raised to the continuation of the acceleration of paradigm shift is that it ultimately becomes much too fast for humans to follow, and so therefore, it’s argued, it cannot happen. However, the shift from biological to nonbiological intelligence will enable the trend to continue.
Kurzweil, Ray (2005-09-22). The Singularity Is Near: When Humans Transcend Biology (p. 28). Penguin Publishing Group. Kindle Edition.

So what I don't get is that this is tracing trends at a very, very high level, which seems to ignore the effects of chaos at the individual level, even when those effects are significant.  In the aggregate, advancement was slow long ago, and is much faster now.  Part of that is individual processing --how fast someone can think and how much knowledge they have access to-- but part of it is also communication between individuals, especially in an era of distributed and federated intelligence.  I seem smarter, am smarter, when I have access to the internet.  Without it, I would be hard-pressed to build a shack to live in or make clothes for my body.  Without the larger distributed intelligence of civilization, I'd have a real hard time with the clothes problem.

So where that comes back to singularity theory is that the theorizing assumes that once something has happened somewhere, it's everywhere.  That once it's invented, the race has grown.  This ignores that technological advancements took decades or even centuries to spread in the past, and that historical events, flukes, could erase the advancements entirely or set them back for a long time.

So as advancement within the individual becomes much, much faster, the cost of losing that individual, of their particular intellectual wandering, becomes that much greater.  What if the Singularity met a natural disaster?  What if communities with substantially different lines of inquiry were light-years apart with no communication?  What if nanobots weren't accessible to a particular population?  Couldn't that disrupt that exponentially exponential progress pretty substantially, slow it down?
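
To make that question concrete for myself, here's a toy numerical sketch.  It's entirely mine, not anything from the book, and every number in it is made up purely for illustration: progress whose growth rate itself keeps growing (Kurzweil's doubly exponential trend), occasionally knocked back by random flukes.  In this crude model, making the flukes more frequent or more severe pushes the crossing of any fixed threshold further out, which is the kind of slowdown I'm wondering about.

# Toy sketch (mine, not Kurzweil's): doubly exponential progress with random
# "fluke" setbacks.  All parameters are arbitrary and purely illustrative.
import random

def years_to_threshold(setback_prob=0.0, setback_factor=0.5, threshold=1e12, seed=1):
    """Count simulated years until aggregate 'capability' crosses a threshold."""
    random.seed(seed)
    capability, rate = 1.0, 0.01      # arbitrary starting capability and annual growth rate
    year = 0
    while capability < threshold and year < 100_000:
        year += 1
        rate *= 1.005                 # the growth rate itself grows: doubly exponential
        capability *= 1 + rate
        if random.random() < setback_prob:
            capability *= setback_factor   # a fluke: disaster, isolation, lost knowledge
    return year

print("no flukes:      ", years_to_threshold())
print("rare flukes:    ", years_to_threshold(setback_prob=0.01))
print("frequent flukes:", years_to_threshold(setback_prob=0.05))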

Comments

I could see that happening to a widely-dispersed civilization, but I don't think the Singularity concept needs to be evenly distributed across an entire race. As I understand it, the argument is that even a relatively confined area with a small percentage of said civilization in it could "go singular", and suddenly (from our perspective, at least) have access to far greater capabilities and resources. I chose the words that I put in quotes because I'm making the analogy that you don't need to have all the uranium in the country go critical to get an immense explosion in the fraction which does.

So a Singularity might involve just part of a civilization changing phase from human to superhuman. But I want to ponder the possibilities of disasters (natural or self-inflicted) causing a Singularity to fail, sputter like a hand-cranked biplane propeller... suppose it takes many tries before a Singularity actually gets airborne?

That makes sense. Thank you for helping me to think it through. That thing you want to ponder is in fact why I'm wrestling with this now. I think that while it's well and good to look at the big-picture advances in civilizations, the lived, on-the-ground experience of those advancements is going to be fragmentary, unequal, conflict-ridden. The *stories* are in how it's not smooth.

Including the story that I'm starting to write, which depends on a chunk of human civilization getting really close to a technological singularity and freaking out about what's on the other side, sabotaging themselves into backing down from it. But, then, the real story lies in what remains - how far you'd have to break things down in order to have the plane not take off immediately on the next try, and whether the failure of that biplane engine could nevertheless lead to a different flying machine later on.