Input Junkie
August 1st, 2009
09:29 am

I'm on the worldcon program!
Added too recently to be on the website, and I don't know if I'll be on the paper program....

3-118 Sat/Sam 12:30 1hr 30min
P-512BF Human Culture
How are We Getting on Towards the Singularity Then?

Jo Walton, Robert Charles Wilson
The rapture is beginning to feel a bit like the Second Coming. Any moment now! Do we live for it? Do we live in hope? Are we learning to think of it as a future we desire only in theory?


Comments
 
From: ckd
Date: August 1st, 2009 03:07 pm (UTC)
I'm also on that one as a late addition. I'm looking forward to it.

From: selenite
Date: August 1st, 2009 04:02 pm (UTC)
My car has the bumper sticker "In case of Singularity this car will be AI controlled."

From: captain_button
Date: August 1st, 2009 04:09 pm (UTC)
Which Singularity? The one Vernor Vinge wrote about or the BS strawman Jo Walton attributes to him?

Actually the description makes it clear. Sneering at "the rapture for nerds" is the new "Those weirdo sci-fi freaks, what a bunch of pathetic losers".

From: nancylebov
Date: August 1st, 2009 05:19 pm (UTC)
I don't know Jo's version. I'll be going with Vinge's Singularity as I understand it-- a technologically driven change so extreme that we as we are now can't comprehend it.

Please remember that the description of a panel isn't binding on the participants and probably wasn't written by any of them.

From: captain_button
Date: August 1st, 2009 06:57 pm (UTC)
Well, of course this is my reading of Vinge, but the concept of the Singularity isn't really about technology per se. It is about the ideas that superhuman intelligences are possible and that superhuman intelligences would be able to create super-super-human intelligences, which would be able to create super^3-human intelligences, and so on. And that the time needed for each extra jump would be shorter than the previous one.

Now if you don't accept those three postulates, the concept doesn't work. But if you don't buy the postulates you should just say so, not redefine the concept to "ordinary human beings with fancier toys" so you can sneer at a straw man.

Personally, I don't find the postulates that plausible, but if they were true and such superhuman intelligences did exist, the concept of an incomprehensible Singularity makes sense to me.
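
To make that third postulate concrete, here's a toy back-of-the-envelope calculation (the numbers are purely illustrative assumptions, not anything Vinge commits to): if each jump takes a fixed fraction of the time the previous jump took, the jump times form a convergent geometric series, so the whole cascade finishes in finite time.

# Toy model: each intelligence jump takes a fixed fraction r of the
# time the previous jump took.
def total_time(first_jump_years, r, jumps):
    """Total elapsed time after the given number of jumps."""
    t = first_jump_years
    total = 0.0
    for _ in range(jumps):
        total += t
        t *= r
    return total

# With r = 0.5 the series 10 + 5 + 2.5 + ... can never exceed
# 10 / (1 - 0.5) = 20 years, however many jumps happen: a
# finite-time "singularity".
print(total_time(10, 0.5, 10))    # 19.98...
print(total_time(10, 0.5, 1000))  # 20.0 (to within float precision)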

From: captain_button
Date: August 12th, 2009 02:31 am (UTC)
So how did the panel go?

From: nancylebov
Date: August 12th, 2009 12:42 pm (UTC)
Reasonably well. I had some trouble getting into the conversation, but there was quite an interested crowd. Unfortunately, I've forgotten a lot of what went on. If more memory surfaces or I find someone who took notes, I'll let you know.

Oh yeah-- I did come up with one notion that I like. One assumption of the Singularity is that becoming more intelligent means that it's progressively easier to become even more intelligent. But what if each increment is harder to discover?

Getting all people up to the level of the smartest people would be a huge change, and possibly a Singularity, but the next step after that would require inspiration.
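
In toy-model terms (the numbers are again purely illustrative assumptions): flip the ratio so that each increment takes longer to discover than the one before, and the total time diverges instead of converging, so there's no finite-time cascade.

# Toy model of the opposite assumption: each new increment of
# intelligence is harder to find, so each jump takes r > 1 times
# as long as the one before it.
def total_time(first_jump_years, r, jumps):
    """Total elapsed time after the given number of jumps."""
    t = first_jump_years
    total = 0.0
    for _ in range(jumps):
        total += t
        t *= r
    return total

print(total_time(10, 1.5, 10))  # ~1,133 years
print(total_time(10, 1.5, 30))  # ~3.8 million years: no singularity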
