Singularity – Is THIS our future?
- Doreen Peri
- Site Admin
- Posts: 14598
- Joined: July 10th, 2004, 3:30 pm
- Location: Virginia
- Contact:
Singularity, aka "The Singularity" ...
is this our future?
"In future studies, a technological singularity (also referred to as just "the Singularity") is a predicted future event when technological progress and societal change accelerate due to the advent of superhuman intelligence, changing our environment beyond the ability of pre-Singularity humans to comprehend or reliably predict. This event is named by analogy with the breakdown of modern physics knowledge near the gravitational singularity of a black hole."
http://en.wikipedia.org/wiki/The_Singularity
http://www.ugcs.caltech.edu/~phoenix/vi ... -sing.html
" Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended.
Is such progress avoidable? If not to be avoided, can events be guided so that we may survive? These questions are investigated. Some possible answers (and some further dangers) are presented."
http://www-rohan.sdsu.edu/faculty/vinge ... arity.html
"There is no clear definition, but usually the Singularity is meant as a future time when societal, scientific and economic change is so fast we cannot even imagine what will happen from our present perspective, and when humanity will become posthumanity."
http://www.aleph.se/Trans/Global/Singularity/
What IS Singularity?
http://www.singinst.org/what-singularity.html
Oh, there are many more links. Just do a google search for singularity or "the singularity."
Scary stuff!
To me, if evolution leads us this way, I want no part of it! We were better off without technology, entirely. My hands and arms and eyes and mind and imagination are already WAY too connected to this freaking box!
Opinions, please?
Is it all sci-fi riff raff? Why is so much money being invested in developing this type of concept? From Microsoft to NASA to govt funded scientific research & development... on and on. It's all really creepy to me!
- Axanderdeath
- Posts: 954
- Joined: December 20th, 2004, 9:24 pm
- Location: montreal or somewhere in canada or the world
- Zlatko Waterman
- Posts: 1631
- Joined: August 19th, 2004, 8:30 am
- Location: Los Angeles, CA USA
- Contact:
Dear Doreen:
The AI fanatics have been itching to "get out" of our human state for decades.
It's true that humans have made a bit of a mess of things: look at George W. Bush in his presidency . . .
Joel Garreau has an interesting book and interesting articles to read on this topic. He approaches things from a humanistic perspective, not from a machine-oriented view:
http://www.amazon.com/gp/product/038550 ... s&v=glance
and Amazon.com also recommends a "The Singularity" book along with Garreau's.
My advice? Stay human.
--Z
- Doreen Peri
- Site Admin
- Posts: 14598
- Joined: July 10th, 2004, 3:30 pm
- Location: Virginia
- Contact:
I never want to get out of my human state and wouldn't wish that on any human being.
I find all this stuff totally creepy, especially the fact that reputable people are treating the concept as anything other than sci-fi and throwing lots of money at it in order to push the concept forward into reality.
I can't think of any other word than creepy.
Que sera, sera? Whatever will be, will be?
Not if I can help stop it!

doreen peri wrote: Not if I can help stop it!
Yeah right, and how are you liking your chances at this stage?
A friend last night was telling me about Anne Rice's description of the roles of God and Satan as she portrays them in the fifth book (I think) of the Lestat series (Interview?).
Anyway, she was saying something about how the character Memnoch (or something) declares himself to be the Devil to Lestat. He tells Lestat that he was present at creation and describes how he was perturbed by God's plan way back in the beginning of time, all the bad stuff that God allowed, etc., and how he wanted to change it. Well, apparently, God said something like "do what you want" but first sent Satan down to experience earth for himself (I might have details foggy as this is a second-hand retelling that I haven't read myself). So Satan wanted to turn the boat around, just like the cabin boy on the ship of fools, and God said "well, you go to earth and collect up as your own anyone who questions my plan or fails to trust me".
Excuse any discrepancies but I think that's the gist.
and this is where I'm coming from: everything is beautiful and perfect. there is no cause for concern. from bad comes good -- balancing each other out.
This is the centre of the universe.
My tribe is gathered around me.
Behold me.
I AM.
and...
P.S. Sorry Dor, I haven't read your links, though I will try to have a look at some stage. Don't know much about AI but I think it's a little zealous merging man with machine. I have heard McKenna speak a little bit about some thing where our very consciousness is connected to some etherical interface whereby we can access any information at any time without a computer. Or something like that.
I don't know if that's related to this in any way, but if the reliability of my internet connection is any indication of the reliability of my microchip interface 50 yearz from now, I don't want to know about it just yet.
- abcrystcats
- Posts: 619
- Joined: August 20th, 2004, 9:37 pm
OK, I read a few of the links -- maybe enough to have questions and not enough to comprehend -- this is what I got out of it, though.
You are saying that humans will create an AI that is so complete in itself that it will fully replace human intelligence and probably even human emotions -- AND these AIs will do a better job with things in the world. Is that correct? If this was even possible (and it probably is) then why would humans create it? I ask you -- aren't we VERY egotistical and dominating creatures? Why would we voluntarily create a being better than we are? That we can create it, I believe. That we WOULD -- I doubt.
I am probably stupid and missing a whole bunch of information, but what does that have to do with getting sucked up by a black hole? It could happen. I don't see what one has to do with the other.
As for what we do with ourselves as human beings -- it's sort of like pedalling a bicycle. I can imagine pedalling a bicycle so fast that I can't even control the bike, but could I do it? Probably not. I probably couldn't pedal a bike so fast that I couldn't steer it or effectively use the brakes. The reason is that my ability to speed on the bike develops about as fast as my abilities in other areas. As much as I practice racing, I have to practice braking and turning; it's just a matter of what it takes to operate the bike.
I CAN'T imagine humans making progress go so fast that we, ourselves, couldn't stop it or control it. I think we've gotten to our ultimate point of disengagement, already. Or very close. Right now, we are like a precocious child, intellectually gifted but emotionally backwards. We can't handle it very well, NOW. We're going to coast in this position for a few hundred years until our emotional maturity catches up to our intellectual ability. OR -- our current patterns will end us.
We're at a crucial stage. We have to start some emotional and spiritual maturity, and we have to do it NOW. We have to stop overpopulating the planet. We have to start feeding the hungry. We have to start distributing the resources we have among us, and we have to start allowing the rest of the planet to stabilize from the last 100 or 200 years.
Whatever happens has to happen now, or in the very near future. If it doesn't, then we won't over-technologize ourselves; we'll just create so much of a burden on the earth that we'll exterminate ourselves by a viral plague, or we'll disrupt the earth's homeostasis to the point where it can't support us. I think it's probably going to be a disease that wipes us out. A stupid, simple thing like AIDS, or bird flu -- something that sneaks up on us and gets us before we even know it's there.
We won't do it to OURSELVES in an obvious way, like AIs. We're much too self-involved to create any kind of technology that would absolutely replace us. Partially, yes. I could see one part of the population replacing the other half that way, but I don't think it will get that far. I think nature will mess us up way before then.
I don't fear humans advancing too far for themselves to keep up. That's optimistic. I fear humans exploiting so many resources that they use them up before they know they've done it. That's what, in my opinion, is really happening.