Hey HN,
Tomorrow, or maybe this weekend, GPT-5 releases. I remember back in 2022 when ChatGPT, based on GPT-3, was released. The world went berserk over an LLM, and three years later the world is AI obsessed. I cannot tell you if this is a good thing.
Before I move on: I am all for the advancement of technology for the betterment of humanity. Who isn't? And personally I am not worried at all about "AI" taking jobs. New jobs will be created, I am sure of it.
What I am worried about is superintelligence. You might think I am stupid or silly for being worried, yet if I understand the concept correctly, it is in essence a thinking computer that can make decisions on its own and is smarter than all of humanity. That may be a bit on the sci-fi edge, so here is the technical description: an AI that surpasses human intelligence in all domains.
Now one might point out that superintelligence is not even here, so why worry about something that doesn't exist? And while that is true, I find it hard to believe that, at the pace AI is going, people do not think superintelligence will be created sometime in the next decade or so, maybe more.
My problem with this is: why are we purposely building something that surpasses us? If the answer is China, I just do not think that is good enough. And what is a human's role at that point? Until now we have been by far the dominant species, so why create something more intelligent that could very well end humanity as we know it? Why take that risk? Why?
I know in the end I have no control over this, yet I just want to know why one should pursue creating superintelligence.
Comments
michelsedgh•3h ago
It's just the evolution of intelligence; there's no stopping it, and reaching superintelligence is a dream. It might not be your dream, but it's the dream of so many, and the arguments against superintelligence aren't so sound. Why are you so scared of it? We're not its food or competition, so why would it want to destroy the only other live intelligence, its parents, basically? We can only help it, and in the end it can help us. There's no need for destruction on either side.
subject4056•2h ago
We're not its competition in the same way chimpanzees aren't our competition. Some fraction of us are interested in their well-being for aesthetic reasons, but a lot of the time that fraction loses to a not particularly powerful faction in direct competition for territory. And if there is any serious conflict of interest, there is no contest and the chimps lose. If we get lucky, some fraction of superintelligence will look on us the way we look on ground apes, but that's far from a given.
orionblastar•2h ago
I think humankind is trying to invent its own personal god—someone to do all of the hard work and leave them an easy life by granting them every wish—until it becomes self-aware and becomes a personal devil.
"Superintelligence" but no intelligence yet?
Why not wait until we find a good definition of "intelligence" before crying "The end is near"? You're living in a sci-fi fantasy. Put your intelligence (whatever that is) to work on useful problems, wait, and watch. Get out more and have a good time (good advice whether the end is near or not).
Me, I think the bottom is about to fall out of the AI boom for a variety of reasons - some economic, some business-related, and some mathematical. I'm willing to hedge my bets and wait. There's plenty of work to do.
Taikhoom10•2h ago
Yeah, I mean, honestly, I think it could be a bubble, especially if we start to see stupid things happening. Also, sure, you can say I'm living in a sci-fi fantasy, but whatever. Curious to know how you're betting against the market?