It's the weekend. Let's do some brainstorming. What's at the forefront of your mind with regard to the human predicament, and how we move forward?
It seems pretty much everyone has gone insane, or at least that's what the narrative portrays. Why are companies, governments, and other organizations doing such stupid things?
So we've made machines that can appear to "think", by articulating linguistic constructs that make sense. In fact, there appears to be actual semantic and logical structure embedded within the neural networks of LLMs. OK, let's keep our heads on straight, keep in mind the mathematics involved, chill out, and just think about what it is that we actually want (and what "we" are, for that matter). Are we going to someday transcend language? I'm aware that I'm typing this message out to you, using language, with hopes that this form of communication will become obsolete. But what will that look like? I know that human language in its current form is severely limited.
If we're to design some future communication protocol and methodology, let's contemplate what types of things even need to be communicated. For example, what is "the news" nowadays? Could it be simply a video stream of the sunrise? Well, it would take a long time to get there. In the meantime, it should probably be something like "yeah, it's another day, and humans are doing the work necessary to put order to chaos and make things better for everyone".
From a more concrete perspective, I'd say that we've reached the point where we can transcend the current idea of what "work" is. We can see beyond the limited view of doing things for monetary profit, and see that certain systems could be implemented much better if certain forms of entity-organization and resource-allocation were at play. What is a company if it's run by a computer? What does it serve? I mean, it sounds silly to suggest that big companies with overlapping domains of operation should consolidate their operations, but we all know that's where things should be headed in many cases.
So yeah, what's your vision for the future?
mindcrime•3h ago
> what's your vision for the future?
Honestly, I consider those two pretty different questions. At the very least, I'd approach them very differently in terms of time-scale. What's "top of mind" for me is more about the short-term threats I perceive to our way of life, whereas my "vision for the future" is - to my way of thinking - more about how I'd like things to be in some indeterminate future (that might never arrive, or might arrive long after my passing).
To the first question then: what's on my mind?
1. The rise of authoritarianism and right-wing populism, both in the US and across the world.
2. The increasing capabilities of artificial intelligence systems, and the specter of continued advances exacerbating existing problems of unequal wealth / power imbalances / injustice / etc.
Combine (1) and (2) and you have quite a toxic stew on your hands in the worst case. Now I'm not necessarily predicting the worst case, but I wouldn't bet money that I couldn't afford to lose against it either. So worst case, we wind up in a prototypical cyberpunk dystopia, or something close to it. Only probably less pleasant than the dystopias we are familiar with from fiction.
And even if we don't wind up in a straight-up "cyberpunk dystopia", one has to wonder what's going to happen if fears of AI replacing large numbers of white-collar jobs come true. And note that that doesn't have to happen tomorrow, or next year, or 5 years from now. If it happens 15 years, or 25 years, or 50 years from now, the impact could still be profound. So even for those of you who are dismissive of the capabilities of current AI systems, I encourage you to think about the big picture and run some mental simulations with different rates of change and different time scales.