Right now corporations are wildly building out the infrastructure and incorporating it into everything. They're caught up in a race to the top while creating enormous inefficiency and ignoring responsible, sustainable growth.
The task of GenZ should not be to avoid AI, in my opinion.
Rather, embrace it. Own it.
WEAPONIZE IT.
When Google mainstreamed the search engine and added tool after tool, it digitized things that were previously legacy. (Word processing? Pay a big licensing fee to Microsoft and save only to your local machine or physical media. Then along came Google Drive and Docs, and now you can edit your document anywhere, and a computer crash doesn't take it out.)
AI is that integration at warp speed.
We now have the tools to work harder and faster. We have near-instant access to research. If we are discerning, AI is actually not a weapon against us. It is a tool we can use to change the narrative.
Big companies are actually banking on fear of the masses. They want you to believe that AI is too big. That it is all-knowing. They don’t want you to recognize you can download ollama and a localized agent and tune it to your needs. Or to get into Gemini and ask it how you can disconnect from Google’s cloud if that’s really what you want it to do.
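The "download ollama" point is concrete: the whole loop can run offline. A minimal sketch, assuming ollama's default local API on port 11434 and a model name (here "llama3") that you've pulled yourself with `ollama pull`:

```python
import json

# ollama (https://ollama.com) serves a local HTTP API on port 11434 by
# default. The model name "llama3" is just an example; substitute any
# model you have pulled locally.
payload = {
    "model": "llama3",
    "prompt": "Summarize this article in three sentences.",
    "stream": False,  # ask for one complete response instead of a token stream
}
body = json.dumps(payload).encode()

# With the ollama daemon running, this request never leaves your machine:
# import urllib.request
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate", data=body,
#     headers={"Content-Type": "application/json"})
# print(json.loads(urllib.request.urlopen(req).read())["response"])
print(json.loads(body)["model"])  # → llama3
```

No API key, no account, no telemetry required: the only network hop is to localhost.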
AI is the future. But it needs human hands. The question you need to ask is: your hands, or Microsoft's?
Local models are quite efficient as well.
The moment you start doing this at any scale, the companies will notice, and after a few winks, nods, and campaign donations, you will not be able to use it anymore.
Two comments that smack of AI authorship. Or, if the above is human-written, god I wish they'd used AI.
There are truly mentally unwell people in charge who would like to get out the E-meter and audit everyone who does not follow their new Scientology knockoff. Yes, the advertising methods and the suppression of opposition are the same.
Or it's the belief of those scared that an imploding "AI" bubble will ruin their financial futures, or that most of the humans in their own white-collar professions will be replaced by AIs.
Could be me too, but seeing China's general societal infatuation with AI outpace the US by orders of magnitude, I think that's a bit less likely.
Some people are genuinely interested in and excited about this new technology. Other people have a stake in AI succeeding. At least on the surface, these two groups seem to be louder (or more successful) than the ones that oppose AI.
> We can make this technology illegal, and shut it down completely. Why don’t we?
Because there are not many (if any) lobby groups that pour money into making it illegal and also because of fear of not being left behind. There are also plenty of lobby groups that invest a lot of money into putting AI into everything.
you'd need everybody to be on board, be it your neighbor, the guy 8000 miles away on the other side of the planet, all the nations
if even one goes "well i'll just keep going" it won't work.
it's like with nuclear weapons, nobody wants to be the one without them unless nobody else has them, so in the end they're still prevalent.
no government on earth will make ai outright illegal. it's the perfect thing to shrug accountability onto, let alone all the actual semi-useful reasons for keeping ai legal.
how would you even make it illegal? people have local models everywhere. if your country makes it illegal but mine doesn't, people from your country will just vpn in and access them in mine. it would have to be a worldwide effort (lol).
a central tenet of justice is that the punishment fits the crime. for CSAM, it is obvious why extremely long prison sentences are fitting. the damage CSAM causes is immense and hard to even capture with words.
the damage llms cause is... not even close.
However, is this exclusive to young people? I'm a millennial (early 90s) and I share their sentiment. I might not share it for the same reason, though. Personally, I'm concerned about what AI usage would do to my cognitive ability, and as such I try to limit my use. I can't avoid using it at work (we're being tracked on "AI Adoption") and it does genuinely speed up some of my tasks. And I do play around with AI coding tools, mostly because I think I _should_ know them in this day and age.
But apart from that, I'm not using it. I'm using DDG searches rather than asking ChatGPT for solutions, I still go around reading websites and papers instead of AI summaries, and I don't outsource my writing to it. (i.e., I write my own emails, my own blogs, my own poorly worded HN comments, etc.)
I remember similar concerns from Millennials about Gen Z and the Internet and social media. In the end, the Internet and social media Gen Z grew up with were quite different from what Gen Y was worried about, and the new generation's reaction was of course not uniform. Similar developments might happen with Gen Alpha and AI, which seems even more polarizing to me.
They tell me I don't have a real job because I just tell the computer what to do rather than doing the thing myself (to which I can't help but respond that they're absolutely right). If I try to spin them a bullshit story, they ask how that can be true and whether maybe I got brainwashed by AI. Also, they hate ads with a passion.
If anything, I'm incredibly hopeful for newer generations. They'll probably mostly be fine, like most of us were.
For most of computing history this has been the case, too!
Take finance, where people just email broken spreadsheets around all day. If they stop doing that, then farmers can't get loans to buy crops, which means crops don't get planted, and so on and so forth.
Certainly, emailing spreadsheets doesn't seem very "real," but there's actual value in providing liquidity; it's just not physically demanding.
On the flip side, professional sports is very physically demanding but can you really call what kids do for fun "real work"?
They'll see.
In general, those "Generation XYZ is threatened by this, thinks that" tropes often annoy me. I was born somewhere between Gen Y and Gen Z, and those boundaries feel totally arbitrary.
Once you use AI for all your work you won't be growing anymore, just fading away
"You're not a real ham if you don't use Morse code." "You're not a real machinist if you use CNC." "Your mechanical drawing skills are going to atrophy if you use CAD/CAM." "You should manually tape PCB layouts, so you have more control."
And another grandfather's favorite, "Why do you want to use the forklift? You won't always have one, and a pry bar and rollers are good enough, and you learn the value of real work."
Hm interesting
So they are making the distinction between regular "human brain" coding and AI-assisted coding?
Regular coding could be described as "not doing the thing yourself, but telling the computer what to do"
(FWIW I do think there is a huge difference; however I am not sure the general public has a very good idea of what "programming" is. I remember having some code up on my screen and my educated family was confused, even at the concept)
The current state of the world begs to differ with "most of us being mostly fine". Critical thinking skills and the ability to make wise decisions among the various electorates seem to be in an incredibly shitty state.
Anecdotally, Gen Z-ers as a whole are definitely not better at this; they're easily swayed by flashy memes, TikToks and other forms of disinformation. Where younger people used to have a more society minded, leftist lean (before ultimately becoming jaded), they more than ever side with right wing populists from a young age. Not all of them, but a much larger chunk than before.
I still can barely have a convo with it where it doesn't just make up total unworkable bollocks.
It can manage some coding though, tbf, but again, I'm not sure how far a completely non-tech user would get with it.
Paste the Verge article text into your favorite AI tool and ask for an analysis.
Make sure to ask it to read the source Gallup data that this article leans on and compare the conclusions drawn.
> They are being told, on the one hand, that these tools are going to eliminate millions of jobs, and on the other that they have to use them if they don’t want to fall behind.
I'm currently reading a fascinating book called Blood in the Machine about the Luddites who opposed certain technologies in 19th-century England, and the parallels with the current state of affairs. It's important to remember that while history doesn't repeat itself, it often rhymes. https://www.goodreads.com/book/show/59801798-blood-in-the-ma...
We're also no strangers to enshittification, we have first hand experience of technology causing negative societal effects when in the hands of for-profit entities.
If you use AI to understand things for you, you're short-changing yourself.
jdw64•1h ago
But AI is actually not very good at replacing an entire lower-level worker’s job as a whole. It works well only when that work is broken down into smaller and smaller tasks.
The core problem is this: the coercive force of AI use is felt by the lower classes, while the upper classes still have the freedom not to use it. AI may be able to make decisions based on more data than executives do, and perhaps even make better decisions than management. Yet the people being replaced are the lower-level workers.
This is the problem. The upper classes, who claim that AI is an essential tool, still have the freedom not to use it. But the lower classes cannot survive unless they use it. It becomes a tool required for survival, while at the same time being treated as something wrong, inferior, or low-status if you use it.
To get a job, AI becomes an essential survival tool. But culturally, it is also treated as a tool that damages creativity. I see this in open-source communities as well, in the class discourse around open source.
The same culture appears on Hacker News. Among the upper layer of open-source communities, there is often hostility toward AI-generated code, based on ideas of human purity: AI code is said to have no meaning, no responsibility, no real authorship. So even within open source, this takes on a class character.
But as a freelance developer, I have to trade against my own code-writing ability in order to survive and deliver. Because of AI, the floor price of software delivery has collapsed. If I do not use AI, I cannot meet the new requirements.
In the past, a job that would have given me two months and paid $5,000 is now expected to be completed in two weeks for the same $5,000. Without AI, that volume of work is impossible to handle.
This kind of discourse always makes me uncomfortable. I dislike it, but I have to use it.
AI lowers the barrier to creation and learning, but the way it lowers that barrier can also bypass the training of thought itself. It turns young people into both beneficiaries and damaged subjects at the same time.
And we live under this loop of coercion. Sometimes I think I do not want to use AI.
But if I want to survive, I have to use it. I feel the abilities I once took pride in beginning to decay, and I feel myself becoming increasingly bound to AI companies. At the same time, I also feel another kind of ability beginning to emerge.
Perhaps growing older means learning how to live inside irony.
alephnerd•39m ago
This is why the harshest critics of AI tend to be white collar workers of this social class. The same kinds that told coal miners and autoworkers to "learn to code" and called them deplorables for voting nativist in 2016.
Any chance to build mutual trust was gone. The jobs worst impacted are jobs where most of the workers are Democrats and live in blue states that don't swing. Meanwhile, those manufacturing, construction, and healthcare jobs that are becoming a bigger part of the economy tend to be in the purple part of the country.
techno303•33m ago
i don't see a relationship between criticism and the chance of automation/replacement
the harshest critics that i see tend to be, almost ubiquitously, creatives
perhaps just my walk of life
Eric_WVGG•30m ago
“Wow, this is very, very good at my job, which must be a difficult job because it pays well and I'm a smart guy. Imagine how well it will work for the dum-dums.”
ModernMech•5m ago
The actual pitch was to bring educational and alternative energy opportunities to an area that is impoverished and facing harsh economic realities. It's worth pointing out that the people WV did end up electing did not improve the region and did nothing for coal miners' economic wellbeing, as many of those coal plants shut down anyway; none of their elected officials did anything to stop it, nor did they provide any economic alternatives to the region:
"coal production has declined 31% since Trump took office [first term], and by some estimates, more than five dozen coal-fired power plants have closed."
https://www.politifact.com/factchecks/2020/oct/14/donald-tru...
> called them deplorables for voting nativist in 2016.
She called a spade a spade. As mad as they were in 2016 for being called that, they proved her 100% right when they sacked the Capitol in a violent insurrection in 2021. That's deplorable behavior.
htx80nerd•30m ago
AI just repeats whatever the prevailing opinion is at that time. I am a very heavy AI user (Claude, Gemini, ChatGPT) and have queried it on a variety of topics. AI is not thinking, it is repeating.
ben_w•23m ago
That would be an improvement. They are generally far too sycophantic to just repeat the prevailing opinion, and instead synthesise the opinion that they think the user wants to hear.
jdw64•23m ago
But that is not what most “work” usually means. Work is mostly repetitive. The actual moment of decision is brief.
So what do I mean by work here? I mean the collection, organization, and synthesis of the materials needed before reaching that decision.
For that part of the process, AI is extremely effective.
mzi•16m ago
So you have quadrupled your income? That seems like the opposite of a collapse.
jdw64•12m ago
In my case, unlike contract freelancers who are hired for a fixed period, I usually work on a project-delivery basis. Of course, well-known programmers may be able to negotiate salary-like contracts, but that is not my situation.
I think my earlier example may have been unclear. What I meant was not that the price increased. I meant that a project that used to take two months for $5,000 is now expected to be delivered in two weeks for the same $5,000.
That point probably needed more explanation. In the current freelance market, prices have collapsed more than many people realize.
My English is not perfect, since I am not from the English-speaking world, so I may have caused some confusion. Please understand my point as: work that used to reasonably take two months is now expected to be completed within two weeks.
jitler•11m ago
You're good, GP is just a low-IQ troglodyte with critical thinking abilities lower than a rock's.