Namely, small, clever teams will be able to do big and entertaining things that were not possible before.
(But yes there will also be tons of slop.)
But a few quick thoughts: Video games have always been about cutting edge technology. Because they are interactive, they are best positioned to leverage AI tech. (unlike static media)
Prototyping tends to be the most important but most neglected process for finding the fun. AI is a catalyst for rapid prototyping such that a studio can more quickly build and assess game loops and de-risk the rest of their dev cycle before staffing up or pulling team members off other projects.
AI may long-term create more leisure time macro-economically for everyone, meaning more consumer time that may be consumed playing games. (Owen Mahoney thinks the industry will soon triple in size)
2. Games are just code that's fun. How does this sounds as a process for making something fun: "Start by de-risking." Hmmm OK, yes, this tracks with my experience of private equity companies being the most innovative and successful creators of games.
it's fascinating how this delusion persists.
More leisure will only occur if it becomes true that the marginal return of doing additional work falls almost to zero. So to say people will have more leisure time actually suggests more that they will lack opportunities to do things of value than it does that they will choose to have more leisure time. Which is depressing.
Each gang member could be running their AI value-miners at home, but of course since they're the only kind of value in the new AI-communist society they'll be the obvious target for the other leisure gangs. So after enough rounds of violence each leisure gang will run a fortified, paramilitary "intelligence mining" operation, and oh by the way indie software dev is punishable by death in these territories.
Is this scenario, like, 9 kinds of insane? Sure! But so is the idea that we'll all be at the beach doing Idunno what, fucking? All this to say yeah, I'm with you that anyone who describes that AI will make a future defined by a lack of productive work is describing a depressing future...
You have not supported this argument.
On the contrary, I think LLMs are not a big help.
We've already been here. Procedural generation was the magical solution that was supposed to help a single dev make giant worlds. And it did. It helped you make giant, utterly empty and soulless worlds.
Great for minecraft. Not so great for No Mans Sky, as there were significant limitations. Useless for anything that depends on a story or immersion in that story or characters.
This idea that you can wire an in game character up to an LLM is misguided and doesn't seem to understand what players want a character to be.
In Mass Effect 2, a fan favorite character infamously had little content for most of the game. Garrus was a very loved character, including being the love interest for a lot of people who played the game, but during 2 he just sits in a part of your ship and says "Sorry, can't talk, running some calibrations" for almost the entire game.
Put the entire game's script into an LLM as context and have it pretend to be Garrus and try and talk to it. Will the LLM take a strong stance that Garrus would take? Will it correctly figure out that Garrus would rather kill a bad guy than let him get away, and then have him make convincing arguments about that in reference to the previous mission, and then in the next mission put you in a situation where you have to decide whether Garrus is right and whether you should dome that "Bad Guy" instead of letting the police fail to aprehend him? Will that Garrus make you think about this world and your place in it and whether your personal or chosen morals are right?
Probably not.
People don't want to chat about the weather with random NPCs. People want characters that have character notes and integrate into the story and make you feel.
So far LLMs can't really write that well, and certainly are not cohesive and multidisciplinary enough to be able to build that into a game in a convincing way.
There's a famous game called "Facade" which plays up this big fanfare about how it uses "AI" and Natural Language Processing to bring two characters to life for you to talk with and navigate a crisis with, but it's almost entirely lies. The actual logic of how it works is almost identical to old fashioned text adventure parsers in their heyday. It's a heavily scripted sequence, with fairly few actual paths it can take, and the script is not that big. It got so much press for what was basically done in the 80s. I think some people have tried to hook LLMs into it, but it just doesn't feel good. The problems that AI dungeon adventure game always had still exist, just under more paint.
However, I’ll say that as a result of their AI integration they are also doing way more human writing in the form of prompts and other procedural elements than if they just used old fashioned dialog trees.
I think AI can only be used as an enhancement in certain specific and controlled ways.
One mistake I see a lot right now is the assumption that you can delegate design and creative direction to AI. I think that generally yields slop. In fact I think the Creative Director/design role at a game studio may be the hardest digital job for an AI to replace. I had the opportunity to express that idea to Sam Altman once and he did refute it.
But I don't see the games industry as all that vulnerable to AI at all. Game engines already drive constantly-improving dev efficiency through improved abstractions.
VC’s raised easy money (ZIRP era) and they wanted to deploy fast. Founders told VC’s what they wanted to hear to secure capital.
(I raised, with friends, in 1999, and was senior at a VC-funded startup prior to that).
Most people who comment on Hacker News would not have preferred the status quo ante of YC.
That’s an interesting point but I have to imagine all the worst founders know it too so the filtering may not have gotten easier. I’d be curious to hear from them.
There's also another way in that circumvents all the other filters - being a founder at an existing startup that has really good traction already. You can have a resume of no-name companies, no degree, and never have founded a company before, but if your business is growing, making money, and looks wildly scalable with YC's support then you can get in that way.
This makes sense in a dynamic environment with sensitive local conditions and "network lag" in the chain of command. But in more static or settled market environments it may be wiser (for investors) to focus elsewhere and restrict founder autonomy. We see this pretty commonly with successful founders who get "phased out" and replaced with more experienced managers.
I wonder how much this sort of "distributed decision-making" has been formalized and studied.
Auftragstaktik describes a clear purpose / intent. Like: capture the bridge (but: we don’t care how you do it since we can’t foresee specific circumstances)
OODA describes a process of decision making.
So, Auftragstaktik answers who decides what and why.
OODA answers how decisions are made and updated over time.
They’re complementary but different.
This was eye-opening. I used to think militaries were completely centralized and top-down, but a friend who was an officer explained this to me and pointed me to the literature. It was fascinating and educating to understand the principles behind Mission Command being successful as a method (competence, mutual trust, shared understanding, etc).
Not sure this was followed very recently.
If you believe you are being a given an order that is illegal and refuse, you are essentially putting your head on the chopping block and hoping that a superior officer (who outranks the one giving you the order) later agrees with you. Recent events have involved the commander in chief issuing the orders directly, which means the 'appealing to a higher authority' exit is closed and barred shut for a solider refusing to follow orders.
That doesn't mean a soldier isn't morally obligated to refuse an unlawful / immoral order, just that they will also have to pay a price for keeping their conscience (maybe a future president will give them a pardon?). The inverse is also true, soliders who knowingly follow certain orders (war crimes) are likely to be punished if their side loses, they are captured, or the future decides their actions were indefensible.
A punishment for ignoring a command like "execute those POWs!" has a good chance of being overruled, but may not be. However an order to invade Canada from the President, even if there will be civilian casualties, must be followed. If the President's bosses (Congress/Judiciary) disagree with that order they have recourse.
Unfortunately the general trend which continues is for Congress to delegate their war making powers to the President without review, and for the Supreme Court to give extraordinary legal leeway when it comes to the legality of Presidential actions.
It is my understanding that the Russians do it that way, which does not seem to work out great for them.
> In war the first principle is to disobey orders. Any fool can obey an order. He ought to have gone on, had he the slightest Nelsonic temperament in him.
One recent concern I have is (anecdotally) how poor a deal YC startups I've seen are offering for founding engineers.
Before YC started, early hires who helped make a startup successful could get rich on stock.
YC may have changed the investment scene for the better in some ways, such that founders are less likely to screwed by investors. But today, early hires are the ones who seem to be getting screwed.
In cases I've seen recently, even if the startup has a nice exit for founders and investors, employees would have been better off as a worker drone at a FAANG-like.
Do we need PG to write an essay (or the richest managing partners to make a video), about the value of incentivizing early hires?
Or, don't even talk about the value of it (since some aspiring founders are aggressively confident now, that they know how all the ducks are lined up), but talk about new criteria: YC looks for a respectable pool to incentivize early hires as positive signal, when determining who to fund.
Now its just a way to sell these companies to those OGs.
you raise a valid point about survivorship bias I guess, but as of the last few years it seems a lot of rage bait and do anything to get signal instead of the positive optimism that I felt a lot of the companies offered those years ago. i guess what i'm trying to say is that it has become the final version of itself as a venture firm, whereas before it was quirky (again, I am outsider!! so this is my POV) and backed companies that made products/services that I think made my life objectively better.
end of the day, I really like YC and think they do a good thing overall. but I think people/founders have realized how to game it, if that makes sense.
(* Edit: well, some do, at least, and they seem to be highly motivated to post comments!)
Of course, right now, gloom is all over most of social media, and everywhere else, for understandable reasons of world reality (besides engagement manipulation/dynamics reasons).
And, within our field, a lot of the current enthusiasm and curiosity for tech news is for getting our economic tickets, in a tech gold rush that is not nearly as optimistic as the Web one was.
Will the optimism on HN come from topics that are interesting, but detached from all that threatens us, like a break or a reminder that there is still goodness and greatness?
Will it come from finding ways to correct or fight against what threatens humanity?
Will it come from tantalizing hints of personal advantage or opportunity to wind up on top?
Another question that's orbiting around this in my head is how do we explain to the community what we're looking for and why. Knowing HN, and the internet in general, I can imagine the backlash from certain segments of the commentariat ("Oh so we're only allowed to do happy talk now?" - "Once again the corporate overlords crack the whip" - etc. etc.)
On the other hand, I also know that other sizeable segments of the community have been tired of the cynicism here for a long time - so there will be positive repercussions too.
What you wrote above is so different than what I thought I was reading above that I had to ask myself why it felt that way in the first place. Probably the "nepo founders" bit was what got me, and "fraud-aligned" didn't help either. Maybe also this is a case of the 'rebound' phenomenon where the reply comment says more clearly what was originally meant; that happens a lot too: https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
I wonder who thinks that a practice of combining sub-market salary with miniscule ISO grants (not even real equity) for early hires is a good idea for startups that want to win big.
https://news.ycombinator.com/item?id=46437148
> This reminds me of when YC seemed to be a response to the dotcom boom environment, a bit "by hackers, for hackers", to help hackers start Internet businesses. Rather than mostly only the non-hackers starting dotcoms (such as with affluent family angel investors and connections). Or rather than hackers having to spend their energy jumping through a lot of hoops, while dealing with disingenuous and exploitative finance bro types.
PG does say he actually looks for “naughty” founders as one of the key filtering traits. I link this essay in my post.
But you still have to be ethical and do what you say or it won’t be possible to grow your business long term.
The gamer in me wonders if the “ideal founder” can be described as “chaotic good”?
Pickle is one of the most egregious examples, but I've noticed a trend of these fly-by-night operations with YC backing getting found out as frauds over the last handful of years. I don't know if it's happening more or if I just wasn't aware of them or if the rate was the same but people started talking about them more.
What Pickle is doing is essentially they're falling on the wrong line of "fake it to you make it", it would be totally fine to do what they're doing (allowing pre-orders with a $200 deposit for a Q4 '26 product) if they just weren't lying about the specs. It's pretty clear they aren't going to deliver anything like what they've promised, but that is just ambition. The whole point of YC is that 1 out of 1000 of these companies are going to deliver something revolutionary and you don't get that without 1000 of them trying to do something revolutionary.
Having said that, you only need to watch the launch video to realise the CEO is total moron ("If everyone wore the same pair of glasses, what would they look like?"). But the way YC works, they don't actually have the power to tell Pickle what to do. YC are going to lose their investment on that company.
But the whole supposed point of YC is investing in people not founders. If that's the pitch and you invest in a moron, that makes you look bad too. YC should be good at telling if people are morons - that's kind of their entire job.
> But the way YC works, they don't actually have the power to tell Pickle what to do.
They get 7% of your company. They do actually have some power.
I don't feel bad for Paul Graham and his partners, I'm sure he's got his bag and then some, but from the outside it looks like it (the YC-adjacent thing, that is) lost big(-ish) when it came to riding the AI hype train.
Ah, now I know why the vibe of Humble Bundle changed.
The first ones were truly fun packages with soul. Now they're churned out without feelings, it feels all algorithms.
spacemarine1•1d ago
daedrdev•1d ago
psawaya•1d ago
daedrdev•1d ago
Related, I find it interesting is that gacha games seem to ahve the highest possible returns but almost none are made by western game companies.
trueismywork•1d ago
awkward•1d ago
dpoloncsak•1d ago
jsheard•1d ago
dpoloncsak•1d ago
I do kinda get what you mean, though. Gacha mechanics feel expected in anything western, while 'loot boxes' are still a 'feature' of some games in the east. Eastern studios have definitely noticed, though, and are running the same playbook.
dpoloncsak•1d ago
modwilliam•1d ago
dpoloncsak•1d ago
modwilliam•1d ago
dpoloncsak•1d ago
fwip•1d ago
modwilliam•1d ago
spacemarine1•1d ago
I don’t believe that ultra-predatory mechanics are long-term sustainable. They usually yield a “ring of fire” effect that creates a growing ring of users for a while but really you’re burning out all your core users and will implode. This is how many describe the original Zynga model.
Supercell (founded around the same time) has cultivated longterm ecosystems and IP by respecting their players.
EGG takes a similar long-term perspective.
YetAnotherNick•1d ago
Loughla•1d ago
bigstrat2003•1d ago
Espressosaurus•1d ago
VCs are looking for the billion dollar exit. 10s of millions is 100-1000x off what they look for
YetAnotherNick•1d ago
dpoloncsak•1d ago
I don't think they're trying to be. I think they're whale hunting, find a few high spenders, and milk them for all they're worth. Then they spin up a new IP (or license one out), rinse and repeat
chrisweekly•23h ago
(Sharing to help others bc I had to look it up.)
Analemma_•1d ago
mikepurvis•1d ago
Trying to ride that to the moon is a very different proposition from a B2B play where you sell some service that concretelt delivers $X/mo recurring value to each customer for a $Y/mo price tag, and X > Y, but Y - your costs still turns a healthy profit. If you do that right, everyone is winning and the economy as a whole grows, not at all the same as the zero-sum game that is soaking a few whales and ruining their lives.
7777777phil•1d ago
jsheard•1d ago
Fortnite is a bit of weird backwards example because the early PvE iteration had paid lootboxes, but they were scrapped in the Battle Royale spinoff which actually got popular, and eventually removed altogether. They still do things like engineering FOMO to drive sales but ironically the games monetization was the most exploitative when nobody was playing it.
But now the siren song of lootboxes is calling to them once again... https://kotaku.com/fortnite-loot-boxes-gambling-roblox-20006...
spacemarine1•1d ago
I helped Indie Fund Hollow Knight back in the day and look a them now!
spacemarine1•1d ago
mikepurvis•1d ago
I'm not an indie dev, but if I was I would happily give up a chunk of my potential profit to be listed on there, knowing the size of the market that says "oh yeah... a Devolver title, I would blind-buy this, it's probably pretty good."
mrdataesq•1d ago
guywithahat•1d ago
seizethecheese•1d ago
conartist6•1d ago
Once upon a time someone like me for whom engineering competence is a core aspect of my identity would have never considered turning my back on YC. But now I'm just embarrassed by them. The things they now think are the only things worth investing in mostly make me want to vomit, like the vibe coding casino-IDE startup. As someone who still espouses their old values rather than their new ones, I'd rather succeed on my own.
csa•1d ago
Genuine question…
Do you not think that a large percentage of (random cut off) $1b companies over the next 10 years will be AI?
And/or do you not think that the next $100b+ company will be AI-centered?
conartist6•1d ago
csa•1d ago
Largely true.
> while consumers are not actually especially eager to have all human contact progressively stripped from their lives
Hmm… I agree with this sentiment, but I think it’s mostly a straw man. There are many things that AI can do well that people will end up embracing directly or indirectly.
Medical scans is one big one, imho.
Mundane but important legal services is another.
Skillful mediation of scutwork is definitely embraced.
Good and fast simple customer service via phone or text will end up being very welcome (at least in some contexts). I realize that most people will prefer superlative human customer service, but that’s currently not a widespread available reality, especially for simple tasks.
All sorts of learning (great and essentially free tutors).
All sorts of practice (e.g., language, speeches, debates, presentations, etc.).
All of the above (and more) are things that people are using AI for right now, and they seem to be loving it.
I realize that some folks use AI tools in regressive and sometimes dehumanizing ways, but that’s not the fault of the tool, imho.
philipallstar•1d ago
People have been trying this for a long time, as it's an obvious win, but have struggled so far. Perhaps newer models will help, though.
conartist6•1d ago
You could make a customer service AI that was an advocate for the consumer, but it would likely spend the company's money liberally. So instead you'll end up with AI agents incentivized to be stingy and standoffish about admitting the company could improve, just like the humans are.
You can tutor with AI, but there's no knowing what it will teach you. It will sound as convinced of itself when it teaches you why the earth is flat as it does teaching you why the earth is round. The one thing it will certainly do is reinforce your existing biases.
You can practice with AI, but you'd learn more by posing yourself the questions.
A doctor can have AI look at medical scans, but they can't defer to AI judgement and just tell the patient "AI says you have cancer, but I don't really know or care one way or the other". So again, the skill in reading results needs to be in the doctor.
rfrey•1d ago
spacemarine1•1d ago
YC just so happens to invest super early in small teams.
So the overlap of YC and AI is inevitable. AI is not an investment genre per se but it can be used to accelerate or improve any ecosystem if used carefully and cleverly.
Since my Humble Bundle days, I’ve always been partial to small companies and small dev studios. Not all EGG companies use AI but they are all keeping tabs on the technology. Mitch Lasky has said that AI may have opened a window in which small studios may have their best shot at outsized success in recent history. Eventually the big dogs will catch up and adopt the new tech themselves but right now David has a shot at Goliath.
conartist6•1d ago
corimaith•1d ago
Ironically when media culture is at is at its healthiest is when winners are diverse and common, and more importantly smaller shows that try out new things can still break even, with periodic flops being generally tolerable. That low risk culture for attempting new ideas is precisely what creates legendary franchises later when a few of these hit everything right.
bob1029•1d ago
About 99% of the work you do on a game will wind up in the trashcan. Doesn't matter what kind of work it is. Code, audio, textures, models, map layouts, multiplayer balancing work, etc. are all susceptible in the same way. No one is safe from the chaos. It takes a lot of human energy and persistence to produce sufficient 1% content to fill up a player experience.
I'd estimate for a B2B SaaS product, the ratio is approximately the same, however you don't need such a broad range of talent to proceed. One developer with a desire to do the hard things constantly can be all you need to make it to profitability. Going from one employee to N employees in a creative venture is where things go bananas. If you absolutely must do an indie game and you need it to succeed or your internet will get cut off, you will want to strongly consider doing it by yourself. Figuring out how to split revenue and IP with other humans when you can't get the customer on the phone is a nightmare.
spacemarine1•1d ago
kjkjadksj•1d ago
jacquesm•2h ago
WhereIsTheTruth•1d ago
Gaming is pure B2C: hit driven, capital intensive, and unforgiving
spacemarine1•1d ago
There are clever gaming platforms out there and the best games seem to turn into platforms effectively.
doctorpangloss•1d ago
api•1d ago
FloorEgg•1d ago
Also I know of many successful indie games, some of which were built by one person.
I can think of so many exceptions to your point on both sides that I question your thesis as a rule.
rapidfl•1d ago
deadbabe•1d ago
throw-12-16•1d ago
Dunkey is building a game incubator of sorts and there are some interesting titles coming out of it.
duped•21h ago
Video games are a subset of entertainment which is capped in TAM by the population the game reaches, the amount of money they're willing to spend per hour on average, and average number of hours they can devote to entertainment.
In other words, every dollar you make off a game is a dollar that wasn't spent on another game, or trip to the movies, or vacation. And every hour someone plays your game is an hour they didn't spend working, studying, sleeping, eating, or doing anything else in the attention economy.
What makes this different from other markets is that there is no value creation or new market you can create from the aether to generate 10x/100x/1000x growth. And there's no rising tide to lift your boat and your competitors - if you fall behind, you sink.
The only way to grow entertainment businesses by significant multiples is by increasing discretionary income, decreasing working hours, or growing population with discretionary time and money. But those are societal-level problems that take governments and policy, and certainly not venture capital.