If there is a risk to jobs, it wouldn't show up here, since fewer jobs are actually "good" for business...
Up to a point. Then you no longer have customers.
Prisoner's dilemma, with the businesses as the 'prisoners'.
One of the ways to change the Nash equilibrium for that game is for enough people to empower some outside agent that punishes defectors. (Metaphorically, for the original prisoners in the thought experiment, a gangland boss).
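To make that concrete, here's a toy sketch of how an outside punisher flips the equilibrium (hypothetical payoff numbers, nothing from the article):

```python
# Toy payoff matrix for a two-player prisoner's dilemma (made-up numbers).
# Each entry is (row player's payoff, column player's payoff).
# Strategies: C = cooperate, D = defect.
PAYOFFS = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_response(opponent_move, punishment=0):
    """Pick the move that maximizes my payoff, given an external agent
    that subtracts `punishment` from any defector's payoff."""
    def my_payoff(my_move):
        base = PAYOFFS[(my_move, opponent_move)][0]
        return base - punishment if my_move == "D" else base
    return max(("C", "D"), key=my_payoff)

# Without a punisher, defecting dominates: best response is D either way,
# so (D, D) is the Nash equilibrium even though (C, C) pays both players more.
assert best_response("C") == "D" and best_response("D") == "D"

# With a punisher whose fine exceeds the temptation gain, cooperating becomes
# the best response to anything, and (C, C) is the new equilibrium.
assert best_response("C", punishment=3) == "C"
assert best_response("D", punishment=3) == "C"
```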
To the extent that the current leadership of government and business is facing a collective action problem, it is because different actors are optimizing for different numbers of iterations.
Put differently, when it's #CrimeSeason, you gotta get yours before the bill is due, and different crooks are on different schedules.
But a business needs paying customers, preferably employed by someone else. Other businesses having to pay their employees is good for business in this regard.
The reality is that he couldn’t attract the right workers because factory work is soul-destroying.
I think it doesn't even pass the bar of an undergraduate research paper.
It’s a lot more expensive to pay out a securities fraud settlement if you don’t list it and then suffer a loss that can be pinned on ‘AI’.
I look forward to the eventual lawsuit against companies hiding their actual risks in the chaff. Do not use your groin to stop the chainsaw. (To any lawyers, this is the chaff).
Instagram was acquired moments before the IPO.
Also, the FB HTML5 play at the time was a decent bridge to the app play, which did of course explode growth.
So, to be fair, they didn’t fail; they acquired to grow even more.
I'd be curious about the position of these segments in the 10-Ks: if they're suddenly all at the top, that's much more interesting than being tacked on at the end.
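That would be easy enough to check mechanically. A rough sketch, assuming you already have each filing's Item 1A (risk factors) text as a string and settling for keyword matching instead of the report's LLM pipeline:

```python
import re

# Hypothetical keyword list; the report's own taxonomy is surely richer.
AI_TERMS = re.compile(r"artificial intelligence|machine learning|generative AI",
                      re.IGNORECASE)

def ai_mention_position(risk_section_text: str):
    """Return the relative position (0.0 = start, 1.0 = end) of the first
    AI-related mention within a 10-K risk-factors section, or None."""
    match = AI_TERMS.search(risk_section_text)
    if match is None:
        return None
    return match.start() / max(len(risk_section_text), 1)

# A value near 1.0 suggests the AI risk was tacked on at the end;
# a value near 0.0 suggests it leads the section.
```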
It's free to put any bad thing in the risks section of your 10-K; investors aren't going to shun your company over it. If, on the other hand, you fail to put the risk in, the bad thing happens, and your company loses value, you may get sued for securities fraud - and courts have been oddly receptive to these suits.
It's like any other clause that gets added to any other mostly-boilerplate legal document over time: one firm adds it, and pretty soon everyone copies their work and it's a standard term. It's viral. How fast this spreads among company filings is a matter of epidemiology, not something that actually tells you about the companies' outlook.
- https://en.wikipedia.org/wiki/The_Interpublic_Group_of_Compa...
- https://en.wikipedia.org/wiki/Omnicom_Group
- https://en.wikipedia.org/wiki/Warner_Bros._Discovery
- https://en.wikipedia.org/wiki/Fox_Corporation
- https://en.wikipedia.org/wiki/Paramount_Global
I'm sure there are a lot more.
What happens when anyone can write a script and get a feature-length movie or a 12-episode season?
Youtube, Facebook, Instagram, TikTok all beg to differ. AI slop farms are making bank over there, especially on short-form content aimed at little kids. They get hundreds of millions of views, earning more than what Disney/WB makes on some of their high-budget garbage movies in cinemas.
What is one specific example?
Hedge funds and investment banks are already using these tools to the max, and the markets are plenty profitable for everyone.
Everything is Applied Math if you squint hard enough.
All this AI/ML doomerism and boosterism is ridiculous. If you do not understand how gradient descent works or why as of today GPUs are better suited for model training compared to CPUs you should not have a say in this discussion.
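For what it's worth, the gradient descent being gatekept on here fits in a dozen lines; a toy sketch fitting one parameter to made-up data:

```python
# Minimal gradient descent: fit y = w * x to toy data by minimizing
# mean squared error. The gradient of the loss w.r.t. w is derived by hand.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 8.1]   # roughly y = 2x

w = 0.0          # initial guess
lr = 0.01        # learning rate

for step in range(1000):
    # d/dw of mean((w*x - y)^2) = mean(2 * (w*x - y) * x)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad   # step downhill along the gradient

print(round(w, 3))   # converges to roughly 2.0
```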
Most conversations around AI/ML appear to basically be pop-philosophy discussions that aren't even that grounded in philosophy fundamentals.
If you have trash fundamentals, you will have a trash understanding of the world.
I think this scenario is plausible because the path to it is so smooth that it will be the default outcome unless something strange happens to prevent it.
But now think about how most of the population can afford their lifestyle, i.e. buying stuff from S&P 500 companies: by selling their time as labor.
So saying AI is the biggest threat to the S&P is glossing over the root causes. Analysts, sucked into the marketing hype, are self-fulfilling in that some people act on their recommendations. After all, in the past, if a bank was questioned about how stable it was, it could easily see a domino of withdrawals that snowballed into it actually becoming unstable, even if it wasn't before.
> This report uses a range of cutting-edge LLM-assisted data techniques to extract key risk information from S&P 500 company filings. Following the recent boom in generative AI, we examine reported risks from these leading firms related to artificial intelligence. We clarify the extent to which firms are reporting new AI related risks, what kind of risks are being reported and what these indicate about the broader dynamics of AI in big business.
This is unrelated to marketing departments.
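The report doesn't publish its pipeline, but I'd guess the LLM-assisted extraction step looks something like this (a sketch assuming the OpenAI Python client and a made-up prompt; `risk_section` is a hypothetical variable holding one filing's Item 1A text):

```python
# A guess at what "LLM-assisted" risk extraction might look like; the report
# does not describe its actual prompts, models, or classification scheme.
from openai import OpenAI

client = OpenAI()

PROMPT = (
    "Below is the risk-factors section of a 10-K filing. "
    "List any risks the company attributes to artificial intelligence, "
    "and classify each as operational, competitive, regulatory, or reputational. "
    "If there are none, answer 'none'.\n\n"
)

def extract_ai_risks(risk_section: str) -> str:
    """Send one filing's risk-factors text to the model and return its summary."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",          # any capable model would do
        messages=[{"role": "user", "content": PROMPT + risk_section}],
    )
    return response.choices[0].message.content
```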
They want to mention AI to boost interest and such, and they end up mentioning AI risks to hedge and cover their backs.