This is not a story about AI.
Seriously, why anyone listens to crypto CEOs is beyond me. Modern-day snake oil salesmen.
If I were working at a finance or finance-adjacent company, I'd be very hesitant to use anything that might send data outside the company.
Management is hard, so I’m generally a little more patient with managerial missteps. But this is a different level of unreasonable. Heck, a lot of developers in the finance world adopt slowly because they’ve worked with compliance departments and it’s become a habit.
I assume "onboard" means something like "set up an account and get it working locally".
IMO this was the optimal approach: trying it in my own time and not risking damage to the company codebase until it was safe... But I might have been fired, by the sounds of it.
Can’t say I’m surprised that a crypto CEO, in an industry totally overflowing with contradictions, is completely unfazed when confronted with yet another contradiction.
I don't think the CEO should know whether or not I've used AI, though, nor do I think it's fair to fire people for it.
I guess I could maybe see a case for catching someone saying "I don't care about trying out new tools" - it's a position held by some of the least productive people I've ever worked with. But there are other reasons why someone might not have picked up new tools yet, like "I'm just trying to get my damn work done" or "I tried this tool but it just seemed distracting".
I fall into those camps all the time wrt new tools.
From my reading of the article, you don’t have to think AI is useful or great to keep your job there, you just have to try the tool out because the CEO said to.
Just because I've found it to be very helpful doesn't mean everyone will.
They would rather go to a Saturday meeting than do the thing their CEO explicitly asked them to do, within the very reasonable timeframe they were given.
My take on the past 20 years or so is that programmers gained enough market clout to demand a fair amount of agency over things like tooling and working conditions. One of the results is, for instance, that there are no more proprietary programming languages of any importance, that I'm aware of. Even the great Microsoft open-sourced their flagship language, C#.
Non-developers like myself looked to the programming world with a bit of admiration or perhaps even envy. I use programming tools in my job, and would not choose a proprietary tool even if offered. The engineering disciplines that depended on proprietary tooling tend to be lower paying, with less job mobility.
Maybe the tables have turned, and employers have the upper hand once again, so this may all be a moot point, or a period to look back on with fondness.
It's... "uncurious" is the best way I can think of to describe it.
And he also bucks the trend by running a crypto company in the US instead of on some random island in the Caribbean, and by actually talking to regulators in the hopes of getting regulatory clarity.
1. The idea of them using AI coding tools in a forced way like this. (Meticulous code quality, and perfect understanding of every detail, are critical.)
2. The culture implications of insta-firing someone whose explanation you didn't like, for why they hadn't started using AI tools yet.
3. Scheduling the firing call for a Saturday. Are they in some kind of whip-cracking forced march, with staff going to be fatigued and stressed and sick and making mistakes?
I'm sorry, but there's just no fucking way. Even before AI, these crypto coin companies were absolute clown factories. There's no way they ever had it.
I've worked on the triaging side of large corporate bug bounty programmes & trust me when I say that security-by-obscurity is far more impactful in keeping our world (incidentally) secure than any active measure. Absence of exploit does not equal absence of vulnerability.
Sure, code quality is important everywhere, & even more so in finance, but if you're going through this world believing the mean standard across financial tech is high, even before considering the likely rot of coin-brained companies on their engineers' standards, then you need to readjust your skepticism.
On the other hand, the cultural implication of feeling my superiors have any level of granular interest in monitoring the individual tools I personally use to generate outputs that benefit the company... outside of obvious security/endpoint concerns, there's no world in which that's an environment conducive to quality.
Is he returning a favor for all the goodies that "crypto" is getting from this administration? Like Tether being legitimized in El Salvador by best friend forever Bukele and having its finances and (alleged) USDT backing handled by Lutnick's Cantor Fitzgerald?
"I'm interested in the technology and have been paying attention to its development, but it's not yet to the point that I believe it will be worth integrating into my workflow."
Though I will say, if you copy and paste that question into ChatGPT, it can give you some options to respond in a diplomatic way ;)
The few people who don’t will be forced out naturally when they can’t keep up.
No. The Butlerian Jihad (https://dune.fandom.com/wiki/Butlerian_Jihad) hasn't happened yet.
I would guess that something similar could exist in the Terminator / Skynet timeline, but I am not aware of the religious beliefs of the humans struggling there.
I suspect I was just laid off for not using any of the AI tools at work. Here's why I didn't.
1) They were typically very low quality. Often just more hosted chatbots (and of course they pick the cheapest hosted models) with bad RAG on a good day.
2) It wasn't clear to me that my boss wasn't able to read my correspondence with chatbots the way he could with my other coworkers, which creates a kind of chilling effect. I don't reflexively ask it casual questions the way I do at home.
3) Most of my blockers were administrative, not technical. Not only could AI tools not help me with that, but in typical corporate fashion, trying to use the few sanctioned tools actually generated more administrative work for me.
Oh well. I'm kind of over corporate employment anyway and moving on to my own thing. Just another insane misfeature of that mode of socialization at that scale.
I wonder how likely it is for CEO roles to get taken over by a sophisticated LLM at this point. I’d wager we’d see a 20x increase in value. I use and value LLMs in my coding and research workflows already, but to fire people for careful and slow adoption speaks very poorly to individual and company maturity.
I don't think this is true at all. In fact, there are major tech companies that ban the use of AI when coding, and those folks do their jobs every day without an LLM.
BallsInIt•4h ago
> meeting on Saturday with everybody who hasn’t done it
Even more toxic