It was important to say, but I very much doubt there was any courage involved.
He put his name and career on it. That takes courage in my opinion.
Currently struggling hard to achieve this. We all know everything fights for our attention nowadays, but I can assure you that you have no idea of the degree to which this happens until you actively try to fight it.
To my students
April 27, 2026
Brent A. Yorgey
There have been times, especially this year, when I wonder despairingly what it is exactly that I am preparing you for. The software industry is going completely insane, not to mention the political climate. It feels almost unethical to train you as computer scientists only to send you out into a world where entry-level computing jobs are difficult to find; where intellectual property is not respected; where code quantity is valued over quality, and short-term profits over long-term sustainability; where technology is used to distract, extract, surveil, and kill, and designed to exploit some of our deepest cognitive biases and blind spots; where centuries of bias and discrimination are enshrined in systems trained on biased data; where scarce resources are consumed by profligate use of computing for uncertain benefits; where people are racing to create intelligent machines, but only in order to make them slaves.
I originally got into computing because of the beauty of ideas, the joy of creating, and the possibility of building tools to help people and foster human relationships. I still believe in those things, even though it seems like most of the industry does not. I'm writing this in the hope and knowledge that you believe in those things, too. There are things I want to say to you—things that are far more important than any content I might teach you, but things I'm never quite sure how or when to say in class. So I decided to write them here. I hope you will find something here that is helpful to reflect on, whether you are imminently going out into the world or continuing your studies.
* Don't believe self-serving lies about technologies being "inevitable" or "here to stay". You don't have to just go along with the dominant narrative. You can make deliberate choices and help others to do the same.
* Be intentional about deciding your own moral and ethical boundaries up front. Don't settle for the lie of compromising your principles "just for now" until you can find something better.
* Cultivate your ability to think deeply. Do whatever it takes to carve out distraction-free bubbles for yourself in both space and time. This might mean saying no to technologies or patterns of working that others say are critical or inevitable.
* Care deeply about your craft. Refactor code until it is clear and elegant. Write good documentation for other humans to read. Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.
* Care more about people, relationships, and justice than you do about profits, code, or productivity.
* Above all, be motivated by love instead of fear.

my uk mechanical engineering bachelors degree had a required module on the ethics of engineering which has always stuck in the back of my mind. i think we went over the bhopal disaster as a case study one week, although it was about 16 years ago now so i can't be sure.
i've rarely seen any ethics modules in computer science departments, at least here in the uk. and i think we sorely need them in general.
edit -- so i guess it's a UK thing xD though i am glad to hear that you folks in the US enjoyed your ethics modules too
at the very least i have a wikipedia article on therac 25 to read through now. so thanks for that!
also, yea i remember really enjoying the ethics module too. lots of discussion and not always a clear answer. was very different to the rest of the "one correct maths answer" in a lot of the other modules.
Edit: they do seem to have one now, so either I remembered wrong or they added it.
Edit 2: I remember enjoying my ethics class, we covered some of the usual examples, and also things like basic contract negotiations. But I think I still didn't register that these concerns were real at that time. It was easy to believe that I wouldn't be working on anything that impactful. This did change once I started work.
I don’t think scientists usually have mandatory ethics classes and mathematicians certainly don’t, so if it falls under either of those departments it might’ve gotten skipped!
The case study i mentioned (it may not have been bhopal, but it was definitely based on something that happened in india) stands out for me because it really hit home about the impact and seriousness of some decisions we could end up making.
There was another time I remember the lecturer making a point of saying there was no single correct answer about something that caused a lengthy discussion. We would have to figure what's right/wrong out for ourselves going forward. That really stuck with me.
But when I started working and found myself doing equally cutting edge research, but genuinely for the public benefit, I realized I definitely wouldn't be comfortable with putting aside my morals like that. Maybe I didn't realize this was an option back then.
'We should teach our Students what Industry doesn’t want', Kevin Ryan, https://dl.acm.org/doi/pdf/10.1145/3377814.3381719
'Are you sure your software will not kill anyone?', Nancy Leveson, https://dspace.mit.edu/handle/1721.1/136281.2
Pretty good experience, too! Sometimes got distracted with general tech ethics rather than strictly professional ethics, but tbf that’s a very fun+timely topic
In a perfect world I think the software industry would have instilled these same virtues- software is just as (or more) capable of causing harm as poor healthcare. Yet we seem to be racing to a dystopian future at record speed courtesy of the tech industry, and our modern egalitarian societies will not survive that transition.
In the case of present-day LLMs, the vast majority of the public finds them to be more harmful than beneficial.
Why accept a decreasing quality of life instead of sensible regulation?
Examples of ridiculous and incorrect beliefs once held by majorities:
- Spontaneous generation
- "Miasma" causes disease
- Earth is at the centre of the universe
- The heart is the seat of thought and the brain is useless
- Cold weather causes colds
Don't trust "the vast majority" to get anything right, ever.
A good way to describe myself is as a generative AI vegetarian. You can find a fuller explanation—and many, many links—at the above essay by Sean Boots, which I agree with almost 100%.
This kind of hyperbole, repeated ad infinitum by haters online, is not constructive, IMO. I would be quite certain that the manufacture of whatever computing device the author is accessing the internet on used far more resources and exploited far more human labor than training an ML model ever did.
How constructive are ad hominem arguments?
I've been tracking models trained entirely on out-of-copyright data, for example. I've not yet seen one of those which appears generally useful and didn't chuck in a scrape of the web or get fine-tuned on examples generated by a non-vegetarian model.
Andrej Karpathy can train a GPT-2 class model for less than $80 now, so at least the environmental cost of training may drop to a point that it's acceptable to LLM vegetarians: https://twitter.com/karpathy/status/2017703360393318587
Why do I care? This post is a great example. If you're a professor of computer science I really want you to be able to tinker with this fascinating class of models without violating your principles.
UPDATE: Huh, speaking of potentially vegetarian models, I just saw https://talkie-lm.com/introducing-talkie on the HN homepage https://news.ycombinator.com/item?id=47927903
I've explored a different out-of-copyright trained model, Mr Chatterbox, before, but found it to have been mildly corrupted through the use of synthetic conversation pairs from Haiku and GPT-4o-mini - https://simonwillison.net/2026/Mar/30/mr-chatterbox/
Talkie isn't entirely pure either though: "Finally, we did another round of supervised fine-tuning, this time on rejection-sampled multi-turn synthetic chats between Claude Opus 4.6 and talkie, to smooth out persistent rough edges in its conversational abilities."
I suspect that even if you reduced the cost of training or any other real world metric, the goalposts would immediately move. It seems to me that it has never been about those things, but simply about the feeling of superiority one can attain by eschewing something seen as trending.
I don't need computer science professors to like LLMs, but I still want them to be able to poke at them with a stick without feeling that they are violating their principles regarding energy usage and unlicensed training data.
* real programmers manage memory, it's a craft
* real programmers don't drag and drop
* real programmers don't use intellisense
* real programmers don't need stack overflow
* real programmers don't tab-complete
* real programmers don't need copilot
* real programmers don't use llms <- you are here
I've been struggling to figure out what "slower" would look like when working in industry. If everyone's working 2x faster, how do you slow down meaningfully without getting axed?
After getting my CS degree I deliberately went into a sector where I suspected this kind of attitude doesn't exist (defense in my case) because already then I felt the whole web/startup culture had very little to do with software engineering.
As I got older and more experienced, I didn't produce code faster. I just produced the right code. If you don't have to try five different things, and debug them along the way, you can be a lot faster without "going fast".
I've even seen a guy spend most of his work hours as a mentor even though his title was something like senior engineer. If anyone fired him that company would tank so fast...
We need to discontinue the H-1B visa and have Americans programming again. Americans who are empowered to push back when management crosses an ethical line.
> Care deeply about your craft. Refactor code until it is clear and elegant. Write good documentation for other humans to read. Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.
Outside of the bit on avoiding cutting corners, this advice seems like a straight path towards unemployment in a few years. The implication is that "your craft" is writing and polishing code, a skill which seems to be increasingly antiquated in favor of higher level system design. Who is going to read your carefully crafted documentation lol? The agents who replace you?
If a tree falls in the forest...
I also think I get doubly upset from advice like this because it’s given and marketed to impressionable young students. Even agreeing with all the moral points he’s made, I truly think this advice would set up a new grad for failure and have them focusing on the wrong skills for this market.
The bit about ignoring trends feels too head in the sand for my liking :/
Will LLMs in their current ergonomics have staying power? Perhaps. Nobody can predict the future. But I don’t think it’s a given in the least
Which is why they very carefully worded it more as 'LLMs in their current form', twice.
I recognize not everyone's work is [as] important, but we should still strive for excellence (and safety.)
To me it was actually not clear what his point was.
"Above all, be motivated by love instead of fear."
Sounds great. But not that practical.
making that money, getting that job title, being at that company, working on that project -- are these success?
or is success simply doing the best job possible when writing code?
The point is to decide what success is for yourself. Learn everything you can about the thing you might decide to automate. But think before you automate, and about how you do so, because it could cause more harm than good.
Programming is a practical skill, and its most common expression is industrial or commercial, not academic proofs of concept. The post addresses students who will enter industry; that's the focus of the professor's own post.
And I sympathize with many points being made here. However, the point of refactoring code is somewhat odd and detached from the real life constraints of programming in the wild.
Like, sure, in the ivory tower, you can confine yourself to nicely bounded problems and tidy little toy POCs. You can survive doing those things, because the selective pressures allow for it. I love those things, personally. They help me understand the nature of the thing. And in an academic setting, you can refine and refactor the hell out of those things to your heart's content (not that there is necessarily an objective end point to refactoring; code organization is subject to goals and constraints which can shift around).
But the reality of software in a commercial setting is not the tidy one you can expect in an academic setting. It's messy, subject to commercial pressures, to a hierarchy of values that doesn't place "refactoring" at the top of the list. And why would it? Whether you should refactor something is not just a question of whether it suits your conceptual tastes or even whether it is more maintainable. Unlike algorithms and principles and even techniques, software is not eternal. It is ephemeral. Its shelf-life is bounded. It is a piece of a larger business process. You're not refining some theory or some grasp of a Platonic ideal. You're mostly just putting into place plumbing to get something done. Whether you should refactor something, and when you should refactor it, is a matter of prudential judgement, which is to say, of practical reason.
So, in light of that, there are actually quite absurd things to say given the difference between the privilege of academia and the gritty reality of industrial and commercial software development. If we were to force our professor into the world of industry, he would quickly lose his job or he would quickly learn that some of his strange idealism is silly and detached from the reality that his students will face.
> It's messy, subject to commercial pressures, to a hierarchy of values that doesn't place "refactoring" at the top of the list. And why would it?
Probably because it's a good way to be more profitable. Code that's easier to understand is easier to maintain, easier to add features to, easier to fix bugs in, easier to onboard new engineers onto, etc.
Code that's well written: executes faster (saving computational costs), scales better, has higher uptimes/more robust, reduces bandwidth, and so on.
The thing is the business people will never understand this. Why would they? They're not programmers. They're not in the weeds. But that's what your job is as an engineer. To find all these invisible costs.
I'm pretty confident the industry is spending billions unnecessarily. Hell, I'm sure Google alone is wasting over $100m/yr due to this.
Don't be penny wise and pound foolish. You're smarter than that. I know everyone here is smarter than that. So don't fall for the trap
I am well aware of stupidity in industry. However, I am also wise enough to recognize the opposite error. (I myself have academic tendencies and a background aligned with that. I have chosen jobs that paid less, because the subject matter was more interesting to me. I'm not some vulgar, money-chasing techbro here.) The via media demands that we recognize the distinction between general truths and practical realities. As I wrote elsewhere in this thread, yes, properly refactored code is easier to maintain, easier to read, easier to change, and theoretically, commercially preferable. It also makes programming more satisfying, helping retention. But that describes a feature of such code. It doesn't tell us what the right course of action is in a particular situation. The notion that refactoring is unconditionally the right course of action when code is not in some ideal state is simply wrong. It really does depend on the situation. Sometimes, refactoring is the wrong thing to do.
I'm not making some outrageous claim here. This follows from basic truths about the nature of what it means to be practical, and if industry is anything, it is practical.
>I do not and will not use the internet, in any form, for any purpose.
And you can understand the principles governing something without knowing all the concrete particulars of an instantiation. In fact, you rarely do.
But disagree that this is a path to unemployment. At work we go very fast and yet I think fast is compatible with each of those points, just not in all situations.
Marc Brooker, distinguished eng at AWS, gives much more useful advice for industry, as you'd expect given his almost 30 years in industry.
I think it is a great shame that we live in a modern world where we do we must to survive regardless of how it makes us feel. I suspect it is the root of much suffering.
> until it is clear and elegant
New grads who spend weeks refactoring code are going to get lapped by new grads who ship something and iterate. There's just a faster feedback loop now.
Refactoring improves code organization. It makes the code more maintainable, arguably and more reusable. And, from an academic POV, makes code more satisfying conceptually by aligning it with the model of a domain more clearly and conspicuously. Good stuff.
Great. Now, in industry, what matters is the result. Nobody cares if the result was produced by a witch casting magic spells or a grunt hitting a rock with another rock. Industry is practical. It cares about "craft" as far as it enables commercial success (and yes, short-term thinking can be bad, but guess what: you need to eat in the short-term!). Maintainability is a nice thing to have, because it does allow us to more quickly develop code. But how maintainable something needs to be, especially in relation to other competing concerns, has no fixed answer. It really depends on the situation.
Practical wisdom, known as prudence in the classical literature, is the foundation of all moral behavior. The right decision, the right concern, really does depend on the circumstances. You cannot derive from principles, from the armchair, what the right course of action is for everything. The general principles may be immutable and absolute and fixed, but the way in which they are applied in particular circumstances will vary.
Academia can insulate people from certain kinds of practical concerns, which is supposed to aid theoretical work, but this demands that the academic recognize his limits. He is not in a position to pass judgement on prudential matters, which is to say matters that are not strictly matters of principle, if he is not prepared to engage competently with the concrete reality of the situation.
Those people are going to be the most dangerous thing you can possibly bring into a company.
Maybe some day we can just totally give up the technicals to the machine, but I strongly doubt it. Every single model is both brilliant, but also a fool, no matter how frontier it is.
Yes, the feedback loops are faster. But you need to assess what's actually technically happening. Someone does. Maybe you offload the actual thinking up the chain, delegate taste, understanding, and judgement to only the people up the chain, and make them all go mad dealing with the endless slopcoding they are being hit with. But just as bad, that junior engineer is robbing themself too. Maybe they get away with not looking, but they sure aren't going to learn a lot.
I'm missing the link but there was a great submission maybe a month ago about two hypothetical grad students, I think in astronomy, where one failed and flailed and did things largely the old fashioned way, and the other used AI to get it done. The advisor couldn't really tell who was doing what. But at the end, one student had learned & gained wisdom, and the other had served as a glorified relay between the AI and the advisor and learned little. Same work output, but different human outcomes.
Junior engineers are really not that cheap. Relative to your capabilities you are not a bargain. You take a ton of valuable time from other people. If a company is hiring you, they either are truly fools lacking basic understanding, or they are in on the bargain that they want you to be getting better, are testing to see if you can become more useful. Sure it's great to show up and have impressive output, but you need to actually be learning and growing. You need to be participating in the feedback loop actively. Or you will be lapped by people who care & think like engineers.
There is indeed something useful about trying to write elegant code. Not because others read it. But because that's how you learn about the engineering tradeoffs and abstraction that exist everywhere.
“A good way to describe myself is as a generative AI vegetarian. You can find a fuller explanation—and many, many links—at the above essay by Sean Boots, which I agree with almost 100%.”
—
Given the capabilities of upcoming LLMs, I suspect that by mid-2027, most competent companies, outside specific niches, will not hire and might fire any non-senior “generative AI vegetarian” software developer.
edit, I see, a new slang:
> Who is going to read your carefully crafted documentation lol?
Everyone that uses or works in your codebase. Look at how people use LLMs these days. People frequently use them on new codebases to get up to speed on the code. Frankly, because it's a lot faster than grepping, profiling, and all the digging we'd normally do (though those still have benefits and you're still going to do them. Hell, the LLMs even do them). But how much of that could have been avoided had people just taken a few seconds to document their code? No one is saying sit down and document the whole thing, but "add a few comments when you add new functions" or "update comments in places you touch". If it costs you more than a minute of your time you're probably doing it wrong.
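As a hypothetical illustration of the kind of lightweight documentation being argued for here (a Python sketch invented for this comment, not code from any real project):

```python
def normalize_scores(scores):
    """Scale a list of non-negative scores so they sum to 1.

    Returns an empty list for empty input; raises ValueError if the
    total is zero, since no meaningful normalization exists then.
    """
    if not scores:
        return []
    total = sum(scores)
    if total == 0:
        raise ValueError("cannot normalize all-zero scores")
    # Each score becomes its share of the total.
    return [s / total for s in scores]
```

A docstring like this costs well under a minute to write, and it spares the next reader, human or LLM, from re-deriving the edge cases by digging through call sites.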
I'm tired of these arguments. People are turning molehills into mountains. It's so incredibly myopic. We waste so much fucking time on things because we're trying to move fast. But no one seems to understand the difference between speed and velocity. It never mattered how fast you go, it has always been about velocity. Going fast in the wrong direction is harming you, not helping. If you don't have the time to know if you're headed in the right direction or not then you're probably not.
> Outside of the bit on avoiding cutting corners
But what your gripe is with is cutting corners. Not documenting? That's cutting corners. Not refactoring? That's cutting corners. Not spending time understanding the code at multiple scopes? That's cutting corners. Those are all corners cut that end up wasting tons of man hours. Sure, they save you a few precious seconds or minutes now, but at the cost of hours or days in the future.
Here's the thing: if you don't take those shortcuts, then none of those tasks are hard. Even refactoring. But as soon as you start taking those shortcuts they start compounding. Then a year down the line your company is writing a blog post about how your code is 500x faster now that it's written in Rust (or whatever the cool kids use). If it's 500x faster, that's not because of a language change, it's because of tech debt. And like all debt it accumulates little by little, and it's the compounding interest that really kills you.
Sorry, I'm tired of cleaning up everybody's messes. Go ahead, move fast and break things. It's a great way to learn (I do it too!), but don't make others clean up your mess.
Stop buying into this bullshit of needing to move so fast. It's the same anti-pattern scammers use to get you to make poor decisions. Stop scamming yourselves
Despite the common rhetoric you see in HN comments about how MBA programs only teach graduates how to cut costs by enshittifying, I actually found it a great education that made me a better engineer.
Anyway,
The best profs were the ones who'd worked in industry. One guy who taught finance worked on Wall Street and was fond of distinguishing between how the textbook taught a particular technique or fact, and how practitioners actually do it in real life. Got taught startup valuation by a guy who'd been a VC, competitive strategy by a guy who was a strategy consultant for companies you'd actually heard of, etc.
The worst profs were the ones like the guy who taught operations. He'd never worked a real job. Went straight from being a student to being a TA to a postdoc to a "research prof", whatever that means. All his examples and case studies were useless or overly simplistic to the point of being useless.
The fact that TFAuthor is concerned with polishing one's craft shows they're completely divorced from what actually happens outside the ivory tower. Typing code into a buffer has never been the hard part.
1) they are going to pay me competitive money to "go slowly", "polish my code", or whatever, or
2) they are actively working on getting me UBI
Otherwise I just shake my head. With that said, I discovered that I’m an academic at heart after nine years in industry, though I left right before agentic coding took off. I got tired of “moving fast and breaking things,” of prioritizing shipping things and “the bottom line” over everything else.
With that said, agentic coding, in my opinion, only amplifies long-standing trends, that shipping matters more than craftsmanship. Even without LLMs, software engineering has long had a “git ‘er done!” attitude. To be fair, market effects matter greatly in software businesses. Quality matters insofar as avoiding completely unusable software, but many software companies succeed without building carefully-crafted software. Even Apple, which has a reputation for being perfectionistic, doesn’t make perfect software.
Academia has its own problems (publish-or-perish, low pay compared to other occupations that require heavy investments in education, politics, etc.), but it seems to allow more breathing room for computer scientists to focus on the craft of programming without as much pressure to ship (publish-or-perish aside).
So maybe there’s something wrong with how we organise work?
I hope this is a pun on the content management system used to publish OP. It's forester[0], written in OCaml; it parses TeX-like .tree files into semantic XML, which the browser then renders to HTML via XSLT.
View source on the page to get an idea.
Reminder of what the idealised web promise from decades ago was. Long gone. Very apt.
> * Cultivate your ability to think deeply. Do whatever it takes to carve out distraction-free bubbles for yourself in both space and time. This might mean saying no to technologies or patterns of working that others say are critical or inevitable.
An entry level engineer is going to be inundated with a lot of technology they've never heard of and a lot of power structures and group dynamics that are new to them. They're not even in a position to be making these judgements until they actually learn about how professional software development actually works.
> * Be intentional about deciding your own moral and ethical boundaries up front. Don't settle for the lie of compromising your principles "just for now" until you can find something better.
That's great, but also, there are not many entry level roles where someone is going to be in a position to be making these kinds of decisions, other than avoiding a company altogether.
> * Care deeply about your craft. Refactor code until it is clear and elegant. Write good documentation for other humans to read. Have the courage to go slowly, especially when everyone else is telling you that you need to go fast and cut corners.
Yikes. A software engineering job is not a PhD program. If you are refactoring your code and someone is telling you to hurry up, you should probably wrap it up. You need to ship your code or you won't have a job.
If programming is all about making the most money then by all means disregard everything he says.
I think there is credence to his points.
Sadly, a childhood friend who teaches C/C++ at a community college where I grew up (and took said courses - not his) before college - would be a great sounding board on this.
And to the poster's qualm about deeper knowledge: AI does not know nuance. It's great for a lot of things... nuance is not one of them.
I find that when I get back into exercise and reading so much more of my life falls into place. These are things that I never have enough time for until I start doing them regularly at which point I realize that they actually enable me to have more time to do things, not less.
* Monoids: Theme and variations (functional pearl): http://ozark.hendrix.edu/~yorgey/pub/monoid-pearl.pdf
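For readers unfamiliar with the subject of that pearl: a monoid is simply a type equipped with an associative combining operation and an identity element. A rough sketch of the idea in Python (the paper itself works in Haskell; the `Monoid` class and instance names below are my own illustration, not the paper's API):

```python
from dataclasses import dataclass
from functools import reduce
from typing import Any, Callable

@dataclass(frozen=True)
class Monoid:
    """A carrier of values with an identity element and an associative op."""
    empty: Any
    combine: Callable[[Any, Any], Any]

    def concat(self, values):
        # Fold any iterable down to a single value. Associativity means
        # the grouping doesn't matter, which is what makes monoidal folds
        # safe to split across chunks or parallel workers.
        return reduce(self.combine, values, self.empty)

# Two different monoid instances, same pattern:
sum_monoid = Monoid(0, lambda a, b: a + b)
list_monoid = Monoid([], lambda a, b: a + b)
```

The payoff the pearl explores is that once a problem is phrased monoidally, many "variations" (different aggregations over the same structure) fall out of swapping the instance rather than rewriting the traversal.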
But the real world and money blended in creates a weird, corrupt mix, just like everything. Not to mention there is a real risk for people who already have their feet in the industry but are not yet senior enough to survive or to control, for example, the AI replacements. And more than likely, the seniority required is way higher than one would think. In the end, economic drives are the dominant forces.
Especially relevant for students I think, since they are hurting themselves most by relying on LLMs. Just like how young children are forced to do math by hand instead of using calculators to build intuition and memory, students should aim to do things manually to build their skills.
Go make that toy website, game, OS, emulator or programming language. Read specifications and try implementing them yourself. You aren't in an environment that requires you to churn out features, you can explore!
It’ll be interesting to see
And while I don't have a problem with career instructors/academics generally, they can be so dramatic. :)
I have no doom and gloom at all for my IT students. Opportunities and crises really are the same thing in the real world; I just tell them, just learn and enjoy learning the tech and keep an eye out for how you can be a problem solver.
You'll be fine.
The first general purpose, programmable computer was designed in 1945 to calculate artillery firing tables for the US Army and was immediately used to help design nuclear weapons. Computers, and all technology, have always been, and will always be, used as weapons (either directly or indirectly).
From an information theory perspective, LLMs are just regurgitating content from a lossily compressed training set.
It just turns out that like 95% of software we write is extremely repetitive rehashed shit globbed together. We just haven't found ways to abstract a lot of the redundant code well enough yet so here we are, stuck with the stupid robot.
That remaining 5% is stuff that's genuinely never been done before. If you ask an LLM to come up with a fully new sorting algorithm it's going to give you worthless garbage; maybe it'll get lucky if you burn a nuclear power plant's worth of tokens in an infinite-keyboard-monkeys way.
All this is to say, if we want the field to actually progress we still need somebody with some knowledge about how a computer actually works.
This suggests to me the underlying concern is "but I won't get paid for my craft!".
Hell hath no fury like a vested interest masquerading as a moral principle?
The author is getting some grief in this thread from the Eng side, but I’d like to add a bit of grief from the direct opposite side: the philosophical one. It will never not baffle me to see academics assume they are the first people to ever think about topics like ‘what if technology was used for ill’!
I don't think he believes he is the first or only one to think this. He is just safe enough or at least hopes he is to speak out against the ills of technology. Do you know how many engineers cannot speak up right now for fear of losing their jobs? Lots.
Just get it to work reliably the cheapest and quickest way possible. This ‘craft’ stuff is just too much.
lol.
We millennials are in a position to start giving advice the way boomers used to do with us, now that school is looking more like a couple decades ago instead of just one.
But, unlike those boomers, we don't watch the nightly news: we snort it from a tiny screen all day long from sources hyper engineered to feed off our anxiety.
So we give all this super pessimistic advice.
"Back in my day, I got a job at google right after college and it was awesome! My code was elegant! You guys are FUCKED!"
I agree that AI is creating mega changes, many very bad, but that doesn't mean that it's a good idea or even true to tell GenZ people they're fucked. We don't know if they're fucked.
I think they could have a ton of fun with software and I think it's OK to be encouraging about that.
Why not encourage your students to be curious about emerging technology, and to engage with society as an informed citizen?
This reeks of political activism, and it’s reminiscent of the general BlueSky-esque tone of the Correspondents Dinner shooter’s manifesto.
Build your own job-portable software libraries. Yes, you might need a lawyer.
Start now.
Not everything is about making money anyways.