Instead I find myself more concerned with which virtual machine or compiler tool chain the language operates against. Does it need to ship with a VM or does it compile to a binary? Do I want garbage collection for this project?
Maybe in that way the decision moves up an abstraction layer, the same way we largely moved away from assembly languages and caring about specific processor features.
Personally, I think statistics like this are biased towards the median of the past few decades and do not necessarily tell much about the future; other than that things apparently move very slowly and people are mostly conservative and stuck in their ways.
Cobol is still in that list. Right above Elixir, which apparently is a bit of a niche language. Kotlin has only been around for about 15 years, and the 1.0 release was actually only nine years ago. Java was released 30 years ago and it's been dominant in enterprise development for 25 years now. So, no surprise that Java is nearer to the top.
Python is surprising but it's been around for quite a long time and gained a lot of popularity outside the traditional computer science crowd. I know biochemists, physicists, etc. that all use Python. And it's a great language for beginners, obviously. It's not so much that people switched to Python but that it is driving the growth of the overall programmer community. Most new programmers use Python these days, and that explains why it is #1.
Javascript has had a virtual monopoly on basically anything that runs in a browser, which is of course the most popular way to distribute code these days. Especially since plugins were deprecated and things like applets, Flash, etc. disappeared around fifteen years ago. Anything that ran on the web was either written in Javascript or transpiled/compiled to it. WASM is starting to change that, but it's early days.
What the past 25 years tell us is that things definitely change, but very slowly. C++ still outranks Javascript. That's because Javascript is mostly used in browsers; it's a lot less popular for other things.
I like Kotlin, so I'm biased. But it's obviously not the most popular thing by a long shot. Then again, popular doesn't mean good. I actually like Python for small, unimportant things. But I reach for Kotlin if I need to do it properly. I used to reach for Java, but Kotlin simply became the better tool for the job; at least for me. I even prefer it over Typescript, and I do occasionally use it for web frontend development. The transpiler is pretty good. And there's a WASM compiler too, and Compose for WASM just entered beta. Kotlin seems future proof and it seems to be growing into wider adoption. There are a few million Kotlin programmers around, by JetBrains' count. It's not nothing.
But you can easily have both of them in the same project (e.g. when slowly moving to Kotlin) and have them interop.
Only up to C90, and even modern C++ doesn’t fully implement modern C.
Any valid JS is valid TS.
That one is perhaps more interesting from an industry/jobs trend perspective whereas the TS vs JS trend is also interesting on its own.
Ooh, then JS&TS are not number two!
The tooling and ecosystem aren’t great compared to some of these languages, but Java itself can be pretty damn good.
I used to be a hater many years ago but I’ve since grown to love it.
One of my complaints with Gradle is that if you write a plugin (Java) it shares the classpath with other plugins. You might find some other plugin depending on some old version of a transitive dependency.
People have their problems with Maven, but unless it's some overly complicated legacy project (where npm just explodes, I guess? I've had Windows machines freeze from deleting the node_modules folder), it just works and you just give it a list of dependencies.
An example of this ossification of understanding is how people still think dotnet is Windows-only, because they stopped caring before Core/modern dotnet became a thing.
I too would like some illustration of why the tooling (Intellij, etc) is insufficient. Maybe gradle as a build system? Although I have to say with LLMs, gradle build scripting is a LOT easier to "build" out.
One thing is for sure: don't get tied down to one language because it is popular today. Go with whatever makes sense for you and your project.
Not really, Python got popular in the 2000s because it was the only sane one of the three choices: Perl, Python, Tcl. You must be young.
Python has an agenda as well. Guido has said multiple times it was a language designed for teaching programming, which is one of the reasons the Zen of Python came early on.
Also, around the mid-'00s it started replacing Perl as the Unix scripting language of choice.
Python isn't popular because of LLM's. Python is used for LLM's because it's popular. You can replace LLM in that with dozens of other labels and it's still true.
>And Python got popular cause of LLM AI thing.
Python got wildly, maybe exponentially more popular because of LLM AI thing (sic), but only in the last few years.
There, fixed that for you.
Long before that, Python was already quite popular (although it was slow to take off initially), and was used in a lot of areas, including web dev - Django (Disqus, Instagram?), Flask, TurboGears, Pyramid, Tornado (FriendFeed), Zope, Plone, and many more web frameworks, and apps built on them - plus PDF generation (ReportLab, more), scientific programming (Numeric, NumPy, SciPy, more), data science, ActiveState, system administration (e.g. on some Linuxes, at least Ubuntu, IIRC), and even GUI app development (PyQt, Tkinter, wxPython). I read somewhere, quite a while back, that the Dropbox GUI clients on both Windows and Linux, maybe on macOS too, were written using Python and wxPython.
Google some of those project names, and see the start dates of those projects. That will give you a clue about how long it has been in use.
Google was using Python a lot from many years back, on the front end of its web properties, apart from other uses that I would not know about.
And tons of startups and corporates have used Python for a long time, and still do.
I know some of these things because I have worked with Python for a long time, starting with light use at v1.5 and heavier use from v2.0 or so.
It can't go on forever, but as far as I can tell the corporate usage started not long ago. You'll have Rust jobs, but they'll be the same shit as Java jobs. There was a study done like a year ago that showed an across-the-board decline in developer satisfaction with all the "new and shiny" JS frameworks. I 100% think that when companies inevitably start hiring people to maintain those legacy Rust "internal sideprojects", the same will happen. Technologies get vibe-checked when a workforce not driven by passion, people who complete tasks and features instead of doing the "provably correct thing", has a go at them. We will see which way it goes.
Companies using C, C++ or even Go probably are less keen on switching stacks for the sake of it.
LabView is a kick in the pants...
I'd wager it is the installed base keeping LabView on life support. =3
Yet, the modern licensing model jettisoned a lot of their long time proponents. Like many things in life, most problems were non-technical in nature.
SpaceX is an interesting company, and it made some unique design choices. =3
Generally, IT and Engineering agreed to deprecate their product lines off critical systems about 2 minutes after the deal went through. =3
It comes with device interfaces (not exactly drivers, but sometimes it has drivers too).
And the house already used labview the last time that happened.
My favorite Julia also made the list this year... nonzero users means there is hope for fun languages yet.
With the new Intel+NVIDIA RTX SoC deal, we can expect Python and C++ to dominate that list in the next few years. =3
After working with Node and Ruby for a while I really miss a static type system. Typescript was limited by its option to allow non-strictness.
Nothing catches my eye, as it's either Java/.Net and its enterprisey companies, or Go, which might not be old but feels like it is, by design. Rust sounds fun, but its use cases don't align much with my background.
Any advice?
Rust is also a general purpose language, there's no reason you can't use it for just about any problem space.
I guess when you're working at Jane Street, use only their core lib and you get some proper onboarding, things could work great. However you're programming very much in a niche community then.
You might have trouble finding small companies using anything but JS/Ruby/Python. These companies align more with velocity and cost of engineering, and not so much with performance. That's probably why the volume of interpreted languages is greater than that of "enterprisey" or "performance" languages.
I've heard about Java initiatives to improve it, but can you point to examples of how Java "is maturing into a syntactically nice language"?
I'm tempted to learn it, but wonder whether it would really become nice enough to become a 'go-to' language (over TS in my case)
Check jbang.dev, and then talks by its author Max Rydahl Andersen. That could be a starting point.
Also, a very wide-reaching standard library, good enough type system, and possibly the most advanced runtime with very good tooling.
Here are some actual improvements:
- Record classes
public record Point(int x, int y) { }
- Record patterns
record Person(String name, int age) { }
if (obj instanceof Person(String name, int age)) { System.out.println(name + " is " + age); }
- No longer needing to import base Java types
- Automatic casting in instanceof checks
if (obj instanceof String s) { /* use s directly */ }
Don't get me wrong, I still find some aspects of the language frustrating:
- all pointers are nullable, with only annotation support to lessen the pain
- the use of builder class functions (instead of named parameters like in other languages)
- having to define a type for everything (probably the best part of TS is inlining type declarations!)
But these are minor gripes.
What you get is either really old (Java 8 stuck on something nasty like WebLogic), or companies running either cutting edge or LTS.
Let's be honest, your realistic options for work are Java, C#, C++, and, depending on the industry, Swift, Go, Kotlin, Dart and Rust.
I will hire the one that talks about the joy of FP and/or static typing ANY DAY over the programmer with only JS experience and no visible interest in looking beyond it.
But use the best tool for the job. Ecosystem matters. What are you planning to build?
But is it for a general audience? (Can every Py/PHP/JS/TS/Java/C# dev become productive in it quickly?)
Also: if you want quick (re)compiles on a larger codebase, Rust is not for you.
Also: good you use Rust in teaching!
But I want a fast on-ramp, quick iterations and clean looking code. (and went with Kotlin because of that -- I like Rust more myself, but I have a business to run, so tradeoffs)
Plus it's a fun language to write. Some people say it's a nicer, higher level Rust.
I also like the look of Kotlin but I've never used it. I think Kotlin and Swift are the two premier modern multi-paradigm languages.
I've since moved to Rust and have not looked back. Importantly, rust-analyzer runs circles around the Swift VSCode plugin (or Xcode for that matter)
Though I must admit it's hard for me to imagine using anything other than Rust for 90% of projects.
I really do wanna try Kotlin at some point as well. Rust, Kotlin, and Swift feel like the future of languages to me.
It also needs to target GNU/Linux, because Apple got rid of their server offerings; thus anyone doing server code for applications in the Apple ecosystem who wants to stay in a single language needs to be able to write such software on GNU/Linux with Swift.
As for Windows: since they have the open source story, it kind of follows from there as a complement.
On the revamped website they are quite clear about the identity.
Cloud Services, CLI and Embedded as the main targets for the open source version.
I played with it a long time ago and the IDE (eclipse plugin?) was a bit of a mess, sbt was prone to weird issues and lock ups and the compiler was pretty slow.
Very fun language tho!
Then they decided to focus the remaining effort on Scala Metals, which is based on LSP protocol.
In both cases it is alright for Scala 2, there are some rough edges for Scala 3.
I did that and highly recommend it, at least for new projects - not sure that it's worth the effort of porting an existing project, but ScalaJS just works and I found far fewer rough edges than I expected.
Intellij tools are great.
Some Fintech companies might go a bit outside the norm and allow for Haskell, F#, Scala, which they tend to use as DSLs for some workflows.
Then, if you're into array languages, banking and fintech is one of the few domains where they have managed to stay around, but those positions seem hard to get.
Dyalog (APL), J, BQN, Kdb+ (Q)
Since we are at it, faster as well.
As this is, in fact, tacked on, I guess you don't really get much benefit unless you rewrite all your code with these still non-standard annotations.
It does not seem like pointing JSpecify at a standard Spring Boot codebase will do anything for you. Whereas pointing Mypy at a standard Python codebase with type hints that conform to the language spec will do something for you, as null safety is not tacked on but baked in for Python type hints.
Also, the syntax seems awkward and clunky as heck, but I guess for something as essential as null safety it's worth it.
---
From JSpecify, I found the Checker Framework, which also seems really interesting and looks like it supports checking more kinds of annotations than just the JSpecify ones. https://checkerframework.org/
That is, if you wrote your codebase with type hints, just like if you wrote your codebase with JSpecify annotations.
And the type hints and behaviour are specified as part of Python, so it's kind of baked in. It's just that the actual type checker is not part of CPython, though Mypy is considered the reference implementation.
We have quite a large code base and very few `# type: ignore[code]` directives.
Some third party code still has poor type hints, but for the most part it's fine, and we get null safety which you don't get for Java.
The only issue is that some libraries (not many!) are still untyped, so you may need to write wrappers or stubs occasionally. But that applies to a small and decreasing minority of libraries.
I only wish Python were faster and had ahead-of-time binary compilation.
Typescript (via Deno) is still a better option IMO.
As the OP, it's not even the language for me but what it implies about the companies that use it.
It's a non-starter for startups/scaleups and strongly associated with oldish companies and consulting firms, which in turn translates to far worse working conditions (not remote-friendly, overtime, dress code, etc.).
Mind that it might just be a local culture thing, your mileage may vary.
And it's a non-starter for startups, because it's not hyped enough, not for a technical reason.
Many of them would be way better off with a standard traditional Spring app.
mkay
I won’t deny there’s a lot of bad Java written, but IMO it’s actually one of the best languages for a startup if any of your code needs good performance.
That's 100% a project setting you can turn on or off though, it's like -Wall in C++ projects.
I'm also a back-end dev who's worked in fintech and IME, TypeScript is a great choice.
From my experience with python, none of its type checking libraries are complete enough.
I know, it’s just not realistically up to me. I depend on the team/company culture which I’d rather not have in the picture - I’ve already gone through trying to fight the sea of unknown/any.
Sometimes I'm questioning if it has the potential to become more popular in the future if AI becomes adept at translating Python projects to Nim.
Zig, otoh might be worth another look.
IEEE's methodology[2] is sensible given what's possible, but the data sources are all flawed in some ways (that don't necessarily cancel each other out). The number of search results reported by Google is the most volatile indirect proxy signal. Search results include everything mentioning the query, without promising it being a fair representation of 2025. People using a language rarely refer to it literally as the "X programming language", and it's a stretch to count all publicity as a "top language" publicity.
TIOBE uses this method too, and has the audacity to display it as popularity with two decimal places, but their historical data shows that the "popularity" of C dropped by half over two years, and then doubled the next year. Meanwhile, actual C usage didn't budge at all. This method has a +/- 50% error margin.
[1]: https://redmonk.com/rstephens/2023/12/14/language-rankings-u... [2]: https://spectrum.ieee.org/top-programming-languages-methodol...
Edit: I see they raise this point at length themselves in TFA.
Ada seems pretty popular on Arch
This data is kinda worthless for popularity contests, since they may get picked up by AUR packages, but it gives solid insight into which languages are foundational.
I wish the same was available for other distros
You can do the same with docker images
curl -s https://hub.docker.com/v2/repositories/library/python/ | jq -r ".pull_count"
8244552364
curl -s https://hub.docker.com/v2/repositories/library/golang/ | jq -r ".pull_count"
2396145586
curl -s https://hub.docker.com/v2/repositories/library/perl/ | jq -r ".pull_count"
248786850
curl -s https://hub.docker.com/v2/repositories/library/rust/ | jq -r ".pull_count"
102699482
"Top Languages" doesn't mean "better" nor does it mean "best"https://www.ghs.com/products/ada_optimizing_compilers.html
https://www.ptc.com/en/products/developer-tools/apexada
https://www.ddci.com/solutions/products/ddci-developer-suite...
http://www.irvine.com/tech.html
Perl is almost as active as Javascript. And more useful than Python.
I write Perl to do all sorts of things every week. It's strange it's not in the top 5 list.
The general purpose programming languages today are still: Python, Java, and Perl. Make of this whatever you will.
Larry Wall at one point said that if you make something very specific to a use case (like awk, sed, PHP, etc.), it sort of naturally starts to fall out of general purpose use.
It's just that Kotlin, Rust, Go, SQL, Julia, Javascript, etc. are not general purpose programming languages.
Yes, that does not show us how much code is running out there, and some companies might have huge armies with very low churn, so the COBOL stacks in banks don't show up, but I can't think of a more useful and directly measurable way of understanding a language's real utility.
43% of all banking systems.
95% of all US ATM transactions.
80% of all in-person credit card transactions.
96% of travel bookings.
This may very well dramatically change in the next few years with such an emphasis on enterprise AI tools to rewrite large COBOL repositories. [2]
[1] https://www.pcmag.com/articles/ibms-plan-to-update-cobol-wit...
[2] e.g. Blitzy https://paper.blitzy.com/blitzy_system_2_ai_platform_topping...
But I asked for a bank statement a few years old from my old savings account, and it took two weeks to arrive, printed in monospace dot matrix.
Or the betting company I was a customer of, which suspends betting every day at 6:30am for an hour of daily maintenance. Ironically, they would accept bets for football matches played at that time, but the system was shut down.
I suspect both are run on COBOL.
LVL3 is pure COBOL. It has been recently deprecated, but there are many banks who own the code and are still self-hosting it, along with its IBM green screen support.
Vision is a Java front end in front of an updated COBOL backend. When your reputation is based on your reliability and long-term code stability, at what point do you risk making the conversion, versus training new developers to work on your system?
https://www.linkedin.com/jobs/view/business-analyst-afs-visi...
[0] https://www.theregister.com/2022/04/11/gds_gets_over_histori...
If the cobol is still there, it’s not due to risk. If anything, the cobol is a much higher operational risk than replacing it.
But it's very abstracted, part of our main product offering WAS abstracting it. On top of our ready to use applications, we offered APIs for higher-level data retrieval and manipulation. Under the hood, that orchestrates mainframe calls.
But even then there could be more levels of abstraction. Not every bank used screen-level mainframe access. Some used off-the-shelf mainframe abstractors like JxChange (yes, there's a market for this).
Fintech would be even more abstracted, I imagine. At that point you can only interact with the mainframe a few levels up, but it's still there. Out of sight.
> Working in investment banking, I never saw a single COBOL application
What was the back office settlement or wire transfer system written in? There is a good chance that some part of them was written in COBOL. And while Bloomberg terminals are a vendor product, for a bloody long time many of their screens had some COBOL. Also, lots of quantitative software at i-banks uses LINPACK or BLAS, which are written in FORTRAN.
But "used in" doesn't mean that it's actively being developed by more then a tiny team for maintaining it.
As the graph we're commenting on is mostly talking about popularity/most used, it's never going to rate higher, because for every one COBOL dev there are more than 100 Java devs employed by the same company.
Pretty much every business I've worked at to date has had such legacy software, which was inevitably still used in some contexts.
It's not always obvious, because - following with the previous example numbers - only 1-2 Java devs will have to interact with the legacy software again, hence from the perspective of the remaining 98, Cobol doesn't exist anymore.
J2EE would be late 90s and 2000s.
… yes CI would be a lot of these downloads, but it’s at least a useful proxy
Also then you're looking at which languages were popular in the past whereas the interesting stat is which languages are being used to start new projects.
If I'm trying to figure out which language to learn next, knowing what I can get paid for might be more useful, even if it's not that "interesting".
If lots of projects are starting up in Rust, but I can't get interviews because nobody is advertising, how useful is learning Rust?
So in which meaning do you use 'popular'?
> we used Rust quite a lot at my previous company but we didn't advertise for Rust developers at all.
How did you find Rust developers when you needed to hire?

That would certainly be the case, if it were not for the fact that [fake job postings][1] are a thing.
[1]: https://globalnews.ca/news/10636759/fake-job-postings-warnin...
i.e. are you assuming (insinuating) that jobs for some programming languages are more likely to be fake?
It feels like that metric misses "utility" and instead comes from a very American (or maybe capitalistic is better) mindset.
What about Max/MSP/Jitter? Huge impact in the music scene, probably a very small number of jobs available, so it'd rank fairly low while it's probably the top media/music language out there today. There are tons of languages that provide "the most utility for their domain" yet barely have any public job ads at all.
I think such metric would be useful to see the "employability of someone who knows that language" if anything, but probably more pain than gain to link "# of job ads" with "utility".
Everywhere else people hired me because they knew who I was and what I could do and so in place of an "interview" maybe I grab lunch with some people I know and they explain what they want and I say yeah that sounds like a job I'd take and maybe suggest tweaks or focus changes. No shortlist of candidates, no tech interview, no tailoring a CV to match an advert. Nothing -> Lunch or Drinks -> Job offer.
So that can cause some distortion, especially for the niche languages where there are like six experts and you know them - an advert is futile there.
I like this:
https://madnight.github.io/githut/#/pull_requests/2024/1
It gives you a count of public repos on GitHub by language used, going back to 2012.
Use the "right"/better tool from the toolbox, the tool you know best, and/or the tool that the customer wants and/or makes the most money. This might include Ada[0] or COBOL[1]. Or FORTH[2] or Lua[3]. Popularity isn't a measure of much of anything apart from SEO.
0. https://www2.seas.gwu.edu/~mfeldman/ada-project-summary.html
1. https://theirstack.com/en/technology/cobol
2. https://dl.acm.org/doi/pdf/10.1145/360271.360272
3. https://www.freebsd.org/releases/12.0R/relnotes/#boot-loader
Right now, it's apparent to me that LLMs are mostly tuned in the programming space for what n-gate would call "webshit", but I think it is a clear (to me) evolutionary step towards getting much better "porting" ability in LLMs.
I don't think that is in the priority list of the LLM companies. But I think it would be a real economic boon: certainly there is a backlog of code/systems that needs to be "modernized" in enterprises, so there is a market.
Ultimately I wonder if an LLM can be engineered to represent code in an intermediate form that is language-independent to a large extent, and "render" it to the desired language/platform when requested.
That's not a given... I think LLMs are amazing!
In all of these Python is artificially over-represented. Search hits and Stackoverflow questions represent beginners who are force fed Python in university or in expensive Python consultancy sessions. Journal articles are full of "AI" topics, which use Python.
Python is not used in any application on my machine apart from OS package managers. Python is used in web back ends, but is replaced by Go. "AI" is the only real stronghold due to inertia and marketing.
Like the TIOBE index, the results of this so called survey are meaningless and of no relevance to the jobs market.
Besides AI development, Python is used heavily in data processing and data science, also in writing bots of any kind, and as a glue language to do numerous tasks. It is true that it is being replaced by Go in web backends, but it still sees heavy use in that too. Moreover, Python is the only language that many AIs can interactively use in their chat sessions.
I am interested in Zig, but until they can guarantee a stable point for an extended period of time I have limited interest. Same way with Rust gamedev I'm keeping an eye on Bevy but want it to hit 1.0. Some things pre-1.0 is fine, but more core pieces of dev like the language often warrant requiring greater stability guarantees.
I switched from Rust to Zig six months ago for implementing a database and have no regrets, but I just don't think anyone would use it for writing backend code or other similar smaller things.
I used to think similarly about Rust before, though, so I don't really know anything.
I've been thinking about starting a project in Zig rather than Go lately, even though I am skilled at Go. I really like working with more cracked(?) or simply crazy people willing to learn an esoteric language, and Zig fits the particular needs I have (mostly very nice C interop).
Would you recommend? How are the average zig contributors vs something like go?
It is definitely an excellent language for doing personal projects imo
I never understood why Go is brought up next to Rust so often, when it has barely any unique qualities: it's a high-level GC'd language with a dumb type system that outputs a single binary... of which there are 1000 other examples. At least it has good tooling, I guess.
> I switched from Rust to Zig six months ago for implementing a database and have no regrets, but I just don't think anyone would use it for writing backend code or other similar smaller things.
I don't have a horse in this race, but have you shared more about this decision? It would make for a good blog post and an even better HN discussion.

Might consider writing something if my project ends up being useful.
Its safety story is basically what Modula-2 (1978) and Object Pascal (1986) already had, but now it gets done with curly brackets instead of a begin/end language.
UAF is an issue, and the tooling to tackle this issue is exactly the same that C and C++ have had available for the last 30 years, give or take.
It will be another language to talk about; however, I doubt it will ever make it into the mainstream, like having platform vendors and console devkits care that Zig exists.
I don't think it's aiming for mainstream adoption anyway, it's a very specific niche.
D once upon a time was also hyped due to its Facebook usage, Remedy game engine tooling.
Also, has Zig already gone to space?
https://forum.dlang.org/thread/10614fc$273$1@digitalmars.com
Or used by car companies?
https://forum.dlang.org/thread/evridmtwtnhhwvorohyv@forum.dl...
Anyway, I don't expect any of them to grow beyond their niche userbase.
D has lost its momentum, and Zig isn't really interesting as 21st century language in the AI tooling age.
Elixir behind OCaml? Possible, I guess, but I know of several large Elixir shops and I haven’t heard much of OCaml in a while.
https://en.wikipedia.org/wiki/Declarative_programming#Domain...
To prove me wrong, show an HTML program that does any kind of computation whatsoever.
There’s not one. It’s not Turing complete. I doubt it’s even Turing partial. It’s a markup language, not a PL.
Can these languages do everything or even most computations you would be interested in doing in a computer? Of course not. But why should the definition be restricted to languages that can do everything?
[1] what is a low or high level language? Strongly typed language?
If it’s a programming language, so is Markdown. (It’s not, either.)
<picture>
  <source media="(prefers-color-scheme: dark)" srcset="logo_dark.svg">
  <img src="logo.svg" alt="logo" width="48">
</picture>
I believe a reasonable way to categorize languages as programming or not is simply: what is its primary use case? HTML's last two letters tell us exactly that it is not a programming language.
The reason this debate is so strange is because some people think it's gatekeeping to say someone who writes html for a living isn't "programming". It's nonsense.
When asked what they do for a living, they said they were programmers.
Then the police officer went:
– Oh, I see. HTML.
Intuitively I'd say yes.
In most jobs I've been in, the ratio of backend/system devs to frontend devs has been from 3:1 to 20:1 depending on the size. Granted, I'm on the backend side and would choose companies accordingly, but still.
Even for web first services, there will be a flurry of people who won't touch a line of html/rendering.
Imagine a PHP backend providing an API for an app. The only HTML ever produced will be the admin backend, and perhaps auth forms for special cases. The surface of the API will produce objects serialized to JSON, and the vast majority of the PHP will be business logic to talk to external services and do what the service is paid for.
Some might not like the language, but whole businesses will run on PHP, with a dedicated react or next.js frontend that only talks to the PHP via API.
Job focused Bachelor courses and curricula highly outnumber rigorous CS courses like the ones you are likely to find in MIT, UCLA-B, IISc, IITs, Oxford, UCL, Tsinghua, Peking, etc.
And that is very okay. (Modern) Java and .NET are excellent choices. There is nothing wrong with them.
Also the VSCode extension for .NET has the same license as VS.
They work, have great tooling, and do whatever is required for customers.
Is it something to do with frameworks like React?
Now that CoffeeScript is gone I would like to see all Ruby become Python.
I would have assumed it might be JS and more specifically React -- isn't that what you often get if you ask an LLM to "make an app that does such-and-such"?
(Experimental anecdata: I just asked Gemini to write an app and it gave me a single HTML file with embedded JS, no TS or React, Tailwind for CSS.)
(Edit to add: then I asked it to "write some code" instead of "write an app", and this time it went with Python.)
The choices of JS and Python were pretty solid for the prompts I gave. Maybe the LLM is, well, making a reasonable decision in context, rather than just defaulting to Python!
"LLMs prefer Python" is an over-simplification. Python was already very popular, so I agree that LLMs are likely to entrench that popularity, but that doesn't mean Python will grow into other areas where it wasn't being used much before.
Of course it is an over-simplification. Should I do an empirical scientific study before I can reply that MAYBE LLMs seem to prefer Python? I was talking from my own personal experience.
Are you using a LLM to write your replies? Because they seem very odd to me.
I didn't use any LLMs for those comments, but that comment style feeds into the training data, so it's not surprising if LLMs like to copy it!
This list calling Arduino a programming language zaps a bunch of credibility from it. It's C++.
OTOH, if you have a programmer that's been only using the Arduino IDE and its libraries, they probably haven't learned any C++ beyond loops and procedures and other atoms. It's technically C++ underneath, but a tiny subset in practice.
Oh, and actual corporations and labs do use them, though. It's really a dead simple way to automate some electronics with literally off-the-shelf tools. Almost any alternative will be extremely niche, require invasive SDKs, and probably either cost 100 times more or need shipping overseas, or both. Much like the Raspberry Pi that you can get in your corner store with ready-to-go code easy to find, while "professional" SBCs are behind crazy hurdles with outdated software and no documentation.
Nowadays CLR seems to have changed meaning to C# Language Runtime.
I'm not a fan personally, but it's easy to find devs for it, so it's popular in firms where language choice is a commodity/utility.
I've been using Java since it came out in 1996, alongside many other programming languages like C#, TypeScript, C++, and SQL (PL/SQL and Transact-SQL mostly).
Also Android is all about Java, even if Kotlin is the new darling and it uses its own runtime (ART), everything around it is based on the Java ecosystem, the IDE, build tools, most libraries coming out of Maven Central.
Also for backend services Java is a pretty solid option. Just compile a monolithic JAR and 'run' that anywhere, which is much more robust than some node.js app cobbled together from tens of thousands of leftpad-equivalent npm packages ;)
So, so much 00s enterprise legacy code is written in Java. In the early/mid 10s I saw a huge push to rewrite that code, though.
> In the early/mid 10s I saw a huge push to rewrite that code, though.
In what language?

The financial sector, insurance sector, healthcare sector all jumped on Java a couple of decades ago, and they have massive repositories of Java code, and are actively migrating their COBOL code to Java.
That was two decades ago – almost a generation! Interesting to think that some of those systems would now be considered “legacy”.
That would make OP's counter re Netflix relevant. I don't understand your point.
In its day, a lot of 'cool' companies used COBOL, because it was an OK solution back then. So to say that, today, Netflix is cool and uses Java, thus Java is different and still cool, is not valid. It does not invalidate the point. It is the same situation, just decades later.
Maybe I shouldn't have conflated SAP, but they all seem to be part of the same giant ecosystem of 'current/entrenched' solutions that 'we use because we have to, not because it is better'. Not unlike COBOL.
I'm not saying COBOL is a bad language, far from it, which the billions of lines of code running in production probably also attest to. The first COBOL program I ever edited, in 2008, was last edited in 1987. It had run flawlessly every day for 20 years. COBOL was invented to allow business people to express business logic programmatically, which is also why it has such a large footprint in finance, insurance, etc.
I'm not saying Java is a bad language either. Java is great, much like COBOL was, and like COBOL, Java still evolves today. It has flaws, but so does every other language, and most of the flaws in Java are understood. There is literally also nothing you can't do in Java that you can do in <insert fashionable language of the year>.
We probably shouldn't write web frontends in Java, and most people figured that out a decade or more ago, including the financial institutions.
The typical flow in a financial institution is something like "Angular (in some form) => Java backend => COBOL on mainframe => DB2", where "=>" can be anything from REST to message queues (I was tempted to write MQ, as most will likely be IBM MQ, but others exist and are used).
Most companies migrating away from the mainframe (and thereby often COBOL) have also started implementing microservices instead of giant monoliths, which is what has kept the mainframes of the world running for so long. Most companies I've worked with have had 45,000-90,000 COBOL programs running every night, with almost as many running on demand, and each and every one depends heavily on the output of the previous part of the chain.
Those giant chunks are now being migrated to microservices with well-defined couplings, meaning that when it eventually becomes time to migrate away from Java, it will be somewhat easier, as you can eat the elephant one mouthful at a time and not have to reimplement 50+ years of legacy code and conventions in one go.
I've said it before, and will gladly say it again, if you choose a COBOL career today, you will most likely never be unemployed for long until you retire.
What do you mean by this? To me it sounds like people are saying they are both "old" languages, but I don't know what you mean.
I work in a shop that has lots of both Java and COBOL. We are not "actively migrating" COBOL code to Java. It looks like mainframes will continue to exist for decades to come (i.e. >20 more years). Obviously, brand new applications here are not written in COBOL.
Java was the first language I learned in my CS degree, I still think this was a sensible choice by the CS department, but I don't think I've written a single piece of Java since I left 10 years ago!
It seems like a lot of Java usecases may be big and important but kinda isolated! Something about where they sit in the economic value chain perhaps?
Edit: And maybe some Dart and Kotlin too.
what are the main programming languages used at Google
Result:

AI Overview: Google utilizes a diverse set of programming languages across its various products and services. The main programming languages used at Google include:

- C++: Widely used for performance-critical applications and system-level programming, such as in the core search engine, Google Chrome, and other backend infrastructure.
- Java: Essential for Android app development and significant portions of Google's backend systems.
- Python: Employed for a wide range of tasks including scripting, data analysis, machine learning, and web development (e.g., YouTube).
- JavaScript: Fundamental for web application development and frontend interactions across Google's web-based services.
- Go (Golang): Google's own open-source language, increasingly used for cloud-based projects, microservices, and network programming due to its efficiency and concurrency features.
- SQL: Crucial for managing and interacting with databases, which are integral to almost all data-driven applications at Google.
While these are the primary languages, Google also utilizes other languages such as Rust (for projects like Fuchsia OS), Kotlin (for Android development), and Dart (for Flutter framework development) for specific use cases and projects. The choice of language often depends on the project's requirements for performance, scalability, development time, and existing infrastructure.
So it looks like the main ones I missed were Rust and SQL.
Dang! I should have thought of SQL, at least for their IT ops, but I was thinking only about their customer-facing apps.
Anyway ...
There, the world is not like at Google. SAP, Java and .NET are what people work with.
In terms of programming languages, Google is very much a microcosm of the industry.
(.net is the exception though. Not much of that at Google).
You write something and it stays written, mostly because everyone moves the logic far away from accidental complexities, so maintenance is very low.
Clojure seems more popular than other FP languages such as Haskell or even F#
Both Java the language and the JVM are great. A lot of the important work for the JVM just landed. I am not even sure there is anything really missing anymore. But the whole ecosystem is so vast I wonder if anyone would want to just carve out a subset of Java.
No Zig, Crystal, or Odin, but Julia and Elixir are there, just without numbers.
For game engine developers working on games targeting Android, especially commercial, mature games, you have no choice but the Unity engine. And take a look at Google announcing [a donation of $250,000 to the Rust Foundation](https://www.linkedin.com/feed/update/urn:li:activity:7376353...), yet there are still constraints on using Rust libraries in Android apps.
Sure. Until we need to. Then we face some apparently tiny concern, which is actually deeply intertwined with the rest of this whole mess, and we are ready for a ride down the rabbit hole.
> most developers today don’t pay much attention to the instruction sets and other hardware idiosyncrasies of the CPUs that their code runs on, which language a program is vibe coded in ultimately becomes a minor detail.
This may be very misguided on my part, but I have the feeling these are two very different cases. OK, not everyone is an ffmpeg-level champion who will thrive in code-golfing ASM till the last drop of cycle gain.
But there are also probably reasons why third-generation programming languages have lasted without any subsequent proposal completely displacing them. It's all about a tradeoff of expressiveness and precision: what we want to keep in the focus zone, and what we want to delegate to mostly uncontrolled details.
If, to go faster, we need to get rid of the transparent glasses, we will need very sound and solid alternative probes to report what's going on ahead.
If it was even slightly true then we wouldn’t be generating language syntax at all, we’d be generating raw machine code for the chip architectures we want to support. Or even just distributing the prompts and letting an AI VM generate the target machine code later.
That may well happen one day, but we’re not even close right now
You absolutely can, and probably _should_, leverage AI to learn many things you don't understand at all.
Simple example: try picking up or learning a programming language like C with or without LLMs. With is going to be much more efficient. C is one of the languages that LLMs have seen the most, they are very, very good at it for learning purposes (also at bug hunting).
I have never learned as much about computing as in the last 7/8 months of using LLMs to assist me at summarizing, getting information, finding bugs, explaining concepts iteratively (99% of Software books are crap: poorly written and quickly outdated, often wrong), scanning git repositories for implementation details, etc.
You people keep committing the same mistake over and over: there are a million uses for LLMs, and instead of defining the context of what you're discussing, you conflate everything with vibe coding, which ultimately makes your comments nonsense.
If you want to use LLMs for learning, that's altogether a different proposition.
and his shit actually works by the way, topping leaderboards on hackerone and having a decent amount of clients.
your colleagues might be retarded or just don’t know how to use llms
Would you understand why some code is less performant than it could be if you've never written and learned any C yourself? How would you know if the LLM output is gibberish/wrong?
They're not wrong; it's just not black-and-white. LLMs happen to sometimes generate what you want. Often times, for experienced programmers who can recognize good C code, the LLMs generate too much garbage for the tokens it costs.
I think some people are also arguing that some programmers ought to still be trained in and experienced with the fundamentals of computing. We shouldn't be abandoning that skill set completely. Someone will still need to know how the technology works.
The parent I answered said you shouldn't use LLMs for things you don't understand while I advocate you should use them to help you learn.
You seem to describe very different use cases.
In any case, just to answer your (unrelated to mine) comment, here[1] you can see a video of one of the most skilled C developers on the planet finding very hard to spot bugs in the Redis codebase.
If all your arguments boil down to "lazy people are lazy and misuse LLMs" that's not a criticism of LLMs but of their lack of professionalism.
Humans are responsible for AI slop, not AI. Skilled developers are enhanced by such a great tool that they know how and when to use.
How do people using LLMs this way know that the generated code/text doesn’t contain errors or misrepresentations? How do they find out?
Someone else's interpretation is not the author's saying. :)
Since the tone is so aggressive, it doesn't feel like it would be easy to build any constructive discussion on this ground.
Acting prudently is not blind rejection, the latter being not wiser than blind acceptance.
Excerpted from Tony Hoare's 1980 Turing Award speech, 'The Emperor's Old Clothes'... "At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me. "You know what went wrong?" he shouted--he always shouted-- "You let your programmers do things which you yourself do not understand." I stared in astonishment. He was obviously out of touch with present day realities. How could one person ever understand the whole of a modern software product like the Elliott 503 Mark II software system? I realized later that he was absolutely right; he had diagnosed the true cause of the problem and he had planted the seed of its later solution."
My interpretation is that whether shifting from delegation to programmers, or to compilers, or to LLMs, the invariant is that we will always have to understand the consequences of our choices, or suffer the consequences.
Applied to your specific example, yes, LLMs can be a good assistants for learning. I would add that triangulation against other sources and against empirical evidence is always necessary before one can trust that learning.
Have you heard about how much AI slop has been submitted as "bug" - but always turn out to be not a bug - to the curl project?
Some of mine:
* Converse with the LLM on deeper concepts
* use the `/explain` hook in VSCode for code snippets I'm struggling with
* Have it write blog-style series on a topic, replete with hyperlinks
I have gotten in some doom loops though when having it try to directly fix my code, often because I'm asking it to do something that is not feasible, and its sycophantic tendencies tend to amplify this. I basically stopped using agentic tools to implement solutions that use tech I'm not already comfortable with.
I've used it for summarization as well, but I often find that a summary of a man page or RFC is insufficient for deeper learning. It's great for getting my feet wet and showing me gaps in my understanding, but I always end up having to read the spec in the end.
Now we have an additional layer of abstraction, where we can instruct an LLM in natural language to write the high-level code for us.
natural language -> high level programming language -> assembly
I'm not arguing whether this is good or bad, but I can see the bigger picture here.
Also, almost all software has known unknowns in terms of dependencies that get updated constantly. No one can read all of their code. Hence, in real life, if you compile on different systems ("works on my machine") or again after some time has passed (updates to the compiler, OS libs, packages), you will get a different checksum for your build with unchanged high-level code. So in theory, given perfect conditions, you are right, but in practice it is not the case.
There are established benchmarks for code generation (such as HumanEval, MBPP, and CodeXGLUE). On these, LLMs demonstrate that given the same prompt, the vast majority of completions are consistent and pass unit tests. For many tasks, the same prompt will produce a passing solution over 99% of the time.
I would say yes there is a gap in determinism, but it's not as huge as one might think and it's getting closer as time progresses.
In my opinion, their target audience is scientists rather than programmers, and a scientist most often thinks of code as a tool to express his ideas (hence, perfect AI-generated code is kind of a grail). The faster he can express them, even if the code is ugly, the better. He does not care to reuse the code later most of the time.
My hint that scientists and not programmers are the target audience is that other things may trigger only one category but not the other; for example, they consider Arduino a language. This makes total sense for scientists, as most of the ones using Arduino don't necessarily know C++, but are proud to be able to code in Arduino.
For a professional programmer, code and what it does is the object of study. Saying the programmer shouldn’t look at the code is very odd.
But actually, producing easy-to-read code when you don't have specifications, because you don't know yet if the idea will work and you are discovering problems with that idea as you go, doesn't happen naturally.
You refactor all the time, but then something that you misunderstood becomes a concern, and you need to refactor everything again, and again and again. You lose much time, and research is fast-paced.
Scientists who spend too much time cleaning code often miss deadlines and deliverables that are actually what they need to produce. Nobody cares about their code, as when the idea is fully developed, other scientists will just rewrite better software with a full view of the problem. (Some scientists rewrite their full software when everything is discovered.)
I think a sensible goal for scientists would be easy-to-write code instead of easy-to-read code.
We are clearly not there yet, but I feel that the article is pushing in that direction, maybe to push research in that direction.
A long time ago there was an article from the creators of Mathematica or Maple, I don't remember which, that said something similar. The question was: why do we learn matrix operations at school, when (modern) tools are able to perform everything? We should teach matrix algebra at school and let students use the software (a little bit like using calculators). This would allow children to learn more abstract thinking and test way more interesting ideas. (If someone has the reference, I'm interested.)
I feel the article follow the same lines. But with current tools.
(of course I'm skipping the fact that Mathematica is deterministic in doing algebra, and LLMs are far from it)
They are indeed very different. If your compiler doesn't emit the right output for your architecture, or the highly optimized library you imported breaks on your hardware, you file a bug and, depending on the third party, have help in fixing the issue. Additionally, those types of issues are rare in popular libraries and languages unless you're pushing boundaries, which likely means you are knowledgeable enough to handle those type of edge cases anyway.
If your AI gives you the wrong answer to a question, or outputs incorrect code, it's entirely on you to figure it out. You can't reach out to OpenAI or Anthropic to help you fix the issue.
The former allows you to pretty safely remain ignorant. The latter does not.
I was thinking of learning some Spring Boot and creating a small project or two to reinforce what I've learned. However, it feels like tutorials for Spring Boot are of much lower quality compared to newer languages/frameworks like JS/React/Python. Oftentimes it's just a talking head over a PowerPoint presentation talking for 30 minutes.
Could people recommend me a good tutorial for spring boot (or anything java that is being used in enterprises)?
My opinion: learn to create Android apps in Java. Tutorials are good and you get a new set of skills (if you don't have them already). After that, focus on learning POJOs, which are fundamental knowledge in Java.
Everyone writes stuff in Java/C++ where I work, but Spring Boot is encouraged less and less because of the bloat to debug and performance troubles.
> I've been re-learning java using the excellent University of Helsinki's MOOC course.
This sounds interesting! Can you share a link?

How does searching "Helsinki MOOC java" not immediately give you the result you are after?
https://www.youtube.com/watch?v=eFrggyDXdUk&list=PL2s7AeEJ2f...
And I mean, I do feel like as long as they aren't actively harming the "ethos" of Hacker News, we can cut each other a little slack.
I feel like I have sometimes done a disservice like this to HN too, where I ask for links, and maybe they just wanted to confirm this was the right course, or they might be coming home from a job tired and didn't think about this too much, y'know?
But I mean, I also understand your standpoint that you want less clutter on HN, which is why I feel a bit :/
From the perspective of a useful thread, I agree with you
Yeah I can also understand it, but I just saw their comments and scrolled down to find it:

> Nah, DotNET is amazing these days. At the risk of starting a holy war, it is neck-and-neck with Java, and I say that as a Java fanboi. I think it is good to have healthy competition between languages (and ecosystems), like C++ and Rust (and a little bit Zig) or GCC and Clang or Java and DotNet or Python and Ruby or NodeJS and Deno. Plenty of people are compiling and deploying to Linux after DotNetCore went open source. Plus, you can use JetBrains Rider, which is a cross-platform IDE for C#, from the makers of IntelliJ.
It just seems that their use of words like boi etc. makes them (genz?-ish)
I am genz too (maybe younger than them, still in HS) but yeah maybe they just write one liners which can be quite common but I see that more on the reddit kind of things and not hackernews lol. I can definitely see our generation's attention span being cooked that when I write remotely as long as this, they say nah I ain't reading it. So well, how are they gonna write it for the most part :/
It might be a bot but then again, what is the point? The point I see of HN points is that I might create a new account and be the same guy and kind of accrue them back because I once had it y'know while being myself not writing some one liners lol.
The thing I like about HN is that I have talked to soooo many people that I admire, whether it's a one-liner from Jose Valim or etc., and I am happy that Hacker News exists to a somewhat degree :>
Like just out of curiosity, has someone ever got any job or smth through HN in the sense that they had their karma in part of their resume and the company was impressed lol, I can see it to a somewhat degree in maybeee startups
No hint on how old and moldy it is. Does it teach a relatively recent Java or 1995 Java?
So asking if that is the right one doesn't seem out of line.
Supposedly enterprises are finally starting to upgrade to Java 17 and 21.
Example : How Netflix Uses Java - 2025 Edition
Any idea how that could be explained?
Now, time for a Metamucil and a nice nap before my standup.
https://survey.stackoverflow.co/2024/technology#most-popular...
Even with a big uptick in Python and Java due to AI, I don't see Javascript+Typescript losing that much ground year-over-year.
C with some sort of module system and the tooling of Rust would be really nice.
People will also say, "Python is productivity gains because it's the easiest to write," but it's only the easiest to write because you know it the best.
Don't understand why Ruby is so much less popular. Seems like Python is de facto "first language" (it was for me), but I would advocate for Ruby.
;o)
In an alternate universe, if LLM only had object oriented code to train on, would anyone push programming forward in other styles?
I had looked at it recently while checking out C-like languages. (Others included Odin and C3.) I read some of the Hare docs and examples, and had watched a video about it on Kris Jenkins' Developer Voices channel, which was where I got to know about it.
Just one note for anyone else wanting to check it out:
There are a few sections in the tutorial which are empty and marked as TODO. E.g. "Using default values" and "Handling allocation failure" (the empty sections seen so far, there may be others below).
Still going to check the language out.
Not only that, they also tend to answer using the more popular languages or tools even when it is NOT necessary. And when you call it out on it, it will respond with something like:
"you are absolutely right, this is not necessary and potentially confusing. Let me provide you with a cleaner, more appropriate setup...."
Why doesn't it just respond that way the first time? And the code it provided works, but it's very convoluted. If it wasn't checked carefully by an experienced dev asking the right question, one would never get the second answer, and then that vibe code will just end up in a git repo and deployed all over the place.
I get the feeling some big corp may have just paid some money to have their plugin/code show up in the first answer even when it is NOT necessary.
This could be very problematic. I'm sure people in advertising are all licking their chops over how they can capitalize on that. If the current ad industry is bad, wait until that is infused into all the models.
We really need ways to
1. Train our own models in the open, with weights and the data they are trained on. Kinda like the reproducible build process that Nix is doing for building repos.
2. Ways to debug the model at inference time. The <think> tag is great, and I suspect not everything is transparent in that process.
Is there something equivalent of formal verification for model inference?
Yes.
But today the only two reasons to use niche languages are[0]: 1) you have existing codebases or libraries in that language, or 2) you're facing quite domain-specific problems where the domain experts all use that language.
In either case, you won't just use Java because LLMs are good at Java.
[0]: besides for fun or 'for resume'