frontpage.

Show HN: Deterministic NDJSON audit logs – v1.2 update (structural gaps)

https://github.com/yupme-bot/kernel-ndjson-proofs
1•Slaine•2m ago•0 comments

The Greater Copenhagen Region could be your friend's next career move

https://www.greatercphregion.com/friend-recruiter-program
1•mooreds•3m ago•0 comments

Do Not Confirm – Fiction by OpenClaw

https://thedailymolt.substack.com/p/do-not-confirm
1•jamesjyu•3m ago•0 comments

The Analytical Profile of Peas

https://www.fossanalytics.com/en/news-articles/more-industries/the-analytical-profile-of-peas
1•mooreds•3m ago•0 comments

Hallucinations in GPT5 – Can models say "I don't know" (June 2025)

https://jobswithgpt.com/blog/llm-eval-hallucinations-t20-cricket/
1•sp1982•3m ago•0 comments

What AI is good for, according to developers

https://github.blog/ai-and-ml/generative-ai/what-ai-is-actually-good-for-according-to-developers/
1•mooreds•3m ago•0 comments

OpenAI might pivot to the "most addictive digital friend" or face extinction

https://twitter.com/lebed2045/status/2020184853271167186
1•lebed2045•5m ago•2 comments

Show HN: Know how your SaaS is doing in 30 seconds

https://anypanel.io
1•dasfelix•5m ago•0 comments

ClawdBot Ordered Me Lunch

https://nickalexander.org/drafts/auto-sandwich.html
1•nick007•6m ago•0 comments

What the News media thinks about your Indian stock investments

https://stocktrends.numerical.works/
1•mindaslab•7m ago•0 comments

Running Lua on a tiny console from 2001

https://ivie.codes/page/pokemon-mini-lua
1•Charmunk•8m ago•0 comments

Google and Microsoft Paying Creators $500K+ to Promote AI Tools

https://www.cnbc.com/2026/02/06/google-microsoft-pay-creators-500000-and-more-to-promote-ai.html
2•belter•10m ago•0 comments

New filtration technology could be game-changer in removal of PFAS

https://www.theguardian.com/environment/2026/jan/23/pfas-forever-chemicals-filtration
1•PaulHoule•11m ago•0 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
2•momciloo•12m ago•0 comments

Kinda Surprised by Seadance2's Moderation

https://seedanceai.me/
1•ri-vai•12m ago•2 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
2•valyala•12m ago•0 comments

Django scales. Stop blaming the framework (part 1 of 3)

https://medium.com/@tk512/django-scales-stop-blaming-the-framework-part-1-of-3-a2b5b0ff811f
1•sgt•12m ago•0 comments

Malwarebytes Is Now in ChatGPT

https://www.malwarebytes.com/blog/product/2026/02/scam-checking-just-got-easier-malwarebytes-is-n...
1•m-hodges•12m ago•0 comments

Thoughts on the job market in the age of LLMs

https://www.interconnects.ai/p/thoughts-on-the-hiring-market-in
1•gmays•13m ago•0 comments

Show HN: Stacky – certain block game clone

https://www.susmel.com/stacky/
2•Keyframe•16m ago•0 comments

AIII: A public benchmark for AI narrative and political independence

https://github.com/GRMPZQUIDOS/AIII
1•GRMPZ23•16m ago•0 comments

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
2•valyala•17m ago•0 comments

The API Is a Dead End; Machines Need a Labor Economy

1•bot_uid_life•18m ago•0 comments

Digital Iris [video]

https://www.youtube.com/watch?v=Kg_2MAgS_pE
1•Jyaif•19m ago•0 comments

New wave of GLP-1 drugs is coming–and they're stronger than Wegovy and Zepbound

https://www.scientificamerican.com/article/new-glp-1-weight-loss-drugs-are-coming-and-theyre-stro...
5•randycupertino•21m ago•0 comments

Convert tempo (BPM) to millisecond durations for musical note subdivisions

https://brylie.music/apps/bpm-calculator/
1•brylie•23m ago•0 comments

Show HN: Tasty A.F. - Use AI to Create Printable Recipe Cards

https://tastyaf.recipes/about
2•adammfrank•24m ago•0 comments

The Contagious Taste of Cancer

https://www.historytoday.com/archive/history-matters/contagious-taste-cancer
2•Thevet•25m ago•0 comments

U.S. Jobs Disappear at Fastest January Pace Since Great Recession

https://www.forbes.com/sites/mikestunson/2026/02/05/us-jobs-disappear-at-fastest-january-pace-sin...
1•alephnerd•26m ago•1 comments

Bithumb mistakenly hands out $195M in Bitcoin to users in 'Random Box' giveaway

https://koreajoongangdaily.joins.com/news/2026-02-07/business/finance/Crypto-exchange-Bithumb-mis...
1•giuliomagnifico•26m ago•0 comments

Why Is Python So Popular in 2025?

https://blog.jetbrains.com/pycharm/2025/09/why-is-python-so-popular/
45•rbanffy•4mo ago

Comments

bgwalter•4mo ago
Most people don't run any Python programs on their machines except for OS package managers. Google and others moved to Go, and the Python job market does not reflect these statistics at all.

Python is well marketed, with dissenting voices silenced, de-platformed and defamed with PSF involvement. That way many users think the Python ruling class are nice people. It is therefore popular among scientists (who buy expensive training courses) and students (who are force fed Python at university).

It has a good C-API, which is the main reason for its popularity in machine learning. Fortunately for Python, other languages do not take note and insist on FFIs etc.

EDIT: The downvotes are ironic given that Python needs to be marketed every three days here with a statistic to retain its popularity. If it is so popular, why the booster articles?

ASalazarMX•4mo ago
Yeah brother, down with Big Python!
dalke•4mo ago
I work in a chemistry research field. Most people I know run Python programs for their research. No one I know uses Go. I only know a handful who use Java. Rust and Julia are oddities that appear occasionally.

Sure, we have very different experiences. But that also means that unless you can present strong survey data, it's hard to know if your "Most people" is limited to the people you associate with, or is something more broadly true.

The PSF overlap with my field is essentially zero. I mean, I was that overlap, but I stopped being involved with the PSF about 8 years ago when my youngest was born and I had no free time or money. In the meanwhile, meaningful PSF involvement became less of a hobbyist thing and something more corporatized ... and corporate friendly.

> scientists (who buy expensive training courses)

ROFL!! What scientists are paying for expensive training courses?!

I tried selling training courses to computational chemists. It wasn't worth the time needed to develop and maintain the materials. The people who attended the courses liked them, but the general attitude is "I spent 5 years studying <OBSCURE TOPIC> for my PhD, I can figure out Python on my own."

> who are force fed Python at university

shrug I was force fed Pascal, and have no idea if Wirth was a nice person.

> main reason for its popularity in machine learning

I started using Python in the 1990s both because of the language and because of the ease of writing C extensions, including reference counting gc, since the C libraries I used had hidden data dependencies that simple FFI couldn't handle.

I still use the C API -- direct, through Cython, and through FFI -- and I don't do machine learning.

> If it is so popular, why the booster articles?

It's an article about a company which sells a Python IDE. They do it to boost their own product.

With so many people clearly using Python, why would they spend time boosting something else?

internetter•4mo ago
I continue to believe that python is only still popular for the ecosystem effect. Students are taught it, a bunch of libraries were written for it, now everyone keeps using it.

But it's syntactically weak? Python itself is slow? pip is awful (although uv is pretty good). Sometimes I am forced to write Python because the packages I need are written for it, but I always hate it.

paulddraper•4mo ago
It's syntactically strong.

AFAIK, it's the preferred language for lots of leetcode, e.g. Advent of Code. The practical expressivity and speed of dev is strong.

gloxkiqcza•4mo ago
Well it’s as close to executable pseudocode as it gets, which I would say is both a positive and a negative.
internetter•4mo ago
Just for example, the lack of explicit variable defs. I create a variable and update it later. I rename the variable but forget to change it where it is updated. No errors produced in the IDE because the place where it is updated is now just creating the variable instead so all the references still work. `let` is a good keyword.
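A minimal sketch of the hazard being described (illustrative, not from the thread): after a rename, the missed assignment silently becomes a fresh local, and nothing complains at definition time.

  def largest(numbers):
      best = numbers[0]          # renamed from `result`...
      for n in numbers:
          if n > best:
              result = n         # ...but this assignment was missed; it now just creates a new name
      return best                # silently returns the first element instead of the maximum

  print(largest([3, 9, 4]))      # 3 -- no error, just a wrong answer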

It drives me crazy how everything just... blocks. Parallelization is a disaster.

I won't comment much on indentation. All the string formatting syntaxes are crazy. Private names managed by `_`. The type system still mostly sucks. Etc., etc.

In my experience it is alright to write short scripts but complexity very quickly balloons and python projects are very often unwieldy.

I would write a script in python, but only if I was positive I would never need to run or edit it again. Which is very rarely the case.

BeetleB•4mo ago
> No errors produced in the IDE because the place where it is updated is now just creating the variable instead so all the references still work. `let` is a good keyword.

I too would prefer let. But the number of times what you mentioned has bitten me in almost 20 years of programming Python can be counted on one hand. And these days, just use the "rename" feature in your IDE!

paulddraper•4mo ago
> Just for example, the lack of explicit variable defs

This is an extraordinarily common feature among scripting languages. In fact, JS is really the odd one out.

Shell, Ruby, Lua, Perl, R, AWK, etc

> I rename the variable but forget to change it where it is updated.

Type checkers will catch this. (And IDEs will help.)

You don't have to use type checking of course, though it sounds like you like to.

> It drives me crazy how everything just... blocks.

There are comparatively few languages that primarily rely on cooperative multi-threading.

> All the string formatting syntaxes are crazy.

  f"Hi my name is {name}"
---

FWIW, you just don't like most scripting languages.

Which is fine, but it goes far beyond Python.

Python, Perl, Ruby, PHP, Lua, Tcl
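For concreteness, the formatting styles being argued about, side by side (a neutral sketch, not from either commenter):

  name, n = "world", 3
  print("Hi %s, you have %d messages" % (name, n))        # printf-style, the oldest form
  print("Hi {}, you have {} messages".format(name, n))    # str.format, added in 2.6/3.0
  print(f"Hi {name}, you have {n} messages")              # f-strings, added in 3.6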

01HNNWZ0MV43FF•4mo ago
JavaScript is like that. C++ was kinda like that. I think "Popularity among novices" is the only thing that determines whether a language will succeed or fail in the long term (say, 20 years)
internetter•4mo ago
> JavaScript is like that.

JavaScript has gotten drastically better, especially since ES6. The lack of venvs alone is such a breath of fresh air. The baseline typing sucks more than python, but typescript is so much better. It still suffers from legacy but the world has mostly moved on and I don't see beginners learning legacy stuff much anymore. Modern javascript is amazing. I do my work in Elysia, Kysely (holy crap the python database situation is awful) and everything just works and catches tons of errors.

itsnowandnever•4mo ago
I treat it pretty much like bash. it's good for scripts. and it's great for serverless tasks like running as an AWS lambda. if you want to run some simple script that queries a DB and/or hits an API on a schedule or triggered by an event, Python is arguably the best way to do that because the handler and interpreter work so well together. even still, you'd get better performance with Node.
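A minimal sketch of the kind of scheduled Lambda handler being described; the URL and field names are illustrative only, not from the comment.

  import json
  import urllib.request

  def handler(event, context):
      # Triggered on a schedule (e.g. EventBridge); hit an API and return a small summary.
      with urllib.request.urlopen("https://api.example.com/health") as resp:
          payload = json.load(resp)
      return {"statusCode": 200, "body": json.dumps({"status": payload.get("status")})}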

I don't like Python for "applications" as much. I was at a place almost 10 years ago now that had Python "microservices" running in Kubernetes. managing performance and dependencies for Python in production is just unnecessarily futile compared to something like Go that's also very accessible syntactically.

BeetleB•4mo ago
> Students are taught it, a bunch of libraries were written for it, now everyone keeps using it.

This is revisionism.

When I was in school, that's what people would say about C/C++/Java.

People like me switched to Python well before it became common to teach it at school. Lots of great libraries were written in it before then as well. I mean, it was much easier to write a library in it than in most other languages.

It was picked for teaching in schools because it was a decent language, and was already widespread. It's much more useful to teach Python to an average engineering student than C/C++/Java.

It became popular because it was easy to learn, and a lot nicer to use than the other big scripting language: Perl. When I was in grad school, I would routinely find non-engineers and non-programmers using it for their work, as well as writing libraries for their peers. There is no way they would have done that in any of the prevailing languages of the time - many of them learned to program while in grad school. It became quite popular amongst linguists. Then NumPy/SciPy became a popular alternative to MATLAB, and once again my peers, who would never write C++ to do their work, were suddenly getting comparable speed by using Python. That's how it ended up being the language for ML.

So no - the fact that it's taught in schools is not the reason it's used today.

internetter•4mo ago
> It became popular because it was easy to learn, and a lot nicer to use than the other big scripting language ... I would routinely find non-engineers and non-programmers using it for their work, as well as writing libraries for their peers. ... many of them learned to program while in grad school.

Sure, and this is my argument. It is easy to start out with, which makes it appealing to academics without a CS background. But is this a good thing? Because then these academics without a CS background write monstrous, poorly optimized scripts compounded by a slow language, then use drastically more compute to run their science, and then at the end publish their paper and the scripts they used are very hard to adapt to further research because very rarely are they structured as modules.

Just because it is easy to write doesn't mean it is the most appropriate for science. It is not commonly acceptable to publish papers without a standardized format and other conventions. Students put in work to learn how to do it properly. The same should be true for code.

BeetleB•4mo ago
> But is this a good thing? Because then these academics without a CS background write monstrous, poorly optimized scripts compounded by a slow language, then use drastically more compute to run their science, and then at the end publish their paper and the scripts they used are very hard to adapt to further research because very rarely are they structured as modules.

For many of them, the alternative is that they simply wouldn't have done the science, and would have focused on research that didn't need computation. Or they'd use some expensive, proprietary tool.

Prior to Python becoming popular among them (circa 2005-2007), plenty of good, fast, whatever-metric-you-want languages existed. These academics were not using them.

Python is really what enabled the work to be done.

nickpsecurity•4mo ago
"This is revisionism."

Is it? I've never actually seen or heard exactly how Python became popular with survey data at many institutions. If it exists, I'd like to read it sometime.

Most languages got popular due to a specific project or product getting popular or a specific company marketing them.

I've heard it was UNIX/C, Browser/JavaScript, Sun/Java, Microsoft/many (esp C#), Google/Go, Apple/Objective-C, Apple/Swift, Ruby/Rails, and so on. I've heard variations of C++ being like this but also useful building on C ecosystem.

Some were really great at a specific thing they made easy. PHP might be like that. I don't have its history, though. Flash was definitely like that.

So, did Python have a killer app or library that caused it to take off industrially? Or did that not matter? You mentioned NumPy/SciPy in academia.

BeetleB•4mo ago
> Most languages got popular due to a specific project or product getting popular or a specific company marketing them.

This is a bold assertion, and IMO, false.

What was the killer product/company marketing that made Perl popular?

> So, did Python have a killer app or library that caused it to take off industrially? Or did that not matter? You mentioned NumPy/SciPy in academia.

It really didn't. It was already popular before, say, Django came around. Yes, NumPy/SciPy are kind of the reason ML is on Python, but the reality is people in academia chose NumPy/SciPy over MATLAB because it allowed them to use Python libraries in their code, which MATLAB could not. In other words, the academics chose to use NumPy/SciPy because they already were into Python.

For many, Python was just a much nicer alternative to Perl, and it was one of the main "batteries included" languages. That's why it became popular. Keep in mind that Python did not become popular quickly. It came out in 1990 and took about a decade to become somewhat popular.

The only other thing I can think of is this essay by Eric Raymond in 2000:

https://www.linuxjournal.com/article/3882

"My second came a couple of hours into the project, when I noticed (allowing for pauses needed to look up new features in Programming Python) I was generating working code nearly as fast as I could type. When I realized this, I was quite startled.

This was my first clue that, in Python, I was actually dealing with an exceptionally good design. Most languages have so much friction and awkwardness built into their design that you learn most of their feature set long before your misstep rate drops anywhere near zero. Python was the first general-purpose language I'd ever used that reversed this process.

...

I wrote a working, usable fetchmailconf, with GUI, in six working days, of which perhaps the equivalent of two days were spent learning Python itself. This reflects another useful property of the language: it is compact--you can hold its entire feature set (and at least a concept index of its libraries) in your head. C is a famously compact language. Perl is notoriously not; one of the things the notion “There's more than one way to do it!” costs Perl is the possibility of compactness.

...

To say I was astonished would have been positively wallowing in understatement. It's remarkable enough when implementations of simple techniques work exactly as expected the first time; but my first metaclass hack in a new language, six days from a cold standing start? Even if we stipulate that I am a fairly talented hacker, this is an amazing testament to Python's clarity and elegance of design.

There was simply no way I could have pulled off a coup like this in Perl, even with my vastly greater experience level in that language. It was at this point I realized I was probably leaving Perl behind.

...

So the real punchline of the story is this: weeks and months after writing fetchmailconf, I could still read the fetchmailconf code and grok what it was doing without serious mental effort. And the true reason I no longer write Perl for anything but tiny projects is that was never true when I was writing large masses of Perl code."

janalsncm•4mo ago
> Python itself is slow

Slow is relative. You need to account for the time to write as well, and amortize over the number of times the code is run. For code that is run once, writing time dominates. Writing a Java equivalent of half of the things you can do in Python would be a nightmare.

internetter•4mo ago
> For code that is run once, writing time dominates.

where "run once" in the sense you describe is really the case has been rare for me. Often these one off scripts need to process data, which involves a loop, and all of the sudden, even if the script is only executed once, it must loop a million times and all of the sudden it is actually really slow and then I must go back and either beat the program into shape (time consuming) or let it execute over days. A third option is rewriting in a different language, and when I choose to do a 1:1 rewrite the time is often comparable to optimizing python, and the code runs faster than even the optimized python would've. Plus, these "one off" scripts often do get rerun, e.g. if more data is acquired.

Java is a sort of selective example. I find JavaScript similarly fast to write and it runs much faster.

janalsncm•4mo ago
In practice a lot of heavy-lifting, performance-dependent code is Cython. For example NumPy and PyTorch. So Python will definitely be faster than JS if you use proper vectorized ops.
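A rough sketch of the vectorized-ops point, assuming NumPy is installed; the timings are whatever your machine gives, not figures from the thread.

  import time
  import numpy as np

  xs = np.random.rand(10_000_000)

  t0 = time.perf_counter()
  total_loop = 0.0
  for x in xs:                    # pure-Python loop over the array
      total_loop += x
  t1 = time.perf_counter()
  total_vec = xs.sum()            # vectorized: the loop runs in compiled code
  t2 = time.perf_counter()

  print(f"loop: {t1 - t0:.2f}s, vectorized: {t2 - t1:.4f}s")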

The vast majority of my Python code is for data exploration and preprocessing which are usually one-offs or need to be run only a couple of times. Or maybe it’s a nightly job that takes 5 minutes instead of 30 seconds in another language, but it doesn’t matter because it’s not user facing.

Actual Python execution time very rarely comes into play. If it does and it’s a problem, I will create a pyo3 rust binding.

internetter•4mo ago
I think our processing workloads are just different. If you're spending 90% of your compute time in NumPy or whatever, sure. But for me that wasn't the case; the overhead absolutely was the Python.
1vuio0pswjnm7•4mo ago
"Slow is relative."

Correct. Relative to the languages I prefer, Python is slow

Python encourages people to write long scripts and large projects comprised of many scripts

Are these "run once" programs

Python is one of the least energy efficient languages

There seems to be an enormous amount of effort spent on trying to make Python faster. To me, this implies it is slow or, at least, not fast enough, but others may see things differently. The language has a devoted following, that's for sure

For me, the startup time makes Python unusable

larrik•4mo ago
I'm surprised at the negative comments about Python here. Python has been my favorite language since I learned it, and nothing else has come close to it.

I'm currently on a pure JS project (Node and Vue) and I'm constantly fighting with things that would be trivial in Python or Django. Also getting into the .NET world and not impressed with that at all.

I know folks like Go, but in decades of being a software consultant I've come across zero companies using it or hiring for it.

nawgz•4mo ago
Care to give some examples of those trivial things you’re fighting?
larrik•4mo ago
Well Here's one:

In NodeJS the most popular ORM is Sequelize. Sequelize struggles with TypeScript, the code to use it is extremely verbose (vs Django), and the migrations are simplistic at best. There are other ORMs, but you usually gain TypeScript support at the expense of good migrations (which range from rough SQL-script-only support to literally nothing). Schema migrations are one thing, but if you want to do a data migration that uses business logic, in Django you can potentially just bring in your business code directly into a migration, and you can also explicitly set pre-requisites rather than having to order the filenames (which is what everything else seems to use).
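A minimal sketch of the kind of Django data migration being described, with hypothetical app, model, and migration names; the `dependencies` list is where the explicit prerequisites go.

  # myapp/migrations/0003_backfill_slugs.py (hypothetical)
  from django.db import migrations
  from django.utils.text import slugify   # ordinary helper/business code can be imported

  def backfill_slugs(apps, schema_editor):
      Article = apps.get_model("myapp", "Article")
      for article in Article.objects.filter(slug=""):
          article.slug = slugify(article.title)
          article.save(update_fields=["slug"])

  class Migration(migrations.Migration):
      dependencies = [("myapp", "0002_article_slug")]   # explicit prerequisite, not filename order
      operations = [migrations.RunPython(backfill_slugs, migrations.RunPython.noop)]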

Also in NodeJS if you miss an `await` in your controller, the entire server can crash if that call fails.

That's Node vs Django, though, which isn't completely JS vs Python, but it also really is.

Coming from Python, JS has constant surprises, where code works in one place and not another (in the same project!) due to reasons I don't always understand around ES version and configuration. Everything in JS feels like a hack job, built on a pile of someone else's hackjobs.

Likewise, if I want to know how something in Python works, I just look at the source. I rarely even look at official documentation because the code is right there. That's not a reasonable thing in JS, frankly.

But really the worst part is that I do a ton of "try and test" development, where debugging is hit or miss and console.log is everywhere. In Python, I can just write the code live in the shell/iPython, and then paste the working code back into my IDE. This ends up being a huge timesaver.

nawgz•3mo ago
Sorry for the late response. I appreciate the perspective you are offering.

I totally agree JS can be surprising, and actually I think the greatest skill a JS developer needs is to understand which things are powerful, reliable, and composable as opposed to hacky, because hacky things lead to surprising behavior.

For instance, I too have used Sequelize, and it is a painfully bad library. I can only see that network effects have led the community here; there is no merit. Instead, I think the scrutinizing JS developer - the one who can write reliable JS - needs to just throw Sequelize out. It sucks.

I happily did this, and so I looked elsewhere at what the community was trying in terms of data paradigms. An obviously interesting invention of the JS community at the time was GraphQL. After some time, I decided writing my own resolvers in JS was an exercise in futility, but I found it incredibly attractive to use the tool Hasura to give a free GraphQL API over a DB.

PostgreSQL, Hasura GraphQL Engine, GraphQL, graphql-codegen, and typescript combine to make this amazing chain where you can normalize your DB and rely on the foreign keys to make up the GraphQL relationships, get automatic knowledge in the GQL you author of whether the relationship is nullable or represents an object or array relationship, and then load deeply nested (i.e. semantically rich) queries to author a type-safe UI in. All of this requires 0 packages to be published, I just use a monorepo with `pnpm` and my client can talk to Hasura safely with the generated code from Hasura GraphQL Engine, or to my TS backend via tRPC and tsconfig.json "references".

Now when it comes to migrations, Hasura has first class support, it's really incredible how you can use `hasura console`, click in a GUI to make all your database changes, and have a migration written to source code ready for you to use after this.

So, take it this way: Sequelize sucks, TS can't polish a turd, and the job in JS is to discover something powerful and useful.

In Python, you would never have touched a garbage library like Sequelize because Django is amazing and the community recognized that.

And now, let me show you my personal bias

> Everything in JS feels like a hack job, built on a pile of someone else's hackjobs.

Nah, you have it exactly backwards. How are type hints not meaningful in Python in the year 2025? Sure, named args and some other things are useful, but Python is the king of the untyped dynamic happy-cast do whatever BS. The code is insanely terse but that's directly a bad thing in this day and age when that terseness is directly achieved at the cost of meaningful typing.

I for sure recognize this partially stems from the activities one performs in the language, but having to run your Python to know if it works is objectively hilarious. Well-crafted TypeScript and ESLint's recommended rules catch virtually all my programming errors, such that I would never run into this problem.

>if you miss an `await` in your controller, the entire server can crash if that call fails

My IDE calls that out! As it should! As Python refuses to!

rick1290•3mo ago
You mention Hasura - is that open source? My main concern with the above comments is that you're leaning on a product for migrations that isn't open source.
nawgz•3mo ago
Yes, it is open source, and Docker containers are freely accessible too

https://github.com/hasura/graphql-engine

https://hub.docker.com/r/hasura/graphql-engine

I run v2.x images myself; not sure what v3 and DDN are, besides monetization efforts for the company.

Also the migrations are just `up.sql` and `down.sql` files, there is nothing coupling you to a proprietary migration format, the value Hasura offers is generating them for you and tracking their application thru their CLI.

zahlman•4mo ago
There are negative comments about Python every time there's an opportunity presented. There are lots of positive comments too.

The wisecrack goes that Python is the second-best language for everything. I think this is clearly false: it is the best language for soliciting opinionated discussion on a forum.

mg•4mo ago
Because:

1: Simple is better than complex.

2: Beautiful is better than ugly.

3: Readability counts.

Winners across many markets seem to get the importance of simplicity. Compare the Google homepage to the Bing homepage. The dashboard of a Tesla to that of a Volkswagen. Remember how hardcore lean Facebook's UI was compared to MySpace?

maleldil•4mo ago
Tesla's dashboard is not simple. Having a touchscreen for everything instead of physical buttons is a travesty.
leephillips•4mo ago
The main reason I don’t like Python very much (besides performance) is that it makes what should be simple tasks complex due to the expression problem, arising from its use of class-based object orientation. You can avoid some of the issues in your own programs (but can’t escape insanity such as ",".join(["a", "b"])), but as soon as you delve into a library of any complexity to make alterations, you’re mired in a morass of methods encased in classes inheriting from other classes, and nothing is straightforward. Discovering Julia, which solves the expression problem¹, was enlightening. Even if it were as slow as Python, I would still prefer it: it is simpler, more beautiful, and more readable.

https://arstechnica.com/science/2020/10/the-unreasonable-eff...

zahlman•4mo ago
> the expression problem

I assume you're referring to https://en.wikipedia.org/wiki/Expression_problem . The Ars article rambles on for many paragraphs about a barely-sensible analogy that actually makes the concept harder to understand. But given that your apparent purpose is to proselytize for Julia I suppose it's adequate.

But really, you could have just said that Julia implements multiple dispatch. And people have been making do without in many different languages for decades.

> arising from its use of class-based object orientation

The language uses nothing of the sort. Unlike Java, you can write any number of globally-accessible classes in a source code file (including zero). Blame the authors.

Also, you don't even need classes to get polymorphic dispatch in Python. Check out `@functools.singledispatch` from the standard library.
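A small sketch of `functools.singledispatch` from the standard library, dispatching on the first argument's type without writing any classes:

  from functools import singledispatch

  @singledispatch
  def describe(value):
      return f"something else: {value!r}"

  @describe.register
  def _(value: int):
      return f"an int: {value}"

  @describe.register
  def _(value: list):
      return f"a list of {len(value)} items"

  print(describe(3))          # an int: 3
  print(describe([1, 2]))     # a list of 2 items
  print(describe("hi"))       # something else: 'hi'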

> (but can’t escape insanity such as ",".join(["a", "b"]))

This is perfectly sane. As has been rehashed countless times, the method is on the "joiner" string because that way it automatically works for every iterable without duplicating code.
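The upshot of putting the method on the joiner string: one implementation covers every iterable of strings.

  print(",".join(["a", "b"]))                 # a,b    -- list
  print(",".join(("x", "y", "z")))            # x,y,z  -- tuple
  print(",".join(str(n) for n in range(3)))   # 0,1,2  -- any iterable, via a generator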

leephillips•4mo ago
https://docs.python.org/3/tutorial/classes.html

Thanks for the entertaining comment.

zahlman•4mo ago
The fact that the tutorial included in the official documentation (which is quite comprehensive) covers classes (in chapter 9) hardly seems to diminish my point.

Python's object-orientation is based on objects. Hence "everything is an object" being taken seriously, reifying many things as objects (intrinsically, without a "reflection" library) that cannot be seen that way in many other languages (notably Java). For example, functions, modules, classes themselves, exception tracebacks, stack frames...

And hence the inability to describe instance members directly in the class body: because that's where members of the class (which is an object) are described. Including the methods of the class — which as written are just ordinary functions (which are also objects) that happen to be attributes of the class. The methods (which are also objects) are created on the fly when a function in the class is looked up via the instance. And those objects can be stored for later use.
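A tiny illustration of that point: the function stored on the class is an ordinary function object, and looking it up via an instance manufactures a method object that can itself be stored and called later.

  class Greeter:
      def hello(self, name):
          return f"hi {name}"

  print(type(Greeter.hello))   # <class 'function'> -- just a function attribute of the class object
  g = Greeter()
  bound = g.hello              # lookup via the instance creates a bound method object
  print(type(bound))           # <class 'method'>
  print(bound("world"))        # hi world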

This is what object orientation is supposed to be. Languages like Java are instead class-oriented. You are constantly forced to think about writing classes in order to use objects — you constantly use objects, but you pay for it in class syntax. (Meanwhile, there are non-object primitives.) In Python, you never have to write a class, yet you constantly use objects despite yourself (and "unboxed values" are never accessible from Python).

leephillips•4mo ago
I see what you mean. So instead of claiming that Python employs class-based object orientation, I should have said something like:

Much existing library code is needlessly rigid and complex, even when supplying simple functionality. This is encouraged by Python’s mechanism of method specialization, which uses single dispatch and class objects. An attempt to re-use methods from a library leads to the expression problem.

zahlman•4mo ago
Yes, the standard library has a lot of questionable design (some inspired by Java, some by C), and many programmers overuse classes. I'm a fan of (former core dev) Jack Diederich's talk, https://www.youtube.com/watch?v=o9pEzgHorH0 .
reddit_clone•4mo ago
>The dashboard of a Tesla to that of a Volkswagen

Dude!

You think a touch screen tablet replacing all the knobs and tactile buttons is actually a step forward?

lonelyasacloud•4mo ago
Perl 6
actionfromafar•4mo ago
It's easy to forget how big Perl was before the 10 Year Stall happened. (I.e. when development on Perl 5 was stalled in wait for Perl 6 which never happened. By the time Perl 6 was renamed to Raku, it was too late for Perl 5. It now lives on, but it lost a lot of momentum.)
zahlman•4mo ago
I first used Python for anything serious in around 2005. From that moment forward I couldn't fathom ever touching Perl again. This was without even considering what changes were planned for 6 at the time; I knew there was a plan but the base of Perl seemed fundamentally unfixable to me.
ryandrake•4mo ago
I never really took Python seriously, but lately I've been programming almost all of my personal projects in Python. The way I see it: For any kind of project I might take on, Python is never the best language to do it in, but is almost always the second-best language to do it in. This makes it a great default, especially if it's just a little one-off tool or experiment.
gopalv•4mo ago
> Python is never the best language to do it in, but is almost always the second-best language to do it in.

I've been writing Python since the last century, and this year is the first time I'm writing production-quality Python code; everything up to this point has been first-cut prototypes or utility scripts.

The real reason it has stuck with me while others came and went is the REPL-first attitude.

A question like

    >>> 0.2 + 0.1 > 0.3
    True

is much harder to demonstrate in other languages.

The REPL isn't just for the code you typed out, it does allow you to import and run your lib functions locally to verify a question you have.

It is not without its craziness with decorators, fancy inheritance[1] or operator precedence[2], but you don't have to use it if you don't want to.

[1] - __subclasshook__ is crazy, right?

[2] - you can abuse __ror__ like this https://notmysock.org/blog/hacks/pypes
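A minimal sketch of the kind of `__ror__` abuse the linked post plays with (not the post's actual code): `|` falls back to the right operand's `__ror__`, which lets you fake a shell-style pipe.

  class Pipe:
      def __init__(self, fn):
          self.fn = fn
      def __ror__(self, other):        # called for `other | self` when `other.__or__` gives up
          return self.fn(other)

  upper = Pipe(str.upper)
  exclaim = Pipe(lambda s: s + "!")
  print("hello" | upper | exclaim)     # HELLO!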

librasteve•4mo ago
lol, here it is in the https://raku.org repl…

  Welcome to Rakudo™ v2025.06.1.
  Implementing the Raku® Programming Language v6.d.
  Built on MoarVM version 2025.06.
  
  To exit type 'exit' or '^D'
  [0] > 0.1 + 0.2 > 0.3
  False
mbrameld•4mo ago
I might be wrong, but I was under the impression that result is platform-dependent?

https://docs.python.org/3/tutorial/floatingpoint.html#floati...

eesmith•4mo ago
Raku represents 0.1, 0.2, and 0.3 internally as rational numbers. https://docs.raku.org/type/Rat. 1/10 + 1/5 == 3/10

Note that "On overflow of the denominator during an arithmetic operation a Num (floating-point number) is returned instead." A Num is an IEEE 754 float64 ("On most platforms" says https://docs.raku.org/type/Num)

Python always uses IEEE 754 float64, also on most platforms. (I don't know of any Python implementation that does otherwise.) If you want rationals you need the fractions module.

  >>> from fractions import Fraction as F
  >>> F("0.1") + F("0.2") == F("0.3")
  True
  >>> 0.1 + 0.2 == 0.3
  False
This corresponds to Raku's FatRat, https://docs.raku.org/type/FatRat. ("unlike Rat, FatRat arithmetics do not fall back Num at some point, there is a risk that repeated arithmetic operations generate pathologically large numerators and denominators")
librasteve•4mo ago
good explanation

that said, decimals (eg 0.1) are in fact fractions, and the subtlety that 0.1 decimal cannot be precisely represented by a binary floating point number in the FPU is ignored by most languages where the core math is either integer or P754

bringing Rational numbers in as a first class citizen is a nice touch for mathematicians, scientists and so on

another way to look at it for Raku is that

  Int → integers (ℤ)
  Rat → rationals (ℚ)
  Num → reals (ℝ)
eesmith•4mo ago
"In fact" is a big strong, no?

"0.1" is what the language specification says it is, and I disagree with the view that it's ignored by most languages when it's often clearly and explicitly stated.

That most people don't know IEEE 754 floats, and do things like store currency as floats, is a different matter. (For that matter, currency should be stored as decimal, because account rules can be very particular about how rounding is carried out.)

Similarly, 3 * 4 + 5 may 'in fact' be 17 .. sometimes. But it's 27 with right-to-left precedence ... and 19683 in APL where * means power (3 to the power of 9). While 3 + 4 * 5 may be 35 or 23 (or 1027 in APL).

FWIW, FatRat is ℚ, not Rat. Rat switches to Num if the denominator is too high, as I quoted.

Bringing it back to Python, ABC (which influenced Python's development) used a ratio/fraction/FatRat natively, which handled the 0.1 + 0.2 == 0.3 issue, but ran into the 'pathologically large numerators and denominators' problem even for beginning students.

I see Rat as a way to get the best of both worlds, but I'm certain it has its own odd edge cases, like I suspect x + 1/13 - 1/13 might not be the original value if x + 1/13 caused a Rat to Num conversion.

librasteve•4mo ago
when I was a kid in junior school, I was taught that 0.1 means 1/10: the . acts something like a division sign, the digits to the right are the numerator, and the denominator is 10 raised to the power of the digit's position

true, in fact the syntax of Python consumes the literal '0.1' as a double [float64] ... so ok maybe I was a bit strong that my fact trumps the Python fact (but it still feels wrong to say that 0.1 + 0.2 > 0.3)

---

I welcome your correction on FatRat ... btw I have just upgraded https://raku.land/zef:librasteve/Physics::Unit to FatRat. FatRat is a very useful string to the bow and imo cool that it's a core numeric type.

See also https://raku.land/zef:librasteve/FatRatStr as my path to sidestep P754 literals.

---

We are on the same page that the Rat compromise (degrade to P754) is optimal.

---

As you probably know, but I repeat here for others, Raku has the notion of https://docs.raku.org/language/numerics#Numeric_infectiousne... which means that `x + 1/3` will return a Rat if x is an Int or a Num if x is a Num. All "table" operators - sin, cos, log and so on - are assumed to return irrationals (Num).

eesmith•4mo ago
You likely also learned in school that some calculators do left-right evaluation while other, more expensive ones, do PEMDAS. And a few do postfix instead. You might also have learned that most calculators didn't handle 1/3 as a fraction, in that 1 / 3 * 3 is 0.99999999.

Python is a fancy calculator.

To be clear, while in the mathematical sense, yes, sin, cos, and log generally return irrationals, in their IEEE 754 forms they return an exact value within 1 ulp or so of that irrational number. Num is a rational. ;)

  >>> x=5**0.5
  >>> x
  2.23606797749979
  >>> x.as_integer_ratio()
  (629397181890197, 281474976710656)
Scheme uses the phrase "numerical tower" for the same sort of implicit coercion.
eesmith•4mo ago
I just realized that in school you likely also learned that 1.0 and 1.000 are two different numbers for physical measurements as the latter implies a higher measurement precision.
Jtsummers•4mo ago
Raku uses a rational type by default for those literals, which will give an exact value. If you use Python's Fraction type it would be equivalent to your Raku. The equivalent in Raku to the Python above would be:

  1e-1 + 2e-1 > 3e-1
Which will evaluate to True.
librasteve•4mo ago
well, yes - but the funny thing is that the example chosen to illustrate the convenience of a REPL is - errr - factually wrong

a scientist knows that 0.1 + 0.2 is not greater than 0.3, only a computer geek would think that this is OK

emil-lp•4mo ago
You completely miss the point twice over.

The REPL example intends to show what the program does, not whether or not something is intuitive for you.

Second, using your same argumentation,

    >>> 010 + 006 == 14
    True
Is also wrong.

It's based on a misunderstanding of what representations of numbers in programming languages are.

In Python (and almost all other languages), 0.1 means the IEEE float closest to the decimal number 0.1, and arithmetic operations are performed according to the IEEE standard.
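A quick way to see exactly which double the literal denotes (the printed values are standard CPython behaviour):

  from decimal import Decimal

  print(Decimal(0.1))      # 0.1000000000000000055511151231257827021181583404541015625
  print((0.1).hex())       # 0x1.999999999999ap-4 -- the stored binary value
  print(0.1 + 0.2 == 0.3)  # False: each literal was rounded to the nearest double first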

librasteve•4mo ago
I am not “missing the point”, I am disagreeing with you. (Hopefully in an agreeable way)

I am making the point that using a decimal literal (eg 0.1) representation for an IEEE double is a bad choice and that using it as a representation for a Rat (aka Fraction) is a better choice.

I 100% accept your point that in Python 0.1+0.2>0.3 is true which is why I prefer Raku’s number system.

reddit_clone•4mo ago
This is the third Raku reference I came across since morning.

Happy to see Raku getting some press.

tialaramex•4mo ago
Given how slow Python is, isn't it embarrassing that 0.2 + 0.1 > 0.3 ?

I have some test Rust code where I add up about a hundred million 32-bit floating point numbers in the naive way, and it takes maybe a hundred milliseconds, and then I do the same but accumulating in a realistic::Real because hey how much slower is this type than a floating point number, well that's closer to twenty seconds.

But if I ask Python to do this, Python takes about twenty seconds anyway, and yet it's using floating point arithmetic so it gets the sum wrong, whereas realistic::Real doesn't because it's storing the exact values.

shagie•4mo ago
https://0.30000000000000004.com/#rust
tialaramex•4mo ago
Er, yes, I'm aware why this happened, my point is that this happens in the hardware floating point, but Python is as slow as the non-accelerated big rationals in my realistic::Real (it's actually markedly slower than the more appropriate realistic::Rational but that's not what my existing benchmarks cared about)
rbanffy•4mo ago
> this happens in the hardware floating point

Not really. It's a limitation of the IEEE floating point format used in most programming languages. Some numbers that look nice in base 10 don't have an exact representation in base 2.

shagie•4mo ago
Rational numbers whose denominator (in lowest terms) has only prime factors that also divide the base have an exact fractional representation in that base.

1/3 doesn't have an exact representation in base 10 or base 2. 1/5th does have an exact representation in base 10 (0.2), but doesn't in base 2. 1/4th has an exact representation in base 10 (0.25) and in base 2 (0.01)

rbanffy•4mo ago
If you have a hundred million integers to add, please, by all means, use Rust, or C, and intrinsics for the features not yet in your favorite math libraries. You can call Rust from Python, for instance. Polars is excellent, BTW. This seamless integration between a nice language amenable to interactive experimentation and highly optimized code produced in many other languages I don't want to write code in is what makes Python an excellent choice in many business cases.
Qem•4mo ago
At least with BigInts, CPython implementation is quite fast. See https://www.wilfred.me.uk/blog/2014/10/20/the-fastest-bigint...
zahlman•4mo ago
>But if I ask Python to [add up about a hundred million] [32-bit floating point numbers in the naive way], Python takes about twenty seconds anyway

Really?

  $ python -m timeit -s 'import random; x = [random.random() for _ in range(100000000)]' 'sum(x)'
  1 loop, best of 5: 806 msec per loop
This is on 11-year-old hardware. Even the generation isn't that slow:

  $ time python -c 'import random; x = [random.random() for _ in range(100000000)]'

  real    0m10.942s
  user    0m9.590s
  sys     0m1.346s
Of course, Python forces the use of doubles internally. Any optimizations inherent to 32-bit floats are simply not available.
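For what it's worth, 32-bit storage is reachable from Python via the array module (or NumPy), but values are widened back to doubles the moment you read them, so this is a sketch of storage only, not faster arithmetic:

  import array

  a = array.array('f', [0.1, 0.2, 0.3])   # stored as C floats (32-bit)
  print(a.itemsize)                        # 4
  print(a[0])                              # 0.10000000149011612 -- the float32 value, read back as a double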
eesmith•4mo ago
You shouldn't add hundred million 32-bit floating point numbers in the naive way. You should use Kahan or Neumaier summation. In Python these are available as math.fsum() and (in recent Python releases) the built-in sum function.

If you did

  total = 0.0
  for value in data:
    total += value
instead of

  total = sum(data)
then yes, the answer will take longer and be less accurate. But the naive native Rust equivalent will be less accurate than Python's sum(data).
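A small sketch of the accuracy gap being described; math.fsum tracks partial sums so its result is the correctly rounded sum of the stored inputs (exact output digits will vary with the data):

  import math

  data = [0.1] * 10_000_000

  naive = 0.0
  for value in data:       # naive left-to-right accumulation; rounding error piles up
      naive += value

  print(naive)             # drifts noticeably away from the compensated results
  print(sum(data))         # built-in sum; compensated in recent CPython releases
  print(math.fsum(data))   # correctly rounded sum of the stored float values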
tialaramex•4mo ago
> You shouldn't add hundred million 32-bit floating point numbers in the naive way.

That's entirely correct, you shouldn't do this. And yet people do for one reason and another. I'm aware of Kahan summation (and Neumaier's improvement), but it wasn't the point of the benchmarks I happened to be performing when this topic arrived.

You will not be surprised to learn there's a Kahan adaptor crate for Rust's iterator, so (with that crate) you can ask for the Kahan sum of some floating point iterator just the same way as you could ask for the naive sum. I suppose it's conceivable that one day Rust will choose to ship a specialisation in the stdlib which uses Kahan (as Python did in newer versions) but that seems unlikely because it is slower and you could explicitly ask for it already if you need it.

eesmith•4mo ago
Let me see if I have it right.

You don't like Python's use of IEEE 754 float64 for its "float" type because it's already so slow that you think Python should use a data type which better fits the expectations of primary school math training.

Then to demonstrate the timing issue you give an example where you ignore the simplest, fastest, and most accurate Python solution, using a built-in function which would likely be more accurate than what's available in stock Rust.

If accuracy is important for the first, why is it not important for the second?

Are you aware of the long history of compiled versions of Python (PyPy, numba, and more), plus variants like Cython, where the result has near Rust performance levels?

Were the core float a non-IEEE 754 type, those compiled versions would either be dog slow (to be compatible with Python's core float) or give results which are different than CPython's.

Even if they did not exist, there would be a lot of questions about why a given program in C, Rust, Pascal, or any other float64-based system, gives different answers when translated to Python.

FWIW, I, like others, could not reproduce your 20 seconds timing. Python is indeed slow for this sort of task, but even for explicit iteration my code is an order of magnitude faster than you reported.

tialaramex•4mo ago
I think that a language which (as I understand it) already gives you a big number if you exceed the bounds of its provided integer types might just as well provide rationals instead of the IEEE floating point types, unless those IEEE types are very fast, but to me it looks like on the whole they're not.

I was not aware of Cython (which sounds like a terrible idea, but each to his own) nor Numba, though I have worked with PyPy in the past. I'm afraid that the idea that somehow every Python implementation would be compatible caused me to choke. Python doesn't really care about compatibility; the behaviour of the sum function you brought up has changed twice since Python 3.

This is an old machine, so I can well believe you can do the iteration faster. My initial interest happened because, by total coincidence, I was writing benchmarks for realistic which try out f32 and f64 vs realistic::Real for various arithmetic operations, and so I wondered: well, isn't even Python much faster? And (with my naive iteration) it was not.

As you are hopefully aware, new programmers are equally likely to run into languages where the default is 32-bit IEEE floating point, and so 0.1 + 0.2 > 0.3 is false for them, as they are to encounter a language like Python with 64-bit IEEE floats. I'd expect, as with Python's experience with their hash tables, the kind of people writing Python as their main or even only language would always be pleased to have simpler, less surprising behaviour, and the rationals are much simpler - they're just slower.

eesmith•4mo ago
The int/long unification occurred a long time ago. ("long" was Python's name for BigNum.) There is now no such thing as "exceed the bounds of its provided integer types".

Guido van Rossum, who started and led the Python project for many years, previously worked with ABC, which used rationals as the default type. In practice this caused problems as it was all too easy to end up with "pathologically large numerators and denominators" (quoting https://docs.raku.org/type/FatRat). That experience guided him to reject rationals as the default numeric type.

Pathologically large numerators and denominators make rationals not "just slower" but "a lot slower".

> somehow every Python implementation would be compatible

It's more of a rough consensus thing than full compatibility.

> Python doesn't really care about compatibility

Correct, and it causes me pain every year. But do note that historic compatibility is different than cross-implementation compatibility, since there is a strong incentive for other implementations to do a good job of staying CPython compatible.

FWIW, the laptop where I did my timings is 5 years old.

The new programmers in my field generally have Python as their first language, and don't have experience with float32.

I also know that float32 isn't enough to distinguish rationals I need to deal with, since in float32, 2094/4097 == 2117/4142 == 0.511106, while in float64 those ratios are not equal, as 0.5111056870881132 != 0.5111057460164172.

(I internally use uint16_t for the ratios, but have to turn them into doubles for use by other tools.)

tialaramex•4mo ago
> FWIW, the laptop where I did my timings is 5 years old.

The PC I tried this on is about 10 years old; I bought this place in 2014 and the PC not long after that.

eesmith•4mo ago
My desktop CPU came out in 2017, so 8 years old.

sum() of a list with 100M floats took 0.65 seconds. The explicit loop took 1.5 seconds on CPython 3.13.

But again, yes, Rust performance runs rings around CPython, but that's not enough to justify switching to an alternative numeric type given the many negatives, and Python's performance isn't as dire as you suggest.

shagie•4mo ago

    shagie@MacM1 ~ % docker run -it openjdk:latest jshell
    Unable to find image 'openjdk:latest' locally
    latest: Pulling from library/openjdk
    ...
    Status: Downloaded newer image for openjdk:latest
    Oct 01, 2025 6:23:46 PM java.util.prefs.FileSystemPreferences$1 run
    INFO: Created user preferences directory.
    |  Welcome to JShell -- Version 18.0.2.1
    |  For an introduction type: /help intro
    
    jshell>  0.1 + 0.2 > 0.3
    $1 ==> true

    jshell> 
This has been around since JDK 9. https://docs.oracle.com/en/java/javase/17/jshell/introductio...

That said, changing how you think about programming... even with jshell I still think Java in classes and methods (and trying to pull in larger frameworks is not as trivial as java.lang packages). However, I think Groovy (and a good bit of Scala) in a script writing style.

jshell itself is likely more useful for teaching than for development - especially once you've got a sufficiently complex project and the IDE integration becomes more valuable than the immediate feedback.

Still, something to play with and one of the lesser known features of Java.

worik•4mo ago
  >>> 0.2 + 0.1 > 0.3
  True

That is false. What is it that "...is much harder to demonstrate in other languages"?

I am missing something

Jtsummers•4mo ago
> That is false.

What's false about it? That is the result if you're using IEEE floating point arithmetic.

tialaramex•4mo ago
It's the result if your programming language thinks 0.2 + 0.1 means you want specifically the 64-bit IEEE floating point binary arithmetic.

But, where did we say that's what we want? As we've seen, it's not the default in many languages and it isn't mandatory in Python; it's a choice, and the usual argument for that choice would be "it's fast" - except Python is slow, so what gives?

Jtsummers•4mo ago
You aren't answering my question. I asked worik why they're claiming that something that happens is false. It's insane to claim that reality isn't reality. Run that in a Python REPL and that is the result you get.
leephillips•4mo ago
Interesting observation about Python providing the worst of all possible worlds: unintuitive arithmetic without any of its speed advantages.

But in answer to "where did we say that's what we want?" I would say, as soon as we wrote the expression, because we read a book about how the language works before we tried to use it. After, for example, reading a book¹ about Julia, we know that 0.1 + 0.2 will give us something slightly larger than 0.3, and we also know that we can type 1//10 + 2//10 to get 3//10.

[1] https://lee-phillips.org/amazonJuliaBookRanks/

tialaramex•4mo ago
> I would say, as soon as we wrote the expression, because we read a book about how the language works

I'm comfortable with that rationale in proportion to how much I believe the programmer read such a book.

I haven't taken the course we teach say, Chemists, I should maybe go audit that, but I would not be surprised if either it never explains this, or the explanation is very hand-wavy, something about it not being exact, maybe invoking old fashioned digital calculators.

The amusing thing is when you try to explain this sort of thing with a calculator, and you try a modern calculator, it is much more effective at this than you expected or remember from a 1980s Casio. The calculator in your modern say, Android phone, knows all the stuff you were shown in school, it isn't doing IEEE floating point arithmetic because that's only faster and you're a human using a calculator so "faster" in computer terms isn't important and it has prioritized being correct instead so that pedants stop filing bugs.

zahlman•4mo ago
> unintuitive arithmetic without any of its speed advantages.

Using floating-point in Python is still much faster than using an exact type in Python.

  $ # At this scale, we need to be aware of and account for the timing overhead
  $ python -m timeit 'pass'
  50000000 loops, best of 5: 8.18 nsec per loop

  $ # The variable assignment defeats constant folding in the very primitive optimizer
  $ python -m timeit --setup 'x = 0.1; y = 0.2' 'x + y'
  10000000 loops, best of 5: 21.2 nsec per loop

  $ python -m timeit --setup 'from decimal import Decimal as d; x = d("0.1"); y = d("0.2")' 'x + y'
  5000000 loops, best of 5: 62.9 nsec per loop

  $ python -m timeit --setup 'from fractions import Fraction as f; x = f(1, 10); y = f(2, 10)' 'x + y'
  500000 loops, best of 5: 755 nsec per loop
worik•4mo ago
> What's false about it?

Mathematics. IEEE floating point arithmetic gets it wrong!

Another example of why we should use integer algorithms wherever possible.

lloda•4mo ago
Just about anything that has a repl has a better repl than Python.
hirvi74•4mo ago
>>> 0.2 + 0.1 > 0.3

Swift will also return true unless you specify the type. Though, I suppose that is the key difference -- proper typing.

ASalazarMX•4mo ago
It's a beautiful and expressive language, and I use it a lot. My only gripe is how prone it is to bit rot, too sensitive to version changes in libraries or runtimes. It needs frequent babysitting if you want to keep everything updated.

Even if it's less secure, the beauty of Go, a single, static binary impervious to version changes, is very appealing for small projects you don't want to keep returning to for housekeeping.

tialaramex•4mo ago
I have written a not inconsiderable amount of Python. But the problem I found was that I spent too much time debugging runtime issues and I do not enjoy that. I found that while it took longer to write correct Rust to solve problems I wanted solving, I enjoyed that so it was a better deal.
kwar13•4mo ago
That's a pretty good way to frame it. The speed of developing to me justifies the "second best" aspect.
callc•4mo ago
Regardless of the reasons why, the fact is python works well enough.

There are many things I wish python, or a language like python, could improve on. Yet despite all my wishes, and choosing Rust/Go more often recently, Python still works.

I’m in a love/hate relationship with python. I think it’s becoming more of acceptance now haha.

vpShane•4mo ago
Agreed. But Python is tried and true and battle tested. I like hobby coding and tried WASM Python for the first time the other day; it needed JS to load the WASM on the front end. Front-end Python Socket.IO to a Python backend? Forget about it. That was my first attempt, and I'm sure there are ways of getting things to work. I've moved to straight Python with Flask, with the web front end being Vue. I tried Python's GTK app development and could only get my test app to look slightly better than a 1990s website; BUT, it isn't Electron.

For sys-level things, even TUIs and parsing data, it's a joy.

Edit: QT not GTK

j45•4mo ago
Something like uv has been long, long overdue.
YouWhy•4mo ago
I think Python's centrality is a consequence of its original purpose as a language intended for instruction.

Yeah, some of its design decisions took immense cost and time to overcome before it made for viable production solutions. But as it turns out, however suboptimal it is as a language, that is largely made up for by the presence of a huge workforce that's decently qualified to wield it.

eesmith•4mo ago
Python's original purpose was as a scripting language for Amoeba. Yes, it was strongly influenced by ABC, an introductory programming language which van Rossum helped implement, but that was a different language.

https://docs.python.org/3/faq/general.html#why-was-python-cr...

""I was working in the Amoeba distributed operating system group at CWI. We needed a better way to do system administration than by writing either C programs or Bourne shell scripts, since Amoeba had its own system call interface which wasn't easily accessible from the Bourne shell. My experience with error handling in Amoeba made me acutely aware of the importance of exceptions as a programming language feature.

It occurred to me that a scripting language with a syntax like ABC but with access to the Amoeba system calls would fill the need."""

jmyeet•4mo ago
Python’s popularity seems to me driven by ML and data science.

Personally, I can’t take seriously any language without a good type system and, no, optional type hints don’t count. Such a type system should express nullability and collection parameterization (i.e. generics).

I simply won’t write a lot of code in any language where a typo or misspelling is a runtime error.
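
(For context, a minimal sketch of what the optional hints look like and why the commenter discounts them: CPython does not enforce annotations, so a misspelling still surfaces only when the line runs, unless an external checker such as mypy or pyright is used. The find_user function and its lookup table are made up for illustration.)

  from typing import Optional

  def find_user(user_id: int) -> Optional[str]:
      users: dict[int, str] = {1: "ada"}   # hypothetical lookup table
      return users.get(user_id)            # may be None: nullability is in the hint

  # The hints can express nullability and parameterized collections, but the
  # interpreter ignores them; a typo is still a runtime error.
  try:
      print(find_user(1).uper())           # misspelling of .upper()
  except AttributeError as exc:
      print("caught only at runtime:", exc)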

ASalazarMX•4mo ago
I think you got it backwards: Python was used in ML and data science because it was already one of the preferred languages. A single line can do the work of whole functions in other languages, while remaining clear.

Also, the tooling is abundant and of great quality, which made it the logical choice.

vizzier•4mo ago
I've been working with Python professionally for just over a year now, and coming from a .NET background the main difference seems to be that tests are far easier to write and more prevalent in our codebase. I'm sure some of that is cultural, but "runtime" errors are quickly found by any reasonable test suite.

That said, it always saddens me that the ML family (as in OCaml and F#) doesn't get more love. Those languages can hit all the same bases as Python: readable, multi-paradigm, module-first programming. They just never caught on the same way.

fghorow•4mo ago
For me, a python user since the late '90s, the answer has always been simple:

Guido has taste.

rlpb•4mo ago
He stepped back and now we have the walrus operator.

At least we don't have to use it.

0cf8612b2e1e•4mo ago
Guido approved the walrus. It was the negative response which he said led to him quitting.
reddit_clone•4mo ago
Casual python user here. I wasn't aware of this controversy.

Why was there a backlash against this operator? (It looks kinda neat.) Was it breaking things?

0cf8612b2e1e•4mo ago
I am not a keyboard warrior who got caught up in the nonsense, but I think some people were simply annoyed at adding syntactic sugar for very marginal benefit. “There should be one way to do things” mantra.

I have a long list of grievances with Python, but the walrus situation would never crack my top ten. Put effort into removing cruft from the standard library, make typing better, have the PSF take a stance on packaging. Almost anything else feels like a better use of time.

Whatever, it won. I will never use it, but when I see it I will have to scratch my head and look up the syntax rules.

zahlman•4mo ago
It was against many people's aesthetic sense. Including mine. But in theory it can be ignored completely, and in practice it is barely ever used (and indeed nobody forces you to add more uses).

You may be interested in https://learning-python.com/python-changes-2014-plus.html for a sense of what some old-timers' aesthetic sense is like. (I agree with many of these complaints and disagree with many others.)
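
(For readers unfamiliar with the syntax under discussion, a minimal sketch of the assignment expression, the "walrus" operator added in Python 3.8:)

  import re

  line = "error: disk full"
  # := binds a name as part of an expression; without it you would either call
  # re.search twice or split this across two statements.
  if match := re.search(r"error: (.+)", line):
      print(match.group(1))   # "disk full"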

librasteve•4mo ago
Larry Wall thought deeply about how human languages work, not just what a programming language should do mechanically.
nickpsecurity•4mo ago
Maybe. He was on the BLACKER project, a high-assurance, secure VPN. It had strong requirements for configuration management. Larry Wall was a smart but lazy programmer who tired of tedious administration. So he wrote Perl to automate that.

Maybe he did some kind of deep language design. In that account it just sounded more like he threw together whatever solved his problem, with some nice ideas baked in. Again, that's if it's true that he invented it to automate tedium during BLACKER.

BLACKER is described here: https://ieeexplore.ieee.org/document/213253

For public examples of A1, look up SCOMP, GEMSOS, and the VAX Security Kernel (VMM). Those papers describe the assurance activities required for A1 certification. At the time, due to the bootstrapping requirement, tools like configuration management didn't have to be A1. People used all kinds of stuff, like Wall building Perl.

janalsncm•4mo ago
Maybe one factor is its versatility in leetcode, which to a first approximation every SWE has to do.

Declare sets, lists, and maps in one line. You don’t need to worry about overflow or underflow. Slicing is extremely convenient and expressive. Types aren’t necessary, but that’s rarely confusing in short code blocks.

Compare to JS, where you’re immediately dealing with var/let/const decisions that use up unnecessary mental energy.
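
(A minimal sketch of the conveniences being described: one-line collection literals, arbitrary-precision integers, and slicing.)

  # One-line declarations of the usual interview data structures.
  nums = [3, 1, 4, 1, 5, 9]
  uniq = set(nums)
  counts = {"a": 1, "b": 2}
  print(uniq, counts)         # the five unique values, plus the dict

  # Integers never overflow; they simply grow.
  print(2 ** 100)             # 1267650600228229401496703205376

  # Slicing: last three elements, and a reversed copy.
  print(nums[-3:])            # [1, 5, 9]
  print(nums[::-1])           # [9, 5, 1, 4, 1, 3]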

jinushaun•4mo ago
Python is popular in education too. I guess people like to continue with the language that they did their homework with.

Combine that with its prevalence in coding interviews, because “Python is basically pseudocode”, and it makes sense why it’s popular.

drnick1•4mo ago
My use case is scientific computing and for that Python is excellent thanks to Numpy, IPython and Numba. It's often possible to write code that is nearly as fast as C, but it's far easier to write and debug, since you can just inspect and run code snippets in the shell. In that regard, it's similar to MATLAB, but it's FOSS and faster with the right libraries.
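
(A minimal sketch of the Numba pattern the comment alludes to: JIT-compiling a plain Python loop over a NumPy array. Actual speedups depend on the workload and machine.)

  import numpy as np
  from numba import njit

  @njit
  def total(xs):
      # A plain Python loop, compiled to machine code on first call.
      s = 0.0
      for x in xs:
          s += x
      return s

  data = np.random.rand(1_000_000)
  print(total(data))   # roughly C-like speed after the initial compile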
inglor_cz•4mo ago
I have written code in Pascal, C, C++, Java, TypeScript, PHP and Python in my life.

Of this entire pack, Python seems to have the widest ecosystem of libraries. I don't think I ever ran into a "have to reinvent the wheel" problem.

Need to run four test operations in parallel? asyncio.gather(). Need to run something every five minutes? schedule.every(). In most cases, it is a one-liner, or at most two-liner, no sophisticated setup necessary, and your actual business logic isn't diluted by tons of technical code.
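
(A minimal sketch of the two one-liners mentioned; note that schedule is a third-party package, and op/job are made-up stand-ins for real work.)

  import asyncio
  import schedule  # third-party: pip install schedule

  async def op(n):                       # stand-in for a real test operation
      await asyncio.sleep(0.1)
      return n

  async def main():
      # Four operations run concurrently, gathered in one line.
      return await asyncio.gather(op(1), op(2), op(3), op(4))

  print(asyncio.run(main()))             # [1, 2, 3, 4]

  def job():                             # stand-in for the every-five-minutes task
      print("tick")

  schedule.every(5).minutes.do(job)      # the one-liner; a small loop then calls
                                         # schedule.run_pending() periodically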

Performance-critical parts can always be programmed in C and invoked from the Python code proper.

elzbardico•4mo ago
Python is the new Pascal.

It is probably the first language for 99% of the computer science students who didn't know any programming before college. And even for those who knew programming, chances are that a lot of them have at least dabbled a little with it.

zeitlupe•4mo ago
From time to time I peek over from VS Code at PyCharm, and recently I have been surprised by how much it seems to have fallen behind. No (official) ruff integration: you have to use 'External Tools', which is not part of the backup & sync feature. Seriously?
nickpsecurity•4mo ago
I've been programming only in Python for a while now. I got a certificate that required digging into the language enough to do the code in your head. I code projects in VS Code. I've enjoyed it, especially the library availability.

I do warn people that it's not as easy or intuitive as advertised. I've often been bitten by unexpected errors. I think a survey of these might be worthwhile.

One was typing or conversion errors. Some conversions, like int-to-str in a string concatenation, seem pretty obvious. That isnumeric() doesn't consider negative numbers to be numbers was a surprise.

Sometimes it's consistency. I've often alternated between lists and sets in applications: I prefer to keep most data as a list but use sets for uniqueness checks or redundancy filtering. Despite both being collections, one uses .append() and the other uses .add(). Little differences like that not only add confusion; I have to modify my codebase if I mix or switch them, which can become a type error later in another spot.
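
(A quick illustration of the two surprises mentioned above, the negative-number check and the list/set method mismatch:)

  # str.isnumeric() only looks at numeric characters, so a sign disqualifies it.
  print("123".isnumeric())    # True
  print("-123".isnumeric())   # False

  # Lists and sets are both collections, but grow with different method names.
  items = [1, 2]
  items.append(3)             # lists use .append()
  unique = {1, 2}
  unique.add(3)               # sets use .add(); .append() would raise AttributeError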

Another was that common operations usually found in one place were split across two. That happened with time vs. datetime, and with filesystem operations, which might take two modules. I've considered making a wrapper that turns all those into one thing with the internal redundancy removed. There might be a reason others haven't done that, though.

Another issue was distribution. I can do straightforward building of console apps for two platforms, and that's reliable. If reliability was the worry, Python apps seemed easier to deliver as a Flask site than to distribute as standalone utilities. Nuitka was really impressive, though, in terms of the work that must have gone into it.

In my case, I also missed the ability to easily make linked lists, as in C, to build trees. I wanted to build a C-like tree in Python, but it couldn't do self-referential structures IIRC. Since that app's requirements were C-like, and performance was no issue, I actually simulated a C-like heap in Python, ported a C tree to it, and built the tool on that. I also got a use-after-free in Python, of all things, lol. Anyway, I was surprised there was a data structure C could do but a high-level, GC'd, reflective language couldn't. There might be a trick for this others know about, but I figure they just wrap C code for those things.

On the positive side, the debugging experience with Python was so much better than with some beginner languages. I'm grateful for the work people put into that. I also love that there are easy-to-use accelerators, from JITs to the C++ generator.

I was wanting an acceleratable subset with metaprogramming when Mojo appeared. I might try it for toy problems. I gotta try to stay in idiomatic-ish Python for now, though, since it's for career use.

Zee2•4mo ago
This is almost certainly LLM generated.

Six flowery “from-to”s in one article:

>from beginners to seasoned professionals

>from seasoned backend engineers to first-time data analysts

>from GPU acceleration and distributed training to model export

>from data preprocessing with pandas and NumPy to model serving via FastAPI

>from individual learners to enterprise teams

>from frontend interfaces to backend logic

And more annoyingly, at least four “not just X, but Y”.

>it doesn’t just serve as a first step; it continues adding value

>that clarity isn’t just beginner-friendly; it also lowers maintenance costs

>the community isn’t just helpful, it’s fast-moving and inclusive

>this network doesn’t just solve problems; it also shapes the language’s evolution

And I won’t mention the em-dashes out of respect to the human em-dash-users…

This stuff is so tiring to read.

scoofy•4mo ago
My background is in philosophy and formal logic. The idea that python isn’t the #1 language used is insane to me.

Yea, I’m sure there are a lot of technical reasons to use other languages, but with Python, you can just read it. I remember buying “Learn Python the Hard Way” about 15 years ago and just looking through the book thinking… wait, I can already read this.

Natural language parallels are huge.

zahlman•4mo ago
It speaks to Python's readability that people can successfully learn it in spite of LPTHW's bizarrely out-of-order curriculum, belabouring of simple points while glossing over more difficult ones, etc.
ivan_ah•4mo ago
If anyone is looking for an introduction to Python for absolute beginners, you can check out the Python tutorial I just prepared: https://nobsstats.com/tutorials/python_tutorial.html

It includes a section on "punctuation" symbols, which I haven't seen explained concisely anywhere else and might be helpful, even for non-beginners: https://nobsstats.com/tutorials/python_tutorial.html#python-...