But it's syntactically weak? Python itself is slow? pip is awful (although uv is pretty good). Sometimes I am forced to write Python because the packages I need are written for it, but I always hate it.
AFAIK, it's the preferred language for lots of leetcode, e.g. Advent of Code. The practical expressivity and speed of dev is strong.
It drives me crazy how everything just... blocks. Parallelization is a disaster.
I won't comment much on indentation. All the string formatting syntaxes are crazy. Private names managed by `_`. The type system still mostly sucks. Etc., etc.
In my experience it is alright for writing short scripts, but complexity very quickly balloons and Python projects are very often unwieldy.
I would write a script in python, but only if I was positive I would never need to run or edit it again. Which is very rarely the case.
I too would prefer let. But the number of times what you mentioned has bitten me in almost 20 years of programming Python can be counted on one hand. And these days, just use the "rename" feature in your IDE!
This is an extraordinarily common feature among scripting languages. In fact, JS is really the odd duck out.
Shell, Ruby, Lua, Perl, R, AWK, etc
> I rename the variable but forget to change it where it is updated.
Type checkers will catch this. (And IDEs will help.)
You don't have to use type checking of course, though it sounds like you like to.
> It drives me crazy how everything just... blocks.
There are comparatively few languages that primarily rely on cooperative multi-threading.
> All the string formatting syntaxes is crazy.
f"Hi my name is {name}"
---

FWIW, you just don't like most scripting languages.
Which is fine, but it goes far beyond Python.
Python, Perl, Ruby, PHP, Lua, Tcl
JavaScript has gotten drastically better, especially since ES6. The lack of venvs alone is such a breath of fresh air. The baseline typing sucks more than python, but typescript is so much better. It still suffers from legacy but the world has mostly moved on and I don't see beginners learning legacy stuff much anymore. Modern javascript is amazing. I do my work in Elysia, Kysely (holy crap the python database situation is awful) and everything just works and catches tons of errors.
I don't like Python as much for "applications". I was at a place almost 10 years ago now that had Python "microservices" running in Kubernetes. Managing performance and dependencies for Python in production is just unnecessarily futile compared to something like Go, which is also very accessible syntactically.
This is revisionism.
When I was in school, that's what people would say about C/C++/Java.
People like me switched to Python well before it became common to teach it at school. Lots of great libraries were written in it before then as well. I mean, it's much easier to write a library in it than it was in most other languages.
It was picked for teaching in schools because it was a decent language, and was already widespread. It's much more useful to teach Python to an average engineering student than C/C++/Java.
It became popular because it was easy to learn, and a lot nicer to use than the other big scripting language: Perl. When I was in grad school, I would routinely find non-engineers and non-programmers using it for their work, as well as writing libraries for their peers. There is no way they would have done that in any of the prevailing languages of the time - many of them learned to program while in grad school. It became quite popular amongst linguists. Then NumPy/SciPy became a popular alternative to MATLAB, and once again my peers, who would never write C++ to do their work, were suddenly getting matching speed by using Python. That's how it ended up being the language for ML.
So no - the fact that it's taught in schools is not the reason it's used today.
Sure, and this is my argument. It is easy to start out with, which makes it appealing to academics without a CS background. But is this a good thing? Because then these academics without a CS background write monstrous, poorly optimized scripts compounded by a slow language, then use drastically more compute to run their science, and then at the end publish their paper and the scripts they used are very hard to adapt to further research because very rarely are they structured as modules.
Just because it is easy to write doesn't mean it is the most appropriate for science. It is not commonly acceptable to publish papers without a standardized format and other conventions. Students put in work to learn how to do it properly. The same should be true for code.
For many of them, the alternative is that they simply wouldn't have done the science, and would have focused on research that didn't need computation. Or they'd use some expensive, proprietary tool.
Prior to Python becoming popular among them (circa 2005-2007), plenty of good, fast, whatever-metric-you-want languages existed. These academics were not using them.
Python is really what enabled the work to be done.
Is it? I've never actually seen survey data from many institutions on exactly how Python became popular. If it exists, I'd like to read it sometime.
Most languages got popular due to a specific project or product getting popular or a specific company marketing them.
I've heard it was UNIX/C, Browser/JavaScript, Sun/Java, Microsoft/many (esp C#), Google/Go, Apple/Objective-C, Apple/Swift, Ruby/Rails, and so on. I've heard variations of C++ being like this but also useful building on C ecosystem.
Some were really great at a specific thing they made easy. PHP might be like that. I don't have its history, though. Flash was definitely like that.
So, did Python have a killer app or library that caused it to take off industrially? Or did that not matter? You mentioned NumPy/SciPy in academia.
This is a bold assertion, and IMO, false.
What was the killer product/company marketing that made Perl popular?
> So, did Python have a killer app or library that caused it to take off industrially? Or did that not matter? You mentioned NumPy/SciPy in academia.
It really didn't. It was already popular before, say, Django came around. Yes, NumPy/SciPy are kind of the reason ML is on Python, but the reality is people in academia chose NumPy/SciPy over MATLAB because it allowed them to use Python libraries in their code, which MATLAB could not. In other words, the academics chose to use NumPy/SciPy because they already were into Python.
For many, Python was just a much nicer alternative to Perl, and it was one of the main "batteries included" languages. That's why it became popular. Keep in mind that Python did not become popular quickly. It came out in 1990 and took about a decade to become somewhat popular.
The only other thing I can think of is this essay by Eric Raymond in 2000:
https://www.linuxjournal.com/article/3882
"My second came a couple of hours into the project, when I noticed (allowing for pauses needed to look up new features in Programming Python) I was generating working code nearly as fast as I could type. When I realized this, I was quite startled.
This was my first clue that, in Python, I was actually dealing with an exceptionally good design. Most languages have so much friction and awkwardness built into their design that you learn most of their feature set long before your misstep rate drops anywhere near zero. Python was the first general-purpose language I'd ever used that reversed this process.
...
I wrote a working, usable fetchmailconf, with GUI, in six working days, of which perhaps the equivalent of two days were spent learning Python itself. This reflects another useful property of the language: it is compact--you can hold its entire feature set (and at least a concept index of its libraries) in your head. C is a famously compact language. Perl is notoriously not; one of the things the notion “There's more than one way to do it!” costs Perl is the possibility of compactness.
...
To say I was astonished would have been positively wallowing in understatement. It's remarkable enough when implementations of simple techniques work exactly as expected the first time; but my first metaclass hack in a new language, six days from a cold standing start? Even if we stipulate that I am a fairly talented hacker, this is an amazing testament to Python's clarity and elegance of design.
There was simply no way I could have pulled off a coup like this in Perl, even with my vastly greater experience level in that language. It was at this point I realized I was probably leaving Perl behind.
...
So the real punchline of the story is this: weeks and months after writing fetchmailconf, I could still read the fetchmailconf code and grok what it was doing without serious mental effort. And the true reason I no longer write Perl for anything but tiny projects is that was never true when I was writing large masses of Perl code."
Slow is relative. You need to account for the time to write as well, and amortize over the number of times the code is run. For code that is run once, writing time dominates. Writing a Java equivalent of half of the things you can do in Python would be a nightmare.
where "run once" in the sense you describe is really the case has been rare for me. Often these one off scripts need to process data, which involves a loop, and all of the sudden, even if the script is only executed once, it must loop a million times and all of the sudden it is actually really slow and then I must go back and either beat the program into shape (time consuming) or let it execute over days. A third option is rewriting in a different language, and when I choose to do a 1:1 rewrite the time is often comparable to optimizing python, and the code runs faster than even the optimized python would've. Plus, these "one off" scripts often do get rerun, e.g. if more data is acquired.
Java is a sort of selective example. I find JavaScript similarly fast to write and it runs much faster.
The vast majority of my Python code is for data exploration and preprocessing which are usually one-offs or need to be run only a couple of times. Or maybe it’s a nightly job that takes 5 minutes instead of 30 seconds in another language, but it doesn’t matter because it’s not user facing.
Actual Python execution time very rarely comes into play. If it does and it’s a problem, I will create a pyo3 rust binding.
Correct. Relative to the languages I prefer, Python is slow.
Python encourages people to write long scripts and large projects comprised of many scripts.
Are these "run once" programs?
Python is one of the least energy-efficient languages.
There seems to be an enormous amount of effort spent on trying to make Python faster. To me, this implies it is slow or, at least, not fast enough, but others may see things differently. The language has a devoted following, that's for sure.
For me, the startup time makes Python unusable.
I'm currently on pure JS project (node and vue) and I'm constantly fighting with things that would be trivial in python or django. Also getting into .NET world and not impressed with that at all.
I know folks like Go, but in decades of being a software consultant I've come across zero companies using it or hiring for it.
In NodeJS the most popular ORM is Sequelize. Sequelize struggles with TypeScript, the code to use is extremely verbose (vs Django), and the migrations are simplistic at best. There are other ORMs, but you usually gain TypeScript support at the expense of good migrations (which range from rough SQL-script-only support to literally nothing). Schema migrations are one thing, but if you want to do a data migration that uses business logic, in Django you can potentially just bring in your business code directly into a migration, and you can also explicitly set pre-requisites rather than having to order the filenames (which is what everything else seems to use).
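As a rough sketch of that last point (the app label, model, field, and related names here are hypothetical), a Django data migration that runs ordinary Python and declares its prerequisite explicitly might look something like:

    from django.db import migrations

    def backfill_totals(apps, schema_editor):
        # Use the historical version of the model, as Django recommends inside migrations
        Order = apps.get_model("shop", "Order")
        for order in Order.objects.all():
            order.total = sum(item.price for item in order.items.all())
            order.save(update_fields=["total"])

    class Migration(migrations.Migration):
        dependencies = [
            ("shop", "0007_add_total_field"),  # explicit prerequisite, not filename ordering
        ]
        operations = [
            migrations.RunPython(backfill_totals, migrations.RunPython.noop),
        ]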
Also in NodeJS if you miss an `await` in your controller, the entire server can crash if that call fails.
That's Node vs Django, though, which isn't completely JS vs Python, but it also really is.
Coming from Python, JS has constant surprises, where code works in one place and not another (in the same project!) due to reasons I don't always understand around ES version and configuration. Everything in JS feels like a hack job, built on a pile of someone else's hackjobs.
Likewise, if I want to know how something in Python works, I just look at the source. I rarely even look at official documentation because the code is right there. That's not a reasonable thing in JS, frankly.
But really the worst part is that I do a ton of "try and test" development, where debugging is hit or miss and console.log is everywhere. In Python, I can just write the code live in the shell/IPython, and then paste the working code back into my IDE. This ends up being a huge timesaver.
I totally agree JS can be surprising, and actually I think the greatest skill a JS developer needs is to understand what things are powerful, reliable, and composable as opposed to hacky, because hacky things lead to surprising behavior.
For instance, I too have used Sequelize, and it is a painfully bad library. I can only see that network effects have led the community here; there is no merit. Instead, I think the scrutinizing JS developer - the one who can write reliable JS - needs to just throw Sequelize out. It sucks.
I happily did this, and so I looked elsewhere at what the community was trying in terms of data paradigms. An obviously interesting invention of the JS community at the time was GraphQL. After some time, I decided writing my own resolvers in JS was an exercise in futility, but I found it incredibly attractive to use the tool Hasura to give a free GraphQL API over a DB.
PostgreSQL, Hasura GraphQL Engine, GraphQL, graphql-codegen, and typescript combine to make this amazing chain where you can normalize your DB and rely on the foreign keys to make up the GraphQL relationships, get automatic knowledge in the GQL you author of whether the relationship is nullable or represents an object or array relationship, and then load deeply nested (i.e. semantically rich) queries to author a type-safe UI in. All of this requires 0 packages to be published, I just use a monorepo with `pnpm` and my client can talk to Hasura safely with the generated code from Hasura GraphQL Engine, or to my TS backend via tRPC and tsconfig.json "references".
Now when it comes to migrations, Hasura has first class support, it's really incredible how you can use `hasura console`, click in a GUI to make all your database changes, and have a migration written to source code ready for you to use after this.
So, take it this way: Sequelize sucks, TS can't polish a turd, and the job in JS is to discover something powerful and useful.
In Python, you would never have touched a garbage library like Sequelize because Django is amazing and the community recognized that.
And now, let me show you my personal bias:
> Everything in JS feels like a hack job, built on a pile of someone else's hackjobs.
Nah, you have it exactly backwards. How are type hints not meaningful in Python in the year 2025? Sure, named args and some other things are useful, but Python is the king of the untyped dynamic happy-cast do whatever BS. The code is insanely terse but that's directly a bad thing in this day and age when that terseness is directly achieved at the cost of meaningful typing.
I for sure recognize this partially stems from the activities one performs in the language, but having to run your Python to know if it works is objectively hilarious. Well-crafted TypeScript with ESLint's recommended rules catches virtually all my programming errors, such that I would never run into this problem.
>if you miss an `await` in your controller, the entire server can crash if that call fails
My IDE calls that out! As it should! As Python refuses to!
https://github.com/hasura/graphql-engine
https://hub.docker.com/r/hasura/graphql-engine
I run v2.x images myself, not sure what v3 and DDN are, besides monetization efforts for the company*.
Also the migrations are just `up.sql` and `down.sql` files, there is nothing coupling you to a proprietary migration format, the value Hasura offers is generating them for you and tracking their application thru their CLI.
The wisecrack goes that Python is the second-best language for everything. I think this is clearly false: it is the best language for soliciting opinionated discussion on a forum.
1: Simple is better than complex.
2: Beautiful is better than ugly.
3: Readability counts.
Winners across many markets seem to get the importance of simplicity. Compare the Google homepage to the Bing homepage. The dashboard of a Tesla to that of a Volkswagen. Remember how hardcore lean Facebook's UI was compared to MySpace?
https://arstechnica.com/science/2020/10/the-unreasonable-eff...
I assume you're referring to https://en.wikipedia.org/wiki/Expression_problem . The Ars article rambles on for many paragraphs about a barely-sensible analogy that actually makes the concept harder to understand. But given that your apparent purpose is to proselytize for Julia I suppose it's adequate.
But really, you could have just said that Julia implements multiple dispatch. And people have been making do without in many different languages for decades.
> arising from its use of class-based object orientation
The language uses nothing of the sort. Unlike Java, you can write any number of globally-accessible classes in a source code file (including zero). Blame the authors.
Also, you don't even need classes to get polymorphic dispatch in Python. Check out `@functools.singledispatch` from the standard library.
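A minimal sketch of what that looks like (the describe function and its branches are made up for illustration):

    from functools import singledispatch

    @singledispatch
    def describe(obj):
        # Fallback implementation for types with no registered overload
        return f"something else: {obj!r}"

    @describe.register
    def _(obj: int):
        return f"an int: {obj}"

    @describe.register
    def _(obj: list):
        return f"a list of {len(obj)} items"

    print(describe(3))         # an int: 3
    print(describe([1, 2]))    # a list of 2 items
    print(describe("hello"))   # something else: 'hello'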
> (but can’t escape insanity such as ",".join(["a", "b"]))
This is perfectly sane. As has been rehashed countless times, the method is on the "joiner" string because that way it automatically works for every iterable without duplicating code.
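Which is why the same method works unchanged on lists, tuples, generators, or anything else that yields strings:

    >>> ",".join(["a", "b"])
    'a,b'
    >>> ",".join(("a", "b", "c"))
    'a,b,c'
    >>> ",".join(str(n) for n in range(3))
    '0,1,2'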
Thanks for the entertaining comment.
Python's object-orientation is based on objects. Hence "everything is an object" being taken seriously, reifying many things as objects (intrinsically, without a "reflection" library) that cannot be seen that way in many other languages (notably Java). For example, functions, modules, classes themselves, exception tracebacks, stack frames...
And hence the inability to describe instance members directly in the class body: because that's where members of the class (which is an object) are described. Including the methods of the class — which as written are just ordinary functions (which are also objects) that happen to be attributes of the class. The methods (which are also objects) are created on the fly when a function in the class is looked up via the instance. And those objects can be stored for later use.
This is what object orientation is supposed to be. Languages like Java are instead class-oriented. You are constantly forced to think about writing classes in order to use objects — you constantly use objects, but you pay for it in class syntax. (Meanwhile, there are non-object primitives.) In Python, you never have to write a class, yet you constantly use objects despite yourself (and "unboxed values" are never accessible from Python).
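A small illustration of that lookup machinery, nothing hidden:

    class Greeter:
        def hello(self):
            return "hi"

    g = Greeter()
    print(type(Greeter.hello))   # <class 'function'> -- just an attribute of the class object
    bound = g.hello              # looking it up via the instance creates a method object
    print(type(bound))           # <class 'method'>
    print(bound())               # hi -- the stored method remembers its instance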
Much existing library code is needlessly rigid and complex, even when supplying simple functionality. This is encouraged by Python’s mechanism of method specialization, which uses single dispatch and class objects. An attempt to re-use methods from a library leads to the expression problem.
Dude!
You think a touch screen tablet replacing all the knobs and tactile buttons is actually a step forward?
I've been writing Python since the last century, and this year is the first time I'm writing production-quality Python code; everything up to this point has been first-cut prototypes or utility scripts.
The real reason why it has stuck to me while others came and went is because of the REPL-first attitude.
A question like
>>> 0.2 + 0.1 > 0.3
True
is much harder to demonstrate in other languages. The REPL isn't just for the code you typed out, it does allow you to import and run your lib functions locally to verify a question you have.
It is not without its craziness with decorators, fancy inheritance[1] or operator precedence[2], but you don't have to use it if you don't want to.
[1] - __subclasshook__ is crazy, right?
[2] - you can abuse __ror__ like this https://notmysock.org/blog/hacks/pypes
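For the curious, the core of that __ror__ trick fits in a few lines (the class name P is made up; this is a sketch, not the linked post's exact code). It works because the left operand has no __or__ that accepts P, so Python falls back to the right operand's __ror__:

    class P:
        """Wrap a callable so that `data | P(fn)` applies fn to data."""
        def __init__(self, fn):
            self.fn = fn
        def __ror__(self, lhs):
            # Invoked for `lhs | self` when lhs doesn't know how to | with P
            return self.fn(lhs)

    print(range(5) | P(list) | P(len))   # 5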
Welcome to Rakudo™ v2025.06.1.
Implementing the Raku® Programming Language v6.d.
Built on MoarVM version 2025.06.
To exit type 'exit' or '^D'
[0] > 0.1 + 0.2 > 0.3
False

https://docs.python.org/3/tutorial/floatingpoint.html#floati...
Note that "On overflow of the denominator during an arithmetic operation a Num (floating-point number) is returned instead." A Num is an IEEE 754 float64 ("On most platforms" says https://docs.raku.org/type/Num)
Python always uses IEEE 754 float64, also on most platforms. (I don't know of any Python implementation that does otherwise.) If you want rationals you need the fractions module.
>>> from fractions import Fraction as F
>>> F("0.1") + F("0.2") == F("0.3")
True
>>> 0.1 + 0.2 == 0.3
False
This corresponds to Raku's FatRat, https://docs.raku.org/type/FatRat. ("unlike Rat, FatRat arithmetics do not fall back Num at some point, there is a risk that repeated arithmetic operations generate pathologically large numerators and denominators")

That said, decimals (eg 0.1) are in fact fractions, and the subtlety that 0.1 decimal cannot be precisely represented by a binary floating point number in the FPU is ignored by most languages where the core math is either integer or P754
bringing Rational numbers in as a first class citizen is a nice touch for mathematicians, scientists and so on
another way to look at it for Raku is that
Int → integers (ℤ)
Rat → rationals (ℚ)
Num → reals (ℝ)"0.1" is what the language specification says it is, and I disagree with the view that it's ignored by most languages when it's often clearly and explicitly stated.
That most people don't know IEEE 754 floats, and do things like store currency as floats, is a different matter. (For that matter, currency should be stored as decimal, because account rules can be very particular about how rounding is carried out.)
Similarly, 3 * 4 + 5 may 'in fact' be 17 .. sometimes. But it's 27 with right-to-left precedence ... and 19683 in APL where * means power (3 to the power of 9). While 3 + 4 * 5 may be 35 or 23 (or 1027 in APL).
FWIW, FatRat is ℚ, not Rat. Rat switches to Num if the denominator is too high, as I quoted.
Bringing it back to Python, ABC (which influenced Python's development) used a ratio/fraction/FatRat natively, which handled the 0.1 + 0.2 == 0.3 issue, but ran into the 'pathologically large numerators and denominators' problem even for beginning students.
I see Rat as a way to get the best of both worlds, but I'm certain it has its own odd edge cases, like I suspect x + 1/13 - 1/13 might not be the original value if x + 1/13 caused a Rat to Num conversion.
True, in fact the syntax of Python consumes the literal '0.1' as a double [float64] ... so OK, maybe I was a bit strong in saying that my fact trumps the Python fact (but it still feels wrong to say that 0.1 + 0.2 > 0.3).
---
I welcome your correction on FatRat ... btw I have just upgraded https://raku.land/zef:librasteve/Physics::Unit to FatRat. FatRat is a very useful string to the bow and imo cool that it's a core numeric type.
See also https://raku.land/zef:librasteve/FatRatStr as my path to sidestep P754 literals.
---
We are on the same page that the Rat compromise (degrade to P754) is optimal.
---
As you probably know, but I repeat here for others, Raku has the notion of https://docs.raku.org/language/numerics#Numeric_infectiousne... which means that `x + 1/3` will return a Rat if x is an Int, or a Num if x is a Num. All "table" operators (sin, cos, log and so on) are assumed to return irrationals (Num).
Python is a fancy calculator.
To be clear, while in the mathematical sense, yes, sin, cos, and log generally return irrationals, in their IEEE 754 forms they return an exact value within 1 ulp or so of that irrational number. Num is a rational. ;)
>>> x=5**0.5
>>> x
2.23606797749979
>>> x.as_integer_ratio()
(629397181890197, 281474976710656)
Scheme uses the phrase "numerical tower" for the same sort of implicit coercion.

1e-1 + 2e-1 > 3e-1

Which will evaluate to True.

A scientist knows that 0.1 + 0.2 is not greater than 0.3, only a computer geek would think that this is OK.
The REPL example intends to show what the program does, not whether or not something is intuitive for you.
Second, using your same argumentation,
>>> 010 + 006 == 14
True
Is also wrong.

It's based on a misunderstanding of what representations of numbers in programming languages are.
In Python (and almost all other languages), 0.1 means the IEEE float closest to the decimal number 0.1, and arithmetic operations are performed according to the IEEE standard.
I am making the point that using a decimal literal (eg 0.1) representation for a IEEE double is a bad choice and that using it as a representation for a Rat (aka Fraction) is a better choice.
I 100% accept your point that in Python 0.1+0.2>0.3 is true which is why I prefer Raku’s number system.
Happy to see Raku getting some press.
I have some test Rust code where I add up about a hundred million 32-bit floating point numbers in the naive way, and it takes maybe a hundred milliseconds, and then I do the same but accumulating in a realistic::Real because hey how much slower is this type than a floating point number, well that's closer to twenty seconds.
But if I ask Python to do this, Python takes about twenty seconds anyway, and yet it's using floating point arithmetic so it gets the sum wrong, whereas realistic::Real doesn't because it's storing the exact values.
Not really. It's a limitation of the IEEE floating point format used in most programming languages. Some numbers that look nice in base 10 don't have an exact representation in base 2.
1/3 doesn't have an exact representation in base 10 or base 2. 1/5th does have an exact representation in base 10 (0.2), but doesn't in base 2. 1/4th has an exact representation in base 10 (0.25) and in base 2 (0.01)
Really?
$ python -m timeit -s 'import random; x = [random.random() for _ in range(100000000)]' 'sum(x)'
1 loop, best of 5: 806 msec per loop
This is on 11-year-old hardware. Even the generation isn't that slow:

$ time python -c 'import random; x = [random.random() for _ in range(100000000)]'
real 0m10.942s
user 0m9.590s
sys 0m1.346s
Of course, Python forces the use of doubles internally. Any optimizations inherent to 32-bit floats are simply not available.

If you did
    total = 0.0
    for value in data:
        total += value

instead of

    total = sum(data)

then yes, the answer will take longer and be less accurate. But the naive native Rust equivalent will be less accurate than Python's sum(data).

That's entirely correct, you shouldn't do this. And yet people do for one reason and another. I'm aware of Kahan summation (and Neumaier's improvement), but it wasn't the point of the benchmarks I happened to be performing when this topic arrived.
You will not be surprised to learn there's a Kahan adaptor crate for Rust's iterator, so (with that crate) you can ask for the Kahan sum of some floating point iterator just the same way as you could ask for the naive sum. I suppose it's conceivable that one day Rust will choose to ship a specialisation in the stdlib which uses Kahan (as Python did in newer versions) but that seems unlikely because it is slower and you could explicitly ask for it already if you need it.
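For readers who haven't seen it, compensated (Kahan) summation is only a few lines of Python. This is just a sketch; the stdlib's math.fsum is the battle-tested way to get a correctly rounded float sum:

    import math

    def naive_sum(values):
        # Plain left-to-right accumulation: rounding error grows with the number of terms
        total = 0.0
        for x in values:
            total += x
        return total

    def kahan_sum(values):
        # Compensated summation: keep the low-order bits each addition loses
        total = 0.0
        compensation = 0.0
        for x in values:
            y = x - compensation
            t = total + y
            compensation = (t - total) - y
            total = t
        return total

    vals = [0.1] * 10
    print(naive_sum(vals))   # 0.9999999999999999
    print(kahan_sum(vals))   # 1.0
    print(math.fsum(vals))   # 1.0 (correctly rounded stdlib sum)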
You don't like Python's use of IEEE 754 float64 for its "float" type because it's already so slow that you think Python should use a data type which better fits the expectations of primary school math training.
Then to demonstrate the timing issue you give an example where you ignore the simplest, fastest, and most accurate Python solution, using a built-in function which would likely be more accurate than what's available in stock Rust.
If accuracy is important for the first, why is it not important for the second?
Are you aware of the long history of compiled versions of Python (PyPy, numba, and more), plus variants like Cython, where the result has near Rust performance levels?
Were the core float a non-IEEE 754 type, those compiled versions would either be dog slow (to be compatible with Python's core float) or give results which are different from CPython's.
Even if they did not exist, there would be a lot of questions about why a given program in C, Rust, Pascal, or any other float64-based system, gives different answers when translated to Python.
FWIW, I, like others, could not reproduce your 20 seconds timing. Python is indeed slow for this sort of task, but even for explicit iteration my code is an order of magnitude faster than you reported.
I was not aware of Cython, which sounds like a terrible idea, but to each his own; nor of Numba, though I have worked with PyPy in the past. I'm afraid that the idea that somehow every Python implementation would be compatible caused me to choke. Python doesn't really care about compatibility; the behaviour of the sum function you brought up has changed twice since Python 3.
This is an old machine, so I can well believe you can do the iteration faster. My initial interest happened because by total coincidence I was writing benchmarks for realistic which try out f32 and f64 vs realistic::Real for various arithmetic operations, and so I wondered well, isn't even Python much faster and (with my naive iteration) it was not.
As you are hopefully aware, new programmers are equally likely to run into languages where the default is 32-bit IEEE floating point, and so 0.1 + 0.2 > 0.3 is false for them, as they are to encounter a language like Python with 64-bit IEEE floats. I'd expect, as with Python's experience with its hash tables, that the kind of people writing Python as their main or even only language would always be pleased to have simpler, less surprising behaviour, and the rationals are much simpler - they're just slower.
Guido van Rossum, who started and led the Python project for many years, previously worked with ABC, which used rationals as the default numeric type. In practice this caused problems, as it was all too easy to end up with "pathologically large numerators and denominators" (quoting https://docs.raku.org/type/FatRat). That experience guided him to reject rationals as the default.
Pathologically large numerators and denominators make rationals not "just slower" but "a lot slower".
> somehow every Python implementation would be compatible
It's more of a rough consensus thing than full compatibility.
> Python doesn't really care about compatibility
Correct, and it causes me pain every year. But do note that historic compatibility is different than cross-implementation compatibility, since there is a strong incentive for other implementations to do a good job of staying CPython compatible.
FWIW, the laptop where I did my timings is 5 years old.
The new programmers in my field generally have Python as their first language, and don't have experience with float32.
I also know that float32 isn't enough to distinguish rationals I need to deal with, since in float32, 2094/4097 == 2117/4142 == 0.511106, while in float64 those ratios are not equal, as 0.5111056870881132 != 0.5111057460164172.
(I internally use uint16_t for the ratios, but have to turn them into doubles for use by other tools.)
The PC I tried this on is about 10 years old; I bought this place in 2014 and the PC was not long after that.
sum() of a list with 100M floats took 0.65 seconds. The explicit loop took 1.5 seconds on CPython 3.13.
But again, yes, Rust performance runs rings around CPython, but that's not enough to justify switching to an alternative numeric type given the many negatives, and Python's performance isn't as dire as you suggest.
shagie@MacM1 ~ % docker run -it openjdk:latest jshell
Unable to find image 'openjdk:latest' locally
latest: Pulling from library/openjdk
...
Status: Downloaded newer image for openjdk:latest
Oct 01, 2025 6:23:46 PM java.util.prefs.FileSystemPreferences$1 run
INFO: Created user preferences directory.
| Welcome to JShell -- Version 18.0.2.1
| For an introduction type: /help intro
jshell> 0.1 + 0.2 > 0.3
$1 ==> true
jshell>
This has been around since JDK 9. https://docs.oracle.com/en/java/javase/17/jshell/introductio...

That said, changing how you think about programming... even with jshell I still think Java in classes and methods (and trying to pull in larger frameworks is not as trivial as java.lang packages). However, I think Groovy (and a good bit of Scala) in a script writing style.
jshell itself is likely more useful for teaching than for development - especially once you've got a sufficiently complex project and the ide integration becomes more valuable than the immediate feedback.
Still, something to play with and one of the lesser known features of Java.
True
That is false. What is it that "...is much harder to demonstrate in other languages"?

I am missing something.
What's false about it? That is the result if you're using IEEE floating point arithmetic.
But, where did we say that's what we want? As we've seen, it's not the default in many languages and it isn't mandatory in Python; it's a choice, and the usual argument for that choice would be "it's fast", except Python is slow, so what gives?
But in answer to “where did we say that's what we want?” I would say, as soon as we wrote the expression, because we read a book about how the language works before we tried to use it. After, for example, reading a book¹ about Julia, we know that 0.1 + 0.2 will give us something slightly larger than 0.3, and we also know that we can type 1//10 + 2//10 to get 3//10.
I'm comfortable with that rationale in proportion to how much I believe the programmer read such a book.
I haven't taken the course we teach, say, chemists; I should maybe go audit that. But I would not be surprised if either it never explains this, or the explanation is very hand-wavy, something about it not being exact, maybe invoking old-fashioned digital calculators.
The amusing thing is when you try to explain this sort of thing with a calculator, and you try a modern calculator, it is much more effective at this than you expected or remember from a 1980s Casio. The calculator in your modern say, Android phone, knows all the stuff you were shown in school, it isn't doing IEEE floating point arithmetic because that's only faster and you're a human using a calculator so "faster" in computer terms isn't important and it has prioritized being correct instead so that pedants stop filing bugs.
Using floating-point in Python is still much faster than using an exact type in Python.
$ # At this scale, we need to be aware of and account for the timing overhead
$ python -m timeit 'pass'
50000000 loops, best of 5: 8.18 nsec per loop
$ # The variable assignment defeats constant folding in the very primitive optimizer
$ python -m timeit --setup 'x = 0.1; y = 0.2' 'x + y'
10000000 loops, best of 5: 21.2 nsec per loop
$ python -m timeit --setup 'from decimal import Decimal as d; x = d("0.1"); y = d("0.2")' 'x + y'
5000000 loops, best of 5: 62.9 nsec per loop
$ python -m timeit --setup 'from fractions import Fraction as f; x = f(1, 10); y = f(2, 10)' 'x + y'
500000 loops, best of 5: 755 nsec per loop

Mathematics. IEEE floating point arithmetic gets it wrong!
Another example of why we should use integer algorithms wherever possible.
Welcome to Rakudo™ v2025.06.1.
Implementing the Raku® Programming Language v6.d.
Built on MoarVM version 2025.06.
To exit type 'exit' or '^D'
[0] > 0.1 + 0.2 > 0.3
False

Swift will also return true unless you specify the type. Though, I suppose that is the key difference -- proper typing.
Even if it's less secure, the beauty of Go, a single, static binary impervious to version changes, is very appealing for small projects you don't want to keep returning to for housekeeping.
There are many things I wish python, or a language like python, could improve on. Yet despite all my wishes, and choosing Rust/Go more often recently, Python still works.
I’m in a love/hate relationship with python. I think it’s becoming more of acceptance now haha.
For sys-level things, even TUIs and parsing data, it's a joy.
Edit: QT not GTK
Yeah, some of its design decisions required immense cost and time to overcome in order to reach viable production solutions. But as it turns out, however suboptimal it is as a language, this is largely made up for by the presence of a huge workforce that's decently qualified to wield it.
https://docs.python.org/3/faq/general.html#why-was-python-cr...
""I was working in the Amoeba distributed operating system group at CWI. We needed a better way to do system administration than by writing either C programs or Bourne shell scripts, since Amoeba had its own system call interface which wasn't easily accessible from the Bourne shell. My experience with error handling in Amoeba made me acutely aware of the importance of exceptions as a programming language feature.
It occurred to me that a scripting language with a syntax like ABC but with access to the Amoeba system calls would fill the need."""
Personally, I can’t take seriously any language without a good type system and, no, optional type hints don’t count. Such a type system should express nullability and collection parameterization (i.e. generics).
I simply won’t write a lot of code in any language where a typo or misspelling is a runtime error.
Also, the tooling is abundant and of great quality, it made it the logical choice.
That said, it always saddens me that the ML family (as in OCaml and F#) doesn't get more love. They can hit all the same bases as Python (readability, multi-paradigm, module-first programming) with ease, but just never got the same love.
Guido has taste.
At least we don't have to use it.
Why was there a backlash against this operator? (Looks kinda neat.) Was it breaking things?
I have a long list of grievances with Python, but the walrus situation would never crack my top ten. Put effort into removing cruft from the standard library, make typing better, have the PSF take a stance on packaging. Anything else feels like a better use of time.
Whatever, it won. I will never use it, but when I see it I will have to scratch my head and look up the syntax rules.
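For reference, the rule is small: := binds a value to a name inside an expression. A toy sketch:

    import random

    # Without the walrus: call, assign, then test on separate lines.
    # With it: bind and test in one expression.
    while (roll := random.randint(1, 6)) != 6:
        print(f"rolled {roll}, trying again")
    print("got a six")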
You may be interested in https://learning-python.com/python-changes-2014-plus.html for a sense of what some old-timers' aesthetic sense is like. (I agree with many of these complaints and disagree with many others.)
Maybe he did some kind of deep programming design. It just sounded, in that account, more like he threw together whatever solved his problem, with some nice ideas baked in. Again, that's if it was true that he invented it to automate tedium during BLACKER VPN.
BLACKER is described here: https://ieeexplore.ieee.org/document/213253
For public examples of A1, look up SCOMP, GEMSOS, and VAX Security Kernel (VMM). Those papers describe the assurance activities required for A1 certification. At the time, due to bootstrapping requirement, tools like Configuration Management didn't have to be A1. People used all kinds of stuff, like Wall building Perl.
Declare sets, lists, and maps in one line. Don’t need to worry about overflow or underflow. Slicing is extremely convenient and expressive. Types not necessary, but that’s rarely confusing in short code blocks.
Compare to js, you’re immediately dealing with var/let/const decisions using up unnecessary mental energy.
Combine that with its prevalence in coding interviews because “python basically pseudocode”, it makes sense why it’s popular.
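A few of the one-liners presumably meant above, for anyone who hasn't written Python lately:

    primes = [2, 3, 5, 7, 11]           # list
    seen = {"alice", "bob"}             # set
    ages = {"alice": 31, "bob": 28}     # dict ("map")
    print(primes[1:4])                  # slicing: [3, 5, 7]
    print(primes[::-1])                 # reversed copy: [11, 7, 5, 3, 2]
    print(2 ** 100)                     # integers never overflow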
Of this entire pack, Python seems to have the widest ecosystem of libraries. I don't think I ever ran into a "have to reinvent the wheel" problem.
Need to run four test operations in parallel? asyncio.gather(). Need to run something every five minutes? schedule.every(). In most cases, it is a one-liner, or at most two-liner, no sophisticated setup necessary, and your actual business logic isn't diluted by tons of technical code.
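A sketch of the first of those; the "test operations" here are made-up placeholders:

    import asyncio

    async def check(name):
        await asyncio.sleep(0.1)      # stand-in for real async work (HTTP call, DB query, ...)
        return f"{name}: ok"

    async def main():
        # Run four operations concurrently and collect the results in order
        results = await asyncio.gather(*(check(f"test{i}") for i in range(4)))
        print(results)

    asyncio.run(main())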
Performance-critical parts can always be programmed in C and invoked from the Python code proper.
It is probably the first language for 99% of the computer science students who didn't know any programming before college. And even for those who knew programming, chances are that a lot of them have at least dabbled a little with it.
I do warn people that it's not as easy or intuitive as advertised. I've often been bitten by unexpected errors. I think a survey of these might be worthwhile.
One was typing or conversion errors. Some conversions, like int-to-str in a string concatenation, seem pretty obvious. That isnumeric() didn't consider negative numbers as numbers was a surprise.
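The isnumeric() surprise in a nutshell:

    >>> "5".isnumeric()
    True
    >>> "-5".isnumeric()     # only checks the characters, and "-" isn't a numeric character
    False
    >>> int("-5")            # the conversion itself is fine
    -5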
Sometimes it's consistency. I've often alternated between lists and sets in applications. I prefer to keep most data as a list but use sets for uniqueness checks or redundancy filtering. Despite both being collections, one uses .append() and one uses .add(). Little differences not only add confusion: I have to modify my codebase when mixing or switching them, which can become a type error later in another spot.
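That is, the same operation spelled two different ways:

    nums_list = [1, 2]
    nums_list.append(3)     # lists grow with append()

    nums_set = {1, 2}
    nums_set.add(3)         # sets grow with add(); there is no set.append()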
Another was that common operations usually found in one place were split across two. That happened with time vs datetime, and with filesystem operations, which might take two modules. I've considered making a wrapper that turns all those into one thing with the internal redundancy removed. There might be a reason others haven't done that, though.
Another issue was distribution. I can do straightforward building of console apps for two platforms. That's reliable. If reliability was the worry, Python apps seemed easier to deliver as a Flask site than to distribute my utilities as standalone. Nuitka was really impressive, though, in terms of the work that must have gone into it.
In my case, I also missed the ability to easily make linked lists, as in C, to build trees. I wanted to build a C-like tree in Python but it couldn't do self-referential structures IIRC. Since that app's requirements were C-like, and performance was no issue, I actually simulated a C-like heap in Python, ported a C tree to it, and built the tool on that. I also got a use-after-free in Python, of all things, lol. Anyway, I was surprised there was a data structure C could do but a high-level, GC'd, reflective language couldn't. There might be a trick for this others know about, but I figure they just wrap C code for those things.
On the positive side, the debugging experience with Python was so much better than some beginner languages. I'm grateful for the work people put into that. I also love that there are easy to use accelerators, from JIT's to the C++ generator.
I was wanting an acceleratable subset with metaprogramming when Mojo appeared. I might try it for toy problems. I gotta try to stay in idiomatic-ish Python for now, though, since it's for career use.
Six flowery “from-to”s in one article:
>from beginners to seasoned professionals
>from seasoned backend engineers to first-time data analysts
>from GPU acceleration and distributed training to model export
>from data preprocessing with pandas and NumPy to model serving via FastAPI
>from individual learners to enterprise teams
>from frontend interfaces to backend logic
And more annoyingly, at least four “not just X, but Y”.
>it doesn’t just serve as a first step; it continues adding value
>that clarity isn’t just beginner-friendly; it also lowers maintenance costs
>the community isn’t just helpful, it’s fast-moving and inclusive
>this network doesn’t just solve problems; it also shapes the language’s evolution
And I won’t mention the em-dashes out of respect to the human em-dash-users…
This stuff is so tiring to read.
Yea, I’m sure there are a lot of technical reasons to use other languages, but with Python, you can just read it. I remember buying “Learn Python the Hard Way” about 15 years ago, and just looking through the book thinking… wait, I can already read this.
Natural language parallels are huge.
It includes a section on "punctuation" symbols, which I haven't seen explained concisely anywhere else and might be helpful, even for non-beginners: https://nobsstats.com/tutorials/python_tutorial.html#python-...
Python is well marketed, with dissenting voices silenced, de-platformed and defamed with PSF involvement. That way many users think the Python ruling class are nice people. It is therefore popular among scientists (who buy expensive training courses) and students (who are force fed Python at university).
It has a good C-API, which is the main reason for its popularity in machine learning. Fortunately for Python, other languages do not take note and insist on FFIs etc.
EDIT: The downvotes are ironic given that Python needs to be marketed every three days here with a statistic to retain its popularity. If it is so popular, why the booster articles?
Sure, we have very different experiences. But that also means that unless you can present strong survey data, it's hard to know if your "Most people" is limited to the people you associate with, or is something more broadly true.
The PSF's overlap with my field is essentially zero. I mean, I was that overlap, but I stopped being involved with the PSF about 8 years ago when my youngest was born and I had no free time or money. In the meanwhile, meaningful PSF involvement became less of a hobbyist thing and more corporatized ... and corporate friendly.
> scientists (who buy expensive training courses)
ROFL!! What scientists are paying for expensive training courses?!
I tried selling training courses to computational chemists. It wasn't worth the time needed to develop and maintain the materials. The people who attended the courses liked them, but the general attitude is "I spent 5 years studying <OBSCURE TOPIC> for my PhD, I can figure out Python on my own."
> who are force fed Python at university
shrug I was force fed Pascal, and have no idea if Wirth was a nice person.
> main reason for its popularity in machine learning
I started using Python in the 1990s both because of the language and because of the ease of writing C extensions, including reference counting gc, since the C libraries I used had hidden data dependencies that simple FFI couldn't handle.
I still use the C API -- direct, through Cython, and through FFI -- and I don't do machine learning.
> If it is so popular, why the booster articles?
It's an article about a company which sells a Python IDE. They do it to boost their own product.
With so many people clearly using Python, why would they spend time boosting something else?