Freedesktop.org?
For end-user applications, it's a mess. You can't compile once and run everywhere.
LSB (Linux Standard Base) tried to do that and failed.
Flatpak/Snap/AppImage all tried to do that, and each has its own set of problems.
Building a Win16 application would be more difficult.
But this isn't directly comparable. The issue with Linux is getting a precompiled binary to work properly across all currently up to date distributions.
Yes you can, you said it yourself: Flatpak, Snap, and AppImage.
Okay okay, those technologies have problems. But all technologies do. That's why Microsoft has half a dozen GUI frameworks and then they build their own in-house applications in Electron :P
My assertion: Inertia of user base is by far the largest predictor of what will stick in a market. If you can create a critical mass of users, then you will get a more uniform set of solutions.
For fun, look into bicycles and how standardized (or, increasingly, not) they are. Is there a solid technical reason for multiple ways to make shoes that connect to pedals? Why are there several ways to shift from your handlebars? With the popularity of disc brakes, why don't we have a standard size for pads?
I think there are a lot of things that we tell ourselves won from some sort of technical reasoning. The more you learn of things, the less this seems true, though.
Not, btw, that there aren't some things that die due to technical progress.
I think it's not even an issue. Most open source projects are implementations (maybe flawed ones), and few are new propositions. If something was not fully defined in the standard/protocol/interface, an implementation may come up with its own extensions that are incompatible with others. But that's how you choose implementations: you're supposed to commit to one, not run a mismatched set.
So if you're using GNOME, try not to depend on something that depends on having the whole of KDE installed. If you're using OpenRC, anything systemd is prohibited. All projects are consistent within themselves; the only things you need to do are avoid the conflicts that come from having two things doing the same job, and follow the installed system's guidelines.
I don't mind fragmentation. What I mind is dependency due to laziness rather than real need: using a couple of glibc-specific libraries for no real reason just because you're working on Debian, or taking the time to support Windows and macOS but tying yourself to glibc on Linux because reasons.
I disagree with this bit. When I used a GTK desktop I still used KDE applications, so I had lots of KDE libraries installed. I use KDE now and I still have lots of GTK applications installed.
It's slightly wasteful of storage and memory, but still a lot less resource-hungry than Windows.
After this premise, can you explain to me why I should waste my time to help them? Let's not forget that we are talking about open source software and they could fix it themselves if they so desire.
Because each manufacturer can't put a premium on their pads that way.
Because triangles are a fantastic, high-strength shape that suits the loads a bicycle is subject to. For the vast majority of cases, it's a very solid choice. We deviate when specific UX requirements demand it (city bikes having a battery and a low stepover to suit a variety of clothing, with the motor making up for the additional weight).
> Is there a solid technical reason for multiple ways to make shoes that connect to pedals?
All of them attempt to address the same requirement, and make different tradeoffs depending on use-case and environment.
> Why are there several ways to shift from your handlebars?
Price points, performance requirements, handlebar setup, environment (is it expected to be subjected to mud, rocks, and trees constantly?), and weight.
> With the popularity of disk brakes, why don't we have a standard size for pads?
Same as for shifters: the braking compound and design for a race road bike will be really different to what a DH race bike requires.
All road shoe/pedal interfaces have the same requirements. All XC ones share the same requirements; all DH ones, likewise.
> Same as for shifters: the braking compound and design for a race road bike will be really different to what a DH race bike requires.
Regardless of whether we are talking shifters, brakes, or pedal interfaces, within a specific line (road, gravel, XC, DH) the requirements stay the same, and more importantly those formats/shapes/interfaces almost never vary across price points:
All MTB shifters from a given manufacturer and number of gears are usually interchangeable. A Deore shifter can operate an XTR derailleur. The same SPD cleats work across all MTB pedals from Shimano. The same brake pad shape is used across all lines for a given number of pistons. More importantly, pads and calipers are usually interchangeable between road and MTB for a given manufacturer. Compound, requirements, and price point have little to do with it, as manufacturers release pads with different compounds but the same shape.
What keeps these formats from becoming standards is that every manufacturer wants its own, for two reasons: 1) it thinks it knows better, and 2) it aims to capture a market and become a monopoly (through its cleat format).
Only rarely do they talk to each other, or release a standard without asking for royalties. Same as proprietary software vendors.
The open source fragmentation only really comes from reason 1.
I really don't understand how this mentality survives.
For the past 100 years, companies have been working to gain unfair advantages over each other by creating user lock-in, patent-trolling each other, putting DRM in games, changing their designs to break compatibility with generic products, etc.
Surely you must realise that many motivations for product differentiation have no bearing on user benefit, or we would never have region-locking on DVDs or proprietary media formats.
The bicycle market is not a healthy competitive market. Shimano makes almost all the gears for all bikes in Europe. For the price of an electric cargo bike that goes 15 mph and has a 0.6 kWh battery, I can buy an electric motorbike that goes 70 mph and has a 6 kWh battery.
They are both about £4,000
Also, I'm not sure what kind of standard this author is pining for. We have Wayland and freedesktop.org. Pretty much any Linux app can already run on pretty much any DE.
The best projects have someone who will make unilateral decisions. They might be right or wrong, but it’s done and decided. Companies with an organizational hierarchy do that much better than a decentralized group of OSS developers.
Choice looks like fragmentation. But the existence of alternatives is very important.
I think the really annoying parts were more related to nothing being quite perfect: things tearing, iffy support for mixed resolutions and refresh rates, the browser sometimes not using hardware acceleration so the fans spin up. All things you could live with, but a constant stream of papercuts.
Sure, most people's eyes glaze over the moment I try to explain the distinction between Linux and GNU, but if you're really that uncurious about how the system works, then what's the point of switching to Linux anyway? Windows / Mac already get you most of the way there, especially now that WSL is a thing.
What is the point of using Windows or Mac when Linux will get you most of the way there?
Linux has a lot of advantages for users who are not interested in the technology: a longer hardware upgrade cycle, ease of maintenance and upgrades, more resource efficiency, more security (partly because it is less targeted as a desktop - but what matters is that if you use Linux you are less likely to have issues), better privacy....
The point I am trying to make is that there are advantages for people who are NOT interested in every detail of the system.
See, I don’t necessarily think Linux is hard to use. Most desktop environments are so similar in style to Windows and Mac that they are pretty intuitive.
To me, setup is often the hardest part. It can be easy with some hardware. But if you aren’t lucky, getting a system running can be tricky.
This is one area where having someone who can make decisions is really helpful. Apple can make sure their software works with their exact hardware specs. Microsoft has various compatibility guidelines and enough market share to make sure they are always supported. Linux is fragmented and hardware support lags because of this.
Then you have smaller hardware vendors who support one obscure distro, but if that's not what you want to run, good luck. I spent last week trying to get Debian running on a Pine 64 laptop and still don't have working sound.
You didn't pay a dime for it, except with your time on things that annoy you, which is indeed not free. But if your time is that precious and you are that exacting, maybe Linux isn't a good fit for you. Maybe you should try paying for the perfectly crafted commercial alternatives. Except in the real world, those alternatives are far from perfect and have tons of issues like spyware, bloat, ads being shoved down the user's throat and so on - on top of papercuts/annoyances of their own that sometimes take years to fix.
Everybody recognizes the hurdles. But they are there because of the nature of the ecosystem. Unpaid skilled maintainers are hard to find, and it takes a specific kind of person to maintain a driver or a subsystem, sometimes for decades. Those personalities sometimes come with strong opinions and prejudices. Does that delay the fixing of end-user issues? Yes. If there were an easy solution, it would have been implemented by now.
Funny you bring up Mandrake because I paid a good chunk of my salary to buy a box that came with the CD and a small printed book. I have been using Linux through thick and thin everyday since then.
> I love Linux
Me too! Let's not be too harsh on our loved one :)
1) Due to the stability of the Linux kernel ABI, switching distros is far easier than switching between Interactive, SCO, SunOS, IRIX, Ultrix, etc.
There have only really been two major bootloaders on x86 Linux, LILO and GRUB, and the disk partitioning was driven by DOS/Windows and is still simpler than the dozens of differences between the commercial UNIXes, or even add-ons like VxVM.
I have switched Linux distros at a data-center level several times in my career for various reasons, and it has only ever been constrained by downtime budgets; it is typically faster and easier than a major Windows upgrade, even before modern tools existed.
Different groups are always going to make different decisions, and those decisions will change over time within even single groups.
Nvidia drivers are a complete mess, even without licence concerns.
IMHO, with modern tools, you should let the application drive distro choices if needed, reserving your preferred distro for more generic needs that don't suffer from app vendor coupling.
That is not feasible for a large portion of users. Even seasoned developers would struggle to maintain a personal fork of anything but the simplest open-source software. Proprietary software also has an escape hatch; it's called "voting with your wallet", i.e. giving your money to someone else who solves your problem better. The set of people who have money is much larger than the set of computer programmers.
Giving money to software you want to support has proven to be the most reliable way to direct software development. Granted, sometimes you're a Softimage user and Autodesk acquires the company making your software and then kills it to remove competition for their other 3D programs. Those cases are much rarer than the case where you want a consistently supported GUI for your OS and your only option is to write it yourself.
When you think about it, development of proprietary software is a lot more democratised than open-source software, because average users can direct where development goes by voting with their wallets, or even without their wallets in the case of piracy, which still drives development via second-order network effects. BitTorrent has done more for practical software freedom than GNU ever did.
No, but each of those two systems is ruled by one dictator and has one blessed way to do things. For example in the Windows / .NET world, it's WinForms, er I mean WPF, er I mean...
> Also, I'm not sure what kind of standard this author is pining for.
It sounds like a wish for the clarity of a cathedral rather than the chaos of a bazaar.
Individual OSS projects are often good at self-coordination. You don't need standards if you just reinvent the wheel yourself AKA how proprietary software often works.
Was going to say this too, because competing proprietary software companies generally don't coordinate. Macs don't easily run Windows programs and vice versa. Unless an alliance is formed or some agreement is made to adhere to a standards body, the collaboration issue is part of both worlds.
[1] I prefer using that word because most aren't really standardised.
I don’t think you can take a general lesson from any particular example here. Coordination of complex systems among thousands of competing and cooperative components is very hard and unpredictable, and why things happen depends on random events and personalities in ways that are not generalizable.
Why are you calling open source a market?
These appear to be feuds and battles of ideas fought between contributors, with minimal input from end users. There is no price signal at all.
Re: bicycles, it's not a healthy market. Shimano dominates with a 70% share in gears and brakes. The top 3 manufacturers have what, 95%+? Also look at how cargo bikes cost 3x what a normal bike does, but have the same components.
With that structure, users have zero input on the size of brake pads.
We see this in open source too - we can coordinate and all agree on a core idea or problem that needs solving, but still end up with different, competing implementations. It's not a bad thing, choice is good and often leads to innovation as different approaches compete and evolve.
I do think you are right on your first point - that inertia of the user base is the predictor of what will stick. Even Linux stuck around due to licensing and availability (and then user inertia from that) rather than any technical superiority.
Yes!
- Toe clips give you more power than just flat pedals. You can use regular shoes with them.
- Straps (similar to toe clips) can be used with regular shoes / trainers and allow you to control the pedal stroke both up and down. Fixed-gear riders use them a lot.
- MTB style cleats are easier to unclip and re-clip than Road style cleats as you are more likely to need to clip / unclip quickly.
- Road style cleats provide better power transfer, the shoes are far stiffer as well.
> Why are there several ways to shift from your handlebars?
Two reasons. The first is that the technology has improved over time. I have ridden old racing bikes where the shifter was not on the handlebar but on the downtube. You had to feel for the position of the next gear while steering with the other hand. It is difficult to get used to for people who haven't ridden a road bike before.
Secondly, the different types of bikes put the rider in different positions, and thus their hands will be in a different position. Hence the different shifter types.
The moral of the story is that different requirements require different solutions.
BTW, almost everything else around the pedal and the shifter is standardised. Hubs are normally one of a few sizes, wheels come in a few standard sizes, bottom brackets are almost all the same size, headsets come in a few standard sizes. I have a mountain bike from 1995 that I've put a brand-new stem on, because headsets are the same size as they were in 1995.
Software, or the lack thereof, is not the reason for Visa/Mastercard ruling the marketplace.
They may have a duopoly currently, but it's a matter of when, not if, that duopoly is broken.
Why? Unchecked capitalism, lobbying, corruption, politics, societal needs[1], realpolitik[2].
[1]: E.g. legal system requiring KYC to investigate money laundering.
[2]: In the positive sense; see items 2-4 for the negative sense.
You gotta have someone write those language servers for free, and the language servers have to be performant. In 2006 that meant writing in a compiled language, which meant that anyone creating a language server for an interpreted language would need to be an expert in two languages. That was already a small pool of people.
And big multiplayer OSS platforms like GitHub didn't exist until 2008.
I think LSP is only truly useful in two contexts: global symbols (even with namespacing) and confusing imports. In a language like C, you mostly have a few structs and functions, and for the ones in libraries, you only need to include a single header. With Python, the imports are concise and a good reference gives you the symbol identifier. But with languages that need an IDE or LSP, you find yourself dealing with many imports and many identifiers to write a few lines of code, and it quickly becomes unmanageable if you don't have completion or auto-imports.
SourceForge launched in 1999. I think GitHub is better in many ways, but the basic building blocks - hosted repos, issue tracking, and discussions (via email lists) - were already there on SourceForge. I collaborated with folks on a number of projects on SourceForge way back when.
It's called the Common Desktop Environment.
Any desktop program needs to be programmed against some API. In the case of Emacs, it's probably raw Xlib or a wrapper library on top of it.
The problems with that are (a) your dependency on X11, which is obsolete and has many documented inadequacies, (b) the lack of a modern widget library and toolkit, which makes extra, unnecessary work for the programmer, and (c) the lack of a cohesive visual language between programs, which makes the experience worse for the user.
Toolkits like GTK and Qt solve all these problems. By avoiding them, you're just reinventing the wheel, poorly, every time.
So your point is that we should use older technology even when it's been surpassed by better alternatives?
If it's FOSS, at least you have the option of trying to repackage it for your distribution. You're SOL if it's a proprietary application distributed in binary format, though.
https://blogs.gnome.org/alexl/2017/10/02/on-application-size...
I can't speak for the others, but Flatpak is a layered solution, so files are deduped and shared across the layers (runtimes, applications) that need them.
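You can see the sharing with the stock flatpak CLI (the app name below is just an example):

    # runtimes are installed once and shared by every app built against them
    flatpak list --runtime

    # each app records which runtime it uses instead of bundling its own copy
    flatpak info org.gnome.Calculator    # look for the "Runtime:" line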
Coordination is hard. People who are good at coordinating are not necessarily the same people who are happy to contribute their time to FOSS. And FOSS may need to coordinate in ways that vertically integrated companies do not.
Coordinating between loosely aggregated volunteer projects is not the same as coordinating between vested stakeholders either. I would guess that most FOSS projects are more invested in their own survival than in some larger objective. Teams within a company are (presumably) by definition invested in seeing the company mission succeed.
The GNOME / KDE example mentioned elsewhere in this thread is interesting because these are two somewhat equivalent co-existing projects. Any coordination between them is surely not their highest priority. Same with all of the different distros. They each exist to solve a problem, as the fine article says.
I wonder how much the problem is actually "open source can't standardise on a single solution." Let one thousand flowers bloom, sure. But don't expect a homogeneous user experience. The great thing about standards is there are so many to choose from. xkcd 927. etc.
“Which one?”
This is pretty much the cause of a 90% drop off of interest in Linux on the desktop.
I could say "use Ubuntu" (and I do) to some of the people I'm close with who are interested in Linux, but then they discover Lubuntu, or Linux Mint and Debian, get easily confused, and give up.
And that is not even getting into the updates and the packaging and heaven forbid anything breaks.
Last time I tried another round of "let's install the most recent versions of popular distros on random laptops I have", Fedora was the most finicky about hardware. As in, it literally wouldn't even boot into the live CD on one of said laptops, and it had trouble with graphics on others.
The thing that worked every time? For the past decade or so, it had consistently been Linux Mint for me.
How to spot the Ubuntu user...
And then watch his eyes glaze over as he realizes that he's bitten off a lot more than he can chew. :D
I think the safest way is to stay on a LTS as long as possible. Then, when the time comes to upgrade, get a new drive, install new LTS and then transfer data.
... and this is a layer of open source flexibility I never wanted. I don't want alternatives to core system management; I want one correct answer that is rugged, robust, well-tested, and standardized, so that I don't have to play the "how is this service configured atop this service manager" game.
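To be fair, that is the one thing systemd actually standardized: a service definition now looks the same on every distro that adopted it. A minimal sketch (unit name and binary path are hypothetical):

    [Unit]
    Description=Example daemon

    [Service]
    ExecStart=/usr/bin/example-daemon
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target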
OP blames FOSS for not providing an IDE protocol a decade earlier, but doesn't ask the rather obvious question of why language-specific tooling is not only still around, but as market-viable as ever. I'd argue it's because what LSP tries to do is just stupid to begin with, or at least exceptionally hard to get right. All of the best language tooling I've used is ad-hoc and tailored to the specific strengths of a single language. LSP makes the same mistake Microsoft made with UWP: trying to cram the same peg into every hole.
Meanwhile, Microsoft still develops their proprietary Intellisense stuff because it actually works. They competed with themselves and won.
(Minor edit: I forgot that MS alone didn't standardize LSP.)
Could you elaborate why? It looks like a useful protocol.
Unsurprisingly, the vast majority of servers work much better with VSCode than other editors. Whether this was a deliberate attempt by Microsoft to EEE their own product, or simply a convenient result of their own incompetence, is ambiguous.
Exactly the same thing happened with VST audio plugins. Initially Cubase was the reference host, later Ableton Live became the reference and it was impossible to convince plugin developers that they were out of spec because "it works in Ableton".
My impression, having programmed against both the LSP and VST specifications is that defining well-specified interfaces without holes in them is not a common skill. Or perhaps such a spec (maybe ISO C is an example) is too expensive to develop and maintain.
Everybody standardized on Eclipse plugins almost 2 decades earlier anyway. It got replaced because the standard sucked. The new one is better, but by how much is still a question.
But that is not at all how Posix operates or has operated. Posix standardises common denominators between existing implementations. The fact that we now have strlcpy(3) and strlcat(3) in Posix 2024, is not because Posix designed and stipulated them. Rather, they appeared in OpenBSD in 1998, were found useful by other *nix-es over time, spread, and were finally taken aboard by Posix that standardised what was already out there and being used! This to me is the very opposite of the point the author is trying to make!
BSD got them in 1998; it took 17 years for them to go to POSIX and another 8 years before they made their way to glibc. 25 years to adopt a clear improvement.
https://sourceware.org/legacy-ml/libc-alpha/2000-08/msg00053...
So it wasn't for lack of trying. Yes, Open Source can't coordinate and this is why we can't have nice things.
He is clearly not being rational there, but I could see how his aesthetic tastes might correlate pretty well with robust software. I suppose that saying no to new features is a good default heuristic; these additions could easily have added more problems than they solve, and then you have more surface area to maintain.
That being said, this old-school ideology of maintainers dumping the full responsibility on the user for applying the API "properly" is rather unreasonable. It often sounds like they enjoy having all these footguns they refuse to fix, so they can feel superior and differentiate their club of greybeards who have memorised all the esoteric pitfalls, simply because they were along for the journey, from the masses.
There's also an element of "Linus Torvalds is an antisocial jerk, and he's a genius, therefore if I am an antisocial jerk I must be doing genius-level work." In particular, it's a lot easier to attack someone with empty insults than it is to defend your own position with substantive thought.
There's way too much of this in general. People use a talented individual with problematic behaviors to justify their own problematic behaviors. So many talented ICs that are absolute dickheads to work with.
Gordon Ramsay - "this food is fucking raw!" /throws food
We have TV dramas like House glorifying the same thing.
This is the kind of stuff that can make people seriously ill and kills multiple people every year. This isn’t even lack of skill, it’s pure laziness.
"I want to trust your judgement, so I can delegate decision making to you" has always been his poorly-delivered message.
An antisocial Ramsay would just throw your stuff no matter what, even if you did well, for the sake of messing with your head.
> An antisocial Ramsay would just throw your stuff
A social Ramsay would refuse to eat your food, but not throw it at you or have a giant baby fit about it. Of course, no one would watch him on TV if he were calm and collected.
My point being, not that the person isn’t a jerk or that the decision wasn’t wrong, but that one error by one jerk doesn’t tell us much.
Because there is no easy way to determine if they actually blocked hundreds of bad ideas.
I think whether someone qualifies as a jerk is orthogonal to the need for gatekeeping (required, in my opinion) or its quantity (higher for more popular projects).
Perhaps you're not seeing them because they never existed? I mean, even though you're clearly speaking in hypotheticals, you're fabricating an outlandish scenario where you somehow associate being a jerk with a hundreds-to-one success rate. But there is nothing to support or even suggest that that's even remotely real, plausible, or even conceivable. You have only concrete evidence of someone rejecting a sound recommendation on the grounds of gatekeeping mixed with NIH. Gatekeeping and NIH are not quality gates, are they? If that is the process, obviously you cannot expect a positive outcome.
As opposed to what? Unbiased and dispassionate? There's no such thing. What you're probably thinking of is careerist and authoritarian within a corporation. It's not more efficient than the darwinism of open source.
Naturally, passionate builders and experts who rise to prominence controlling a tool will feel strongly about the vision for that tool. That's how it gets made in the first place.
Calling them "emotional" is just cheap.
Your so-called "rationality" is easy when you're not the one pouring your intense effort into something.
You keep diminishing and attacking these "arrogant" creators while you're clearly the model of rationality who has built... no, you use what they build. Funny, that.
Maybe take a humble pill.
This is no utopia, and it is not rare, it's pretty basic professionalism and engineering discipline. If you really care about the problem you are solving, you'll push the rest of the baggage aside, especially your ego.
Surely name-calling and making unfounded gut judgements based on us-vs-them tribalism, as seen in that response, is not very productive. He demonstrated no intention to solve the problem, no acknowledgement that it exists, no explanation of why the solution is not appropriate or what alternative solutions might be better... He had no interest in working together to find the best path forward. He was simply being territorial and scaring off those who did not align with his Holy Taste, whatever that is.
You're like companies claiming that "we make decisions based on data".
Believe your own Kool-Aid, but reality is much more nuanced and power/leadership/intuition-based than "data-based".
I don't want to get into politics, but it would be extremely easy for me to find several examples where you'll claim something, and when I say that's emotional and tribal you'll decide I must be <label>.
I don't even care about this specific example but about your initial generalization from it. Either you talk about this specific case only or you make and prove your generalizations in a "rational and unemotional" way, right?
Often these pitfalls exist because they enable some performance optimizations. The respective maintainer does care about performance.
Intel, AMD and Apple would very likely be willing to invest an insane amount of money for a 10 % performance increase. So, if this indeed increases the performance by about 10 %, I'd call it a very good idea.
I'd wager the other 90% are OSS maintainers being jerks. Occam's Razor and all that.
If the core developers/maintainers are putting in thousands of hours over several years, and a patch comes along, it is rightfully at the discretion of those doing 80-95% of the work.
But as Negotiating Rationally discusses, we value our own work more than others', and there's some emotional attachment. We need to learn to let that go, try to find the best solution, and be open to the bigger picture.
https://en.m.wikipedia.org/wiki/The_Cathedral_and_the_Bazaar
https://www.simonandschuster.com/books/Negotiating-Rationall...
People developing proprietary software will not be any less emotional or any more rational. The difference is that it does not happen publicly.
He was not that wrong!
A better example would be signed integer overflow, which a conspiracy of spec authors who don't work in the real world and compiler maintainers with a perverse sense of humor have decided means "Anything goes."
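For instance, a minimal sketch (hypothetical function): because signed overflow is undefined behavior, the optimizer may assume it never happens and fold this check to a constant.

    /* "Does x + 1 wrap around?" - naively true for x == INT_MAX, but
     * since signed overflow is UB, compilers are allowed to compile
     * this whole function down to "return 0". */
    int will_wrap(int x) {
        return x + 1 < x;
    }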
    if (strlcpy(dst, src, dst_len) >= dst_len) {
        // Truncation: src did not fit in dst
    }

Open source people create reusable interfaces. I'd argue they go one step further and create open and public internet communities with standards, practices, and distribution/release channels.
Thus far, open source has optimized for maximum utility for individuals who can write code... but AI may be changing that soon enough.
Proprietary, money-driven development is top-down and generally has coordination. In very large software it starts failing sometimes (I'm looking at you, Oracle).
Open source handles conflict by forking. I wouldn’t call that good coordination.
But, at the same time, I don’t see a better (or less worse) solution so I shut up and I take their code =)
Forking is far from the first step in conflict resolution; it is the ultima ratio between projects in the open-source world, when all dialogue breaks down. In other words, the worst outcome is that people agree to disagree and go their separate ways, which is arguably as good a solution as is possible.
In the corporate world, coordination mostly exists within companies through top-down decision-making, as you said. Between them, however, things look much grimmer. Legal action is often taken lightly, and more often than not, a core business goal is to not just dominate, but to annihilate the competition by driving them out of business.
Coordination between corporations, such as through consortia, is only ever found if everyone involved stands to profit significantly and risks are low. And ironically, when it does happen, it often takes the form of shared development of an open-source project, to eliminate the perceived risk of being shafted.
You also do a fork if you simply want to try out some rather experimental changes. In the end, this fork can get merged into the mainstream version, stay independent, or become abandoned. People wanting to try out new things has barely anything to do with all dialogue breaking down.
The latter is where you need the coordination. In a company, someone high up can say "we're doing it like this" and everyone has to fall in line.
> But then, how can Linux exist? How does that square with “never break the user space?”
Hot take: this catchphrase is out of date. For Linux desktop normies like me who don't really care about the stability of the Linux userspace API, userspace does break when GUI libraries (and their myriad dependencies) change their APIs. For example, I mostly use KDE, which depends upon Qt libraries for its GUI. Qt regularly introduces breaking changes to its API with each major version increment: 4 -> 5 -> 6, etc. (I don't hate them for it; it is normally carefully done and well-documented.)

Introducing breaking changes with major version releases is standard software development practice. Very few projects go out of their way to always keep backwards compatibility.
I guess y'all know better :)
I don't think it's the mediocre interface that's holding Linux back...
Whether it's the abomination that's Windows 11, having to fight against an ad-ridden interface in 10, or otherwise. Teams hasn't dominated because of a coherent interface, or even because anyone actually wants to use it.
Besides, you say 1000 desktops, but there are really only 2 (well, 1, since GNOME is the primary interface for the big 3 distros), along with a couple of hobby ones that you have to seek out to even learn they exist, and a lot of toys that no one outside HN has even heard of.
But I agree it's only partly because of that. The bigger issues are poor hardware support (especially for laptops) and over-reliance on the CLI and config files.
We will have to just agree to disagree on that one.
Windows 10 reverted to a warmed-over 7, but with flashy boxes to add noise, and the injection of adverts over time.
The multi-desktop implementation was half-baked, and when applications request focus you can end up with an unresponsive desktop until you find it.
11 is an interface that only a mother could love. Pointless changes like centering the taskbar make me wonder if you've ever used it. It's a step back in every way, but without the guts to try something different like 8.
8 didn't work for laptops/desktops, but at least there was an attempt.
I agree that is dumb, but it's a trivial setting to make it left-aligned again. It's literally the only thing I've changed. Everything else is great.
> the injection of adverts over time.
I use the IoT LTSC version and have zero ads. I guess there's no guarantee it will last but it's great for now.
Further, try Little Snitch on macOS to see how completely out of control it is. Since they implemented the sealed volume, it is also a lot harder to configure permanently, so fixing it is less feasible.
Linux and FOSS grew up (congrats!) and the important work got super big and complex.
Unfortunately, online communication - typically the channel preferred by FOSS projects - has much lower bandwidth than teams working full-time in an office, which limits the "depth" of collaboration.
It's not all bad. FOSS has some profound successes to be proud of. For small and well-defined projects, "benevolent dictator for life" works! For anything one person can lead - a desktop tool or a library - FOSS produces really good outcomes. But above, say, a package manager or a distro, things get wonky.
Again, it's not all bad. FOSS is rolling up its sleeves. Folks are organically doing the "go down every avenue people are motivated to go down" thing. You could call it Darwinism. But many motivated communities lack the resources to go far enough to reach value. Motivation stalls during projects (we're human!), and FOSS rarely comes with a motivation-boosting paycheck. Plenty of valiant efforts never reach value, and it's never malicious. It's OK!
So is there a way to better concentrate efforts?
If the paths are long, it follows that the community should tackle fewer paths. The path(s) taken should be well defined, charted in advance as much as possible, and not uncovered bit by bit - or the work will take decades.
Growing an entire ecosystem around one path forward (or a few) requires alignment. Can enough trust be fostered in leaders to get people to work on a shared vision?
A vision of what Linux on the desktop should/could converge to is the kind of problem that, if Linux were a company, would be bet-the-company strategic. A company can't afford to go down two paths. So it might lock its smartest people in a room to hash out one true strategy. Or have one smart person dictate one vision and align everyone on it.
Can that be done for FOSS?
In the bounds of a single project it has been proven that it can. But what about an entire ecosystem?
I wonder if that's why open source projects get so much done and at such a high quality with so few people.
Instead of 75% "communication" and 25% work, 90% of the time donated to FOSS is actual work :)
Source on this? There are tons of collaborative, 100% remote companies out there (and they release open source software). I think your assertion may be more that folks aren't as dedicated to open source because contributing is a part-time or hobby thing.
I think it's probably quite different if you're a "core contributor" and likely using additional channels like slack and scheduled meetings, more akin to a company operating.
And while there are lots of desktop environments for Linux you can usually run applications targeting one in any of them (I use Gnome's file manager in Enlightenment as it supports accessing CIFS shares directly).
The Linux kernel and GNU in general are projects that hacked around that problem by just copying the decisions of other people who were coordinated by capitalists (UNIX vendors), which worked long enough to bootstrap the ecosystem until some of the key people could be coordinated by Red Hat and others who monetized indirectly. But at every stage, the coordination was being produced by capitalists even though it was hard to see.
In other places where the mimic-and-support model didn't work, open source really struggled. This is most obvious on the desktop. Even there, ultimately this approach has been adopted for large chunks of it. If you play games on Linux today it's because people copied the Win32 API i.e. the coordination was produced by capitalists like Bill Gates.
Now Alex mentions LSP and JetBrains. The reason JetBrains didn't do the LSP isn't because of value capture. After all, IntelliJ has been open source for a long time. Other IDEs could easily have embedded it and used its plugins. The reason JetBrains use a Java API is because it's a lot more productive and effective to design in-process APIs than network protocols. As long as you aren't crossing runtime boundaries they're easier to write, easier to test, easier to reason about statically (especially w.r.t. concurrency), and much more performant. You can exchange complex object graphs in a shared address space and coordinate them using locks. All this is a highly effective way to extend an IDE.
Microsoft did the LSP because they took a bunch of energetic developers who only wanted to do web development, so they used Electron. Also for reasons of sticking with the crowd, .NET being pretty useless for cross-platform desktop stuff... it's not just that experience with desktop programming is fading away. But browsers were never designed for the challenges of large scale desktop programming, in fact they weren't designed for building apps at all. So they don't let you use threads, static typing via TypeScript is an aftermarket hack, V8 has very low maximum heap sizes, and there are many other challenges with doing a clean JetBrains style architecture. To their credit, the VS Code team leaned into the architectural limits of the browser and did their best to turn it into advantages. They introduced this notion of a backend that could run independently of the frontend using a 'standard' protocol. This is technically not really different to the IntelliJ API being open source, but people like the idea of protocols more than embedding a JVM and using stuff in a company-specific namespace, so that created a lot of community good will and excitement for them at the cost of many technical challenges.
Those challenges are why JetBrains only use the LSP style approach for one of their IDEs, which due to historical reasons doesn't share the same architectural approach as all the others. And it's also why, if you look at the Rider protocol, it's some fairly advanced state sync protocol thing, it's not a plain old HTTP RPC style protocol.
Given that both are open source and both are produced by teams of paid developers working in an office coordinated by capitalists, it's probably not right to identify this as an open source vs proprietary difference. It's purely a technical one to do with JVM vs web as foundational platforms.
> But it is also clear why JetBrains didn’t do LSP — why would they? While the right solution on the technical grounds, you aren’t going to get paid for being technically right.
It's also because a lot of the key people in open source, and senior hackers generally, don't actually use IDEs.
We should encourage more of the younger generation over to powerful configurable editors such as Emacs, rather than locking everybody into VSCode/JetBrains/etc.
Isn't that precisely what LSP facilitates? I was using IDEs for like four years. Now I'm back to (neo)vim and couldn't be happier! There is no substitute for "jump to definition".
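For the curious, that "jump to definition" is a single JSON-RPC request defined by the LSP spec; roughly like this, with a made-up file and position:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "textDocument/definition",
      "params": {
        "textDocument": { "uri": "file:///home/me/project/main.c" },
        "position": { "line": 41, "character": 17 }
      }
    }

Any editor that can speak this gets the feature from any server that implements it.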
http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral...
Since then Microsoft has had no real answer for "how do I write desktop applications for Windows?" other than "use Electron".
(If they were still introducing new widget sets they'd be converting the 'modern' dialogs to something 'postmodern' while still having Win '95 dialogs in there)
Access 97 depends on some Internet Explorer components, and Microsoft has made it all but impossible to install Internet Explorer on the most recent Windows 10 and Windows 11.
Apart from that, I also tried getting Minesweeper and SkiFree to work on Windows 10, and it just straight up refuses to run them with the message "This app can't run on your PC."
https://www.ghacks.net/2023/02/13/reminder-internet-explorer...
Microsoft has been pushing WinUI the past few years, with WinUI 3 being the latest recommended UI toolkit [1]. I think what's interesting about WinUI 3 is it's not built into Windows - you have to ship the whole toolkit with your app, just as you would GTK or Qt on Windows. I find that a perplexing direction for a "native" toolkit.
[1] https://learn.microsoft.com/en-us/windows/apps/winui/winui3/
A 21-year-old bug. XDG is such an obvious improvement, yet getting all projects on board is taking forever.
The “never break the user space” philosophy is limited to the kernel, and the single entity coordinating that realm is called Linus.
Open source is a movement: it's neither an individual nor a committee. And people join it because it is a movement with no central authority.
The article doesn't bring up any critical issue that the world of open source should suddenly deal with - it feels more like a morning rant after his shower (which is also how the post starts). It's basically on the front page because its title is trying to be provocative.
> The past ten years saw a big shift in how we are writing software: baseline level of "interactive static analysis"
While I am making no judgement on what is 'better', the author's choices have impacts, and sometimes not all projects can work within the costs imposed by static analysis.
For example, remember that Rice's theorem generalizes HALT, and that with static analysis you always have to under- or over-approximate, either introducing over-constraints or missing things.
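A toy illustration (hypothetical code): a sound analyzer must either warn about the dereference below even for callers that never pass 42 (over-approximation), or stay silent and miss the one input that triggers it (under-approximation).

    /* Whether the branch executes depends on an arbitrary runtime value,
     * which is exactly the kind of property Rice's theorem says cannot
     * be decided in general. */
    void maybe_deref(int *p, int user_input) {
        if (user_input == 42)
            *p = 0;    /* reachable for exactly one input value */
    }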
It is horses for courses, sometimes the added friction is worth it, other times it is damaging.
Whether Nix is a problem is a context-specific question.
Be careful about making default assumptions and patterns more than what they are, no matter how sensible they are as a default.
KDE (not to be confused with the Plasma desktop) is just a bunch of C++ libraries that can work on a variety of desktop environments and even OSes (though Hotspot, being a perf report alternative, is clearly meant for use with Linux).
I just went and downloaded the latest CI build from[0] and it ran just fine on my openSUSE Tumbleweed, running Xorg with Window Maker. I do have a bunch of KDE apps installed, like Kate (my currently preferred text editor), Dolphin (the file manager I use whenever I want thumbnails, usually for videos and images), Spectacle (for screenshots), Falkon (I use it as a "clean" browser to test things out), etc., so I also have the KDE libraries on my system, but that is just a `zypper install` away. Or an `apt-get install` or `pacman -S` or whatever package manager your distro uses; I've used a bunch of them and they all pretty much behaved the same. I'd expect Hotspot to be installable in the same way in any of them (and I'd expect the AppImage to have these libraries bundled in anyway, so you probably won't need them[1]).
If there are issues with NixOS (I don't know, I haven't tried it), I think it might actually be a NixOS issue and not a KDE issue.
[0] https://github.com/KDAB/hotspot/releases/tag/continuous
[1] EDIT: I checked with --appimage-extract; it contains pretty much everything.
What the author describes is about dominance.
Most OSS is fragmented because of different ideas and different people.
People nag about .NET not having much outside OSS because .NET devs (I am one of them) will not use stuff that doesn't have the MSFT badge.
You don't want such power in general OSS. Lack of coordination is a sign of no dominant entity, and that is the feature.
In contrast, open source software led by the same handful (typically just one guy) of people over years/decades are well coordinated by the BDFL(s) and have a clear direction.
I think the term the author is looking for is "opinionated". The mantra is "if you don't like it, fork it". The apparent lack of coordination is a feature of open source, not a bug.
Successful and popular projects rarely seem uncoordinated. The Linux kernel coordinates thousands of devs over mailing lists. Git was created to facilitate that, and now everyone uses it to coordinate development. I would even dare say modern dev coordination is spearheaded by open source projects.
Why editorialize a question mark into titles like this? What does it signify? We are not talking about a factual assertion of debatable veracity, but a statement of opinion along the lines of "considered harmful". Disagreement and debate is the obvious expected result of such a statement. HN admin using special mod powers to editorialize a title which is neither misleading nor clickbait, simply to indicate skepticism, comes across as petty and not a little hypocritical in light of the "don't editorialize titles" site guideline.
Comments that react purely to titles are shallow, so we don't much want them here. I suppose the goal of nearly all the title edits we do, including this one, is to minimize them.
TechPlasma•7mo ago
Valve is maybe the closest?
hackyhacky•7mo ago
Part of the problem is that "Linux/Unix culture" is very averse to coordination. When someone does try to establish a common baseline, there is inevitable pushback. The classic example is systemd, which fills a desperately needed hole in the Linux ecosystem, but is to this day criticized for being antithetical to the ethos of, I guess, gluing together an operating system with chewing gum and bits of string. The fact is that many users would rather have a pile of software that can be hand-assembled into an OS, instead of an actual cohesive, consistent platform.
So I can't blame people too much for not trying to establish standards. If OSS had created LSP, there would be 20 different incompatible variations, and they would insist "We like it this way."
EDIT: averse, not adverse
Joel_Mckay•7mo ago
The driver support issues are essentially a theological war between FOSS ideals and mystery OEM binaries.
Most of the Linux kernel code is still driver modules and board support packages.
The desktop options have always been a mess of forks and bodged applets to make them useful.
Ubuntu balances the purity of Debian with practical user experience (we could all write a book about UEFI shenanigans). Red Hat focuses more on hardened server use-cases.
Is it worse than Win11? Depends on what you are doing, and what people consider the low bar for messing with users. =3
clipsy•7mo ago
If the hole is desperately needed, why would you want to fill it?
hackyhacky•7mo ago
Good point. Let me rephrase: "Systemd fills a hole in the Linux ecosystem, which desperately needs to be filled." This version of the sentence is more correct and conveniently functions as a double entendre.
j16sdiz•7mo ago
Better integration for the mainstream, sure. But in the end we have less choice.
hackyhacky•7mo ago
This is exactly my point: you want "diverse choices", which is fundamentally at odds with "cohesive functionality."
The article is about LSP, an imperfect standard, but nevertheless a standard. The prioritization of "choice" above all else is why the OSS world is incapable of creating standards.
> systemd killed many projects
The purpose of software is to fulfill a need. Creation of software projects is simply a side-effect of that process. It's good that systemd killed many projects, because those people who had worked on those projects can now work on a problem that hasn't already been solved.
shadowgovt•7mo ago
Choice implies complexity, and there are some places less complexity is quite desirable. I still periodically, when setting up a new Linux machine, have to figure out why the audio frameworks are fighting, for example. The fact that "frameworks" is plural there makes everything harder for me, the end user.
(I compare Python and Node environment management frequently here. Python standardized the protocol for setting up an environment. Wise, better than nothing, but now I have to care whether something is using conda or poetry or one of several other options I don't know. Node has npm. If there's a package, it's in npm. To set up a Node service, use npm. One thing to know, one thing to get good at, one thing to use. Environment management with Node is much easier than in Python.)
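Concretely, the difference in what you have to know (a sketch; the file names are just the common conventions):

    # Node: one tool to learn
    npm install    # resolve and fetch dependencies into ./node_modules
    npm start      # run the service

    # Python: first figure out which tool the project picked
    poetry install
    # or: conda env create -f environment.yml
    # or: python -m venv .venv && pip install -r requirements.txt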
linguae•7mo ago
The difference between open source software and proprietary software is that if users don't like the changes made to proprietary software, their choices are limited to the following:
1. Dealing with the changes even though they don't like it.
2. Sticking to an older version of the software before the changes took place (which can be difficult due to needing to deal with a changing environment and thus is only delaying the inevitable).
3. Switching to an alternative product, if available.
4. Writing an alternative product (which can be a massive undertaking).
Open source software provides additional options:
5. Fork the older version of the software. If enough people maintain this fork, then this becomes a viable alternative to the changed software.
6. Use the new version of the software, but modify it to one's liking.
This is the blessing and the curse of open source software; we have the power to make our own environments, but some software is quite labor-intensive to write, and we need to rely on other people's libraries, systems, and tools to avoid reinventing wheels, but sometimes those dependencies change in ways that we disagree with.
I think the best way to mitigate this is making software easier to develop and more modular, though inevitably there are always going to be disagreements when using dependencies that we don't directly control.
charcircuit•7mo ago
Despite Android's success the rest of the consumer Linux distributions chose to ignore it and continue on with what they were already doing. Trying to have them coordinate around what is succeeding is seemingly impossible.
hackyhacky•7mo ago
I'm not sure I understand you here. What do you think other Linux distros should have done?
charcircuit•7mo ago
Collectively contributing to getting AOSP running on desktops, and then also working on backwards compatibility to be able to package their preexisting apps into Android apps. This would allow for there to be a common app platform for developers to target Linux with.
hackyhacky•7mo ago
As a common target, AOSP isn't a very good one.
AOSP ran on desktops. (Maybe it still does, haven't tried it in a while.) It was still a mobile OS, though, so it wasn't good on the desktop, but it ran.
It also uses very old kernels.
Other than the kernel, the Android UI is completely different from conventional Linux. Any Gnome or Qt app would have to be completely rewritten to support it, and would probably have to run in the JVM.
Basically, if the Linux community followed your plan, they would have to commit a huge effort to port everything to what is essentially a completely different, incompatible OS in every respect except the kernel, and their reward would be to live in subservience to the whims of Google in supporting their product which Google themselves never had enough faith in to make it a proper desktop OS. It seems that the benefit does not justify the investment.
charcircuit•7mo ago
Which is why it would benefit from people who are trying to optimize it, and extend it to offer a good desktop experience.
>It also uses very old kernels.
It's based on the latest LTS release of the kernel.
>Any Gnome or Qt app would have to be completely rewritten to support it
Which is why my comment said that distros would work on backwards compatibility, to avoid the expensive work of requiring a complete rewrite and try to make it as seamless as possible.
>and would probably have to run in the JVM
Android does not use the JVM. It has ART, the Android Runtime, but you can still use native code.
>and their reward would be to live in subservience to the whims of Google in supporting their product which Google themselves never had enough faith in to make it a proper desktop OS
The benefit is being able to reap the fruits of the billions of dollars Google is investing into the OS, along with compatibility with a large number of apps. As a bonus, staple Linux applications might become installable on some of the billion existing Android devices today. Google may not have seen the benefit of supporting the desktop, but that's where smaller players can come in, playing a role in more niche markets where there is less possible return.
charcircuit•7mo ago
Sure the windowing is limited, but it could be extended. I disagree that the IPC is limited though.
>Which means having access to all the ports and coding bespoke protocols. I don't think current android API allows for that.
It's still all open source. The distros could add new APIs to expose new capabilities.
skydhash•7mo ago
Those exist already. With Debian, Alpine, Fedora,... you can put anything on top of the kernel in the userland. Android goes with a restricted version.
It's the same with MacOS. It's Unix, but with proprietary add-ons and systems. And lately, with more restrictions.
charcircuit•7mo ago
By restrictions do you mean having proper capability based security instead of letting all apps have access to everything? These restrictions are a good thing.
hackyhacky•7mo ago
To be fair, Apple and Microsoft have also failed to try to unify desktop UI with mobile.
o11c•7mo ago
Linux has had other major dramas but not failures.
happymellon•7mo ago
As far as I know Google has never accepted patches into Android, so everything has to be maintained outside the project in parallel, which helped kill it.
Google is not your friend, and they will not work with you. Android has diverged several times, and they break everyone else without caring.
pxc•7mo ago
It's a TiVo-ized spyware delivery platform, absolutely riddled with (often non-removable, often installed by entities other than the user) badware.
Android is an abject, dismal failure when it comes to very basic things like empowering users.
charcircuit•7mo ago
I disagree, unless you mean that they care about having their OS copy how UNIX worked 50 years ago.
>It's a TiVo-ized spyware delivery platform
Boiling things down to a pile of buzzwords is not productive especially when they aren't accurate.
"TiVo-ized": Android fully supports a user unlockable bootloader. Such a term doesn't even refer to the operating system, but to the device / bootloader, so it doesn't make sense to describe Android like that.
"spyware delivery": I assume this means that it includes a package manager that can install apps automatically. Several other Linux operating systems support that too. That isn't unique.
>absolute riddled with (often non-removeable, often installed by entities other than the user) badware
It is up to the vendor to pick what software they bundle with the OS. It's not inherent that "badware" has to be bundled.
Your criticisms of Android are not even with the operating system itself, but with downstream versions of it.
yencabulator•7mo ago
Android has made some very interesting technical decisions, and Linux workstations should implement something along the lines of its pervasive sandboxing, for sure. The pretense that Android in itself is something anyone outside of large phone-manufacturing companies should (or can) deal with is a little silly.