frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


Why Twilio Segment moved from microservices back to a monolith

https://www.twilio.com/en-us/blog/developers/best-practices/goodbye-microservices
89•birdculture•2h ago•64 comments

Recovering Anthony Bourdain's (really) lost Li.st's

https://sandyuraz.com/blogs/bourdain/
51•thecsw•2h ago•7 comments

VPN location claims don't match real traffic exits

https://ipinfo.io/blog/vpn-location-mismatch-report
186•mmaia•3h ago•108 comments

I tried Gleam for Advent of Code

https://blog.tymscar.com/posts/gleamaoc2025/
209•tymscar•6h ago•119 comments

I fed 24 years of my blog posts to a Markov model

https://susam.net/fed-24-years-of-posts-to-markov-model.html
51•zdw•3h ago•17 comments

The Rise of Computer Games, Part I: Adventure

https://technicshistory.com/2025/12/13/the-rise-of-computer-games-part-i-adventure/
27•cfmcdonald•3h ago•3 comments

Want to sway an election? Here’s how much fake online accounts cost

https://www.science.org/content/article/want-sway-election-here-s-how-much-fake-online-accounts-cost
90•rbanffy•2h ago•40 comments

Flat-pack washing machine spins a fairer future

https://www.positive.news/society/flat-pack-washing-machine-spins-a-fairer-future/
14•ohjeez•47m ago•1 comment

Useful patterns for building HTML tools

https://simonwillison.net/2025/Dec/10/html-tools/
215•simonw•3d ago•63 comments

Cryptids

https://wiki.bbchallenge.org/wiki/Cryptids
79•frozenseven•1w ago•12 comments

Ask HN: How can I get better at using AI for programming?

157•lemonlime227•7h ago•202 comments

Go Proposal: Secret Mode

https://antonz.org/accepted/runtime-secret/
139•enz•4d ago•60 comments

From Azure Functions to FreeBSD

https://jmmv.dev/2025/12/from-azure-functions-to-freebsd.html
55•todsacerdoti•5d ago•3 comments

TigerBeetle as a File Storage

https://aivarsk.com/2025/12/07/tigerbeetle-blob-storage/
8•aivarsk•6d ago•1 comment

What is the nicest thing a stranger has ever done for you?

https://louplummer.lol/nice-stranger/
265•speckx•2d ago•206 comments

Are we stuck with the same Desktop UX forever? [video]

https://www.youtube.com/watch?v=1fZTOjd_bOQ
73•joelkesler•4h ago•84 comments

EasyPost (YC S13) Is Hiring

https://www.easypost.com/careers
1•jstreebin•6h ago

Researchers seeking better measures of cognitive fatigue

https://www.nature.com/articles/d41586-025-03974-w
95•bikenaga•3d ago•26 comments

A Giant Ball Will Help This Man Survive a Year on an Iceberg

https://www.outsideonline.com/outdoor-adventure/exploration-survival/how-giant-ball-will-help-man...
24•areoform•7h ago•27 comments

Photographer built a medium-format rangefinder

https://petapixel.com/2025/12/06/this-photographer-built-an-awesome-medium-format-rangefinder-and...
156•shinryuu•1w ago•36 comments

Using Python for Scripting

https://hypirion.com/musings/use-python-for-scripting
76•birdculture•5d ago•65 comments

Will West Coast Jazz Get Some Respect?

https://www.honest-broker.com/p/will-west-coast-jazz-finally-get
63•paulpauper•1w ago•40 comments

Pig Video Arcades Critique Life in the Pen (1997)

https://www.wired.com/1997/06/pig-video-arcades-critique-life-in-the-pen/
6•naryJane•5d ago•1 comment

A Lisp Interpreter Implemented in Conway's Game of Life (2021)

https://woodrush.github.io/blog/posts/2022-01-12-lisp-in-life.html
84•pabs3•20h ago•3 comments

Java FFM zero-copy transport using io_uring

https://www.mvp.express/
94•mands•6d ago•42 comments

Purdue University Approves New AI Requirement for All Undergrads

https://www.forbes.com/sites/michaeltnietzel/2025/12/13/purdue-university-approves-new-ai-require...
35•rmason•2h ago•26 comments

GNU Unifont

https://unifoundry.com/unifont/index.html
317•remywang•1d ago•72 comments

Beautiful Abelian Sandpiles

https://eavan.blog/posts/beautiful-sandpiles.html
133•eavan0•4d ago•22 comments

A 'toaster with a lens': The story behind the first handheld digital camera

https://www.bbc.com/future/article/20251205-how-the-handheld-digital-camera-was-born
74•selvan•5d ago•42 comments

Show HN: I made a spreadsheet where formulas also update backwards

https://victorpoughon.github.io/bidicalc/
229•fouronnes3•2d ago•108 comments

Are we stuck with the same Desktop UX forever? [video]

https://www.youtube.com/watch?v=1fZTOjd_bOQ
73•joelkesler•4h ago

Comments

scottjenson•18h ago
I've given dozens of talks, but this one seems to have struck a chord, as it's my most popular video in quite a while. It's got over 14k views in less than a day.

I'm excited so many people are interested in desktop UX!

az09mugen•9h ago
Thanks for that nice talk, it felt like a breeze of fresh air with basic & simple yet powerful but alas "forgotten" concepts of UX.

Will look into your other talks.

calmbonsai•2h ago
I concur, though per my earlier post I do feel "desktop stagnation" is inevitable and we're already there. You were channeling Don Norman https://jnd.org/ in the best of ways.
pjmlp•35m ago
It was quite interesting.
NetOpWibby•32m ago
Fantastic talk, I found myself nodding in agreement a lot. In my research on next-generation desktop interfaces, I was referred to Ink & Switch as well and man, I sure wish they were hiring. I missed out on the Xerox and Bell Labs eras. I'm also reading this book, "Inventing the Future" by John Buck that details early Apple (there's no reason the Jonathan Computer wouldn't sell like hotcakes today, IMHO).

In my downtime I'm working on my future computing concept[1]. The direction I'm going for the UI is context awareness and the desktop being more of an endless canvas. I need to flesh out my ideas into code one of these days.

P.S. Just learned we're on the same Mastodon server, that's dope.

---

[1]: https://systemsoft.works

joelkesler•4h ago
Great talk about the future of desktop user-interfaces.

“…Scott Jenson gives examples of how focusing on UX -- instead of UI -- frees us to think bigger. This is especially true for the desktop, where the user experience has so much potential to grow well beyond its current interaction models. The desktop UX is certainly not dead, and this talk suggests some future directions we could take.”

“Scott Jenson has been a leader in UX design and strategic planning for over 35 years. He was the first member of Apple’s Human Interface group in the late '80s, and has since held key roles at several major tech companies. He served as Director of Product Design for Symbian in London, managed Mobile UX design at Google, and was Creative Director at frog design in San Francisco. He returned to Google to do UX research for Android and is now a UX strategist in the open-source community for Mastodon and Home Assistant.”

rolph•4h ago
The problem is with pushing a UX at users and enforcing that model even when the user changes it to something comfortable, when you should be looking at what the users are throwing away and what they are replacing it with.

MS is a prime example. Don't do what MS has been doing: remember whose hardware it actually is, and remain aware that what a developer and a board room understand as an improvement is not experienced the same way by average retail consumers.

fortyseven•3h ago
You know, sometimes things just work. They get whittled away at until we end up with a very refined endpoint. Just look at cell phones. Black rectangles as far as the eye can see. For good reason. I'm not saying don't explore new avenues (foldables, etc.), but it's perfectly fine to settle into a metaphor that just works.
7thaccount•2h ago
The Windows 95-XP taskbar is good. Everything else has been downhill.
pdonis•2h ago
I use Trinity Desktop on Linux because it's basically the same as the Windows 95-XP taskbar interface, and has no plans to change.
analogpixel•2h ago
Why didn't Star Trek ever tackle the big issues, like them constantly updating the LCARS interface every few episodes to make it better, or having Geordi La Forge re-writing the warp core controllers in Rust?
AndrewKemendo•2h ago
Because it’s a fantasy space opera show that has nothing to do with reality
thaumaturgy•2h ago
Because, something that a lot of tech-obsessed Trek fans never seem to really come to terms with, is that Trek didn't fetishize technology.

In the Trek universe, LCARS wasn't getting continuous UI updates because they would have advanced, culturally, to a point where they recognized that continuous UI updates are frustrating for users. They would have invested the time and research effort required to better understand the right kind of interface for the given devices, and then... just built that. And, sure, it probably would get updates from time to time, but nothing like the way we do things now.

Because the way we do things now is immature. It's driven often by individual developers' needs to leave their fingerprints on something, to be able to say, "this project is now MY project", to be able to use it as a portfolio item that helps them get a bigger paycheck in the future.

Likewise, Geordi was regularly shown to be making constant improvements to the ship's systems. If I remember right, some of his designs were picked up by Starfleet and integrated into other ships. He took risks, too, like experimental propulsion upgrades. But, each time, it was an upgrade in service of better meeting some present or future mission objective. Geordi might have rewritten some software modules in whatever counted as a "language" in that universe at some point, but if he had done so, he would have done extensive testing and tried very hard to do it in a way that wouldn't've disrupted ship operations, and he would only do so if it gained some kind of improvement that directly impacted the success or safety of the whole ship.

Really cool technology is a key component of the Trek universe, but Trek isn't about technology. It's about people. Technology is just a thing that's in the background, and, sometimes, becomes a part of the story -- when it impacts some people in the story.

amelius•2h ago
I still wonder why not everybody was lingering in the holodeck all the time.

(equivalent of people being glued to their smartphones today)

(Related) This is one explanation for the Fermi paradox: Alien species may isolate themselves in virtual worlds

https://en.wikipedia.org/wiki/Fermi_paradox

RedNifre•51m ago
The lack of capitalism meant that the holodeck program authors had no need to optimize their programs for user retention to show them more ads. So far fewer people suffer from holodeck addiction in Star Trek than are glued to their screens in our world.
d3Xt3r•50m ago
Most likely because this was a star ship (or space station) with a limited number of personnel, all of whom have fixed duties that need to be done. You simply can't afford to waste your time away in holodecks.

The people we saw on screen most of the time also held important positions on the ship (especially the bridge, or engineering) and you can't expect them to just waste significant chunks of time.

Also, don't forget that these people actually like their jobs. They got there because they sincerely wanted to, out of personal interest and drive, and not because of societal pressures like in our present world. They already figured out universal basic income and are living in an advanced self-sufficient society, so they don't even need a job to earn money or live a decent life - these people are doing their jobs because of their pure, raw passion for that field.

Mistletoe•2h ago
Isn't it probably just that they don't really have money in Star Trek so there is no contract promising amazing advances in the LCARS if we just pay this person or company to revamp it? If someone has money to be made from something they will always want to convince you the new thing is what you need.
krapp•2h ago
Remember that in Star Trek humans have evolved beyond the desire to work for money or personal gain, so everyone just volunteers their time, and somehow this just always works.
jfengel•2h ago
Most of Trek's tech is just a way to move the story along. Transporters were introduced to avoid having to land a shuttle. Warp drive is just a way to get to the next story. Communicators relay plot points.

Stories which focus on them as technology are nearly always boring. "Oh no the transporter broke... Yay we fixed it".

krapp•2h ago
>In the Trek universe, LCARS wasn't getting continuous UI updates because they would have advanced, culturally, to a point where they recognized that continuous UI updates are frustrating for users.

Not to be "that guy" but LCARS wasn't getting continuous UI updates because that would have cost the production team money, and for TNG at least would have often required rebuilding physical sets. It does get updated between series, as part of setting the design language for each series.

And Geordi was shown constantly making improvements to the ship's systems because he had to be shown "doing engineer stuff."

dragonwriter•1h ago
> In the Trek universe, LCARS wasn't getting continuous UI updates

In the Trek universe, LCARS was continuously generating UI updates for each user, because AI coding had reached the point that it no longer needs specific direction, and it responds autonomously to needs the system itself identifies.

bena•1h ago
LCARS was technically a self-adapting system that was personalized to a degree per user. So it was continuously updating itself. But in a way to reduce user frustration.

Now, this is really because LCARS is "Stage Direction: Riker hits some buttons and stuff happens".

cons0le•22m ago
>Because the way we do things now is immature. It's driven often by individual developers' needs to leave their fingerprints on something, to be able to say, "this project is now MY project", to be able to use it as a portfolio item that helps them get a bigger paycheck in the future.

AKA resume-driven development. I personally know several people working on LLM products who, in private, admit they think LLMs are scams.

calmbonsai•1h ago
Trek needs to visibly "sci-fi-up" extant tech in order to have the poetic narrative license to tell its present-day parables.

Things just need to "look futuristic". They don't actually need to have practical function outside whatever narrative constraints are imposed in order to provide pace and tension to the story.

I forget who said it first, but "Warp is really the speed of plot".

Findecanor•1h ago
I have often thought that Star Trek is supposed to show a future in which computer technology and user interfaces have evolved to a steady state that don't need to change that much, and which is superior to our own in ways that we don't yet understand. And because it hasn't been invented yet, the show does not invent it either.

It is for the audience to imagine that those printed transparencies, back-lit with light bulbs behind coloured gel, are the most intuitive, easy-to-use, precise user interfaces the actors pretend they are.

JuniperMesos•1h ago
Man, I should hope that the warp core controllers on the USS Enterprise were not written in C.

On the other hand, if the writers of Star Trek The Next Generation were writing the show now, rather than 35-40 years ago - and therefore had a more expansive understanding of computer technology and were writing for an audience that could be relied upon to understand computers better than was actually the case - maybe there would've been more episodes involving dealing with the details of Future Sci-Fi Computer Systems in ways a programmer today might find recognizable.

Heck, maybe this is in fact the case for the recently-written episodes of Star Trek coming out in the past few years (that seem to be much less popular than TNG, probably because the entire media environment around broadcast television has changed drastically since TNG was made). Someone who writes for television today is more likely to have had the experience of taking a Python class in middle school than anyone writing for television decades ago (before Python existed), and maybe something of that experience might make it into an episode of television sci-fi.

As an additional point, my recollection is that the LCARS interface did in fact look slightly different over time - in early TNG seasons it was more orange-y, and in later seasons/Voyager/the TNG movies it generally had more of a purple tinge. Maybe we can attribute this in-universe to a Federation-wide UX redesign (imagine throwing in a scene where Barclay and La Forge are walking down a corridor having a friendly argument about whether the new redesign is better or worse immediately before a Red Alert that starts the main plot of the episode!). From a television production standpoint, we can attribute this to things like "the set designers were actually trying to suggest the passage of time and technology changing in the context of the show", or "the set designers wanted to have fun making a new thing" or "over the period of time that the 80s/90s incarnations of Star Trek were being made, television VFX technology itself was advancing rapidly and people wanted to try out new things that were not previously possible" - all of which have implications for real-world technology as well as fake television sci-fi technology.

RedNifre•42m ago
Because the LCARS GUI is only for simple recurring tasks, so it's easy to find an optimal interface.

Complex tasks are done vibe coding style, like La Forge vibe video editing a recording to find an alien: https://www.youtube.com/watch?v=4Faiu360W7Q

I do wonder if conversational interfaces will put an end to our GUI churn eventually...

rzerowan•34m ago
Mostly I believe it's that the writers envisioned, and were able to worldbuild in such a way, that the tech was not a subject but rather part of the scenery/background, with the main object being the people and their relationships. Additionally, in some cases where alien tech was interfaced with the characters, some UI/code rewrites were written into the story; for example in DS9, where the Cardassian interfaces/AI are frustrating to Chief O'Brien, and his efforts to remedy/upgrade them get a recurring role in the story.

Conversely, recent versions have taken the view of foregrounding tech, aided with flashy CGI, to handwave through a lot. Basically using it as a plot device when the writing is weak.

ares623•2h ago
Are we stuck with the same toothbrush UX forever?
LeFantome•2h ago
I feel like toothbrush UX has improved quite a bit.
AndrewKemendo•2h ago
Toothbrush UX is the same today as it was when we were hunter gatherers: use an abrasive tool to ablate plaque from the teeth and gums without removing enamel

https://www.youtube.com/watch?v=zMuTG6fOMCg

The variety of form factors offered are the only difference

jrowen•2h ago
Yes, whittling down a stick is pretty much the same experience as using an electric toothbrush. Or those weird mouthguard things they have now.

I don't think most people would find this degree of reduction helpful.

AndrewKemendo•2h ago
> Yes, whittling down a stick is pretty much the same experience as using an electric toothbrush

Correct? I agree with this precisely but assume you’re writing it sarcastically

From the point of view of the starting state of the mouth to the end state of the mouth the USER EXPERIENCE is the same: clean teeth

The FORM FACTOR is different: Electric version means ONLY that I don’t move my arm

“Most people” can’t do multiplication in their head so I’m not looking to them to understand

echoangle•1h ago
That’s just not what user experience means, two products having the same start and end state doesn’t mean the user experience is the same. Imagine two tools, one a CLI and one a GUI, which both let you do the same thing. Would you say that they by definition have the same user experience?
AndrewKemendo•1h ago
If you drew both brushing processes as a UML diagram the variance would be trivial

Now compare that variance to the variance options given with machine and computing UX options

you’ll see clearly that one (toothbrushing) is less than one stdev different in steps and components for the median use case and one (computing) is nearly infinite variance (no stable stdev) between median use case steps and components.

The fact that the latter state space manifold is available but the action space is constrained inside a local minimum is an indictment of the capacity for action space traversal by humans.

This is reflected again with what is a point action space (physically ablate plaque with abrasive) in the possible state space of teeth cleaning for example: chemical only/non ablative, replace teeth entirely every month, remove teeth and eat paste, etc…

So yes I collapsed that complexity into calling it “UX” which classically can be described via UML

mrob•57m ago
As somebody who's tried using a miswak [0] teeth-cleaning twig out of curiosity, I can say with confidence it's not the same experience as using a modern toothbrush. It's capable of cleaning your teeth effectively, but it's slower and more difficult than a modern toothbrush. The angle of the bristles makes a huge difference. When the bristles face forward like with a teeth-cleaning twig, your lips get in the way a lot more. Sideways bristles are easier to use.

[0] https://en.wikipedia.org/wiki/Miswak

yearolinuxdsktp•2h ago
It’s changed, but is a wash:

On the positive side, my electronic toothbrush allows me to avoid excessive pressure via real-time green/red light.

On the negative side, it guilt trips me with a sad face emoji any time my brushing time is under 2 minutes.

calmbonsai•1h ago
I can imagine some sort of car-wash-like partial mouth insertion interface (think "smart cleaner/retainer"), but it would be cost-prohibitive and, likely, not offer any appreciable cleaning benefits.
ErroneousBosh•52m ago
I was going to say "are we stuck with the same bicycle UX forever".

Because we've been stuck with the same bicycle UX for like 150 years now.

Sometimes shit just works right, just about straight out of the gate.

esafak•30m ago
This is what bicycles originally looked like: https://en.wikipedia.org/wiki/Velocipede#/media/File:Velocip...
ErroneousBosh•21m ago
Yes, something like 200 years ago.

By the 1870s we'd pretty much standardised on the "Safety Bicycle", which had a couple of smallish wheels, about two and a half feet in diameter in olden-days measurements, with a chain drive from a set of pedals mounted low in the frame to the rear wheel.

By the end of the 1880s, you had companies mass-producing bikes that wouldn't look unreasonable today. All we've done since is make them out of lighter metal, improve the brakes from pull rods to cables to hydraulic disc brakes, and give them more gears (it wouldn't be until the early 1900s that the first hub gears became available, with - perhaps surprisingly - derailleurs only coming along 100 years ago).

https://en.wikipedia.org/wiki/Safety_bicycle

esafak•36m ago
There are electric-, ultrasonic-, mouthpiece-, and irrigating toothbrushes...

Maybe the experience has not changed for the average person, but alternatives are out there.

AndrewKemendo•2h ago
The computer form factor hasn't changed since the mainframe: look into a screen for where to give input, select visual icons via a pointer, type text via keyboard into a text entry box, hit an action button, receive the result, repeat

it’s just all gotten miniaturized

Humans have outright rejected all other possible computer form factors presented to them to date including:

Purely NLP with no screen

head worn augmented reality

contact lenses,

head worn virtual reality

implanted touch sensors

etc…

Every other possible form factor gets shit on, on this website and in every other technology newspaper.

This is despite almost a century of attempts at doing all those and making zero progress in sustained consumer penetration.

Had people liked those form factors they would’ve been invested in them early on, such that they would develop the same way the laptops and iPads and iPhones and desktops have evolved.

However nobody’s even interested at any type of scale in the early days of AR for example.

I have a litany of augmented and virtual reality devices scattered around my home and work that are incredibly compelling technology - but are totally seen as straight up dogshit from the consumer perspective.

Like everything it’s not a machine problem, it’s a human people in society problem

nkrisc•2h ago
> Purely NLP with no screen

Cumbersome and slow with horrible failure recovery. Great if it works, huge pain in the ass if it doesn't. Useless for any visual task.

> head worn augmented reality

Completely useless if what you're doing doesn't involve "augmenting reality" (editing a text document), which probably describes most tasks that the average person is using a computer for.

> contact lenses

Effectively impossible to use for some portion of the population.

> head worn virtual reality

Completely isolates you from your surroundings (most people don't like that) and difficult to use for people who wear glasses. Nevermind that currently they're heavy, expensive, and not particularly portable.

> implanted sensors

That's going to be a very hard sell for the vast majority of people. Also pretty useless for what most people want to do with computers.

The reason these different form factors haven't caught on is because they're pretty shit right now and not even useful to most people.

The standard desktop environment isn't perfect, but it's good and versatile enough for what most people need to do with a computer.

AndrewKemendo•2h ago
And most computers were entirely shit in the 1950s

yet here we are today

You must’ve missed the point: people invested in desktop computers when they were shitty vacuum tubes that blew up.

That still hasn’t happened for any other user experience or interface.

> it's good and versatile enough for what most people need to do with a computer

Exactly correct! Like I said, it’s a limitation of human society: the capabilities and expectations of regular people are so low and diffuse that there is not enough collective intelligence to manage a complex interface that would measurably improve your abilities.

Said another way, it’s the same as if a baby could never “graduate” from Duplo blocks to Lego because lego blocks are too complicated

AnimalMuppet•1h ago
I do not see laptop computers as the same form factor as mainframes. At. All.

Even more, I don't see phones as the same form factor as mainframes.

mcswell•1h ago
Since mainframes, you say. Well, sonny, when I first learned programming on a mainframe, we had punch cards and fan-fold printouts. Nothing beats that, eh?
immibis•1h ago
Phone UIs are still screen UIs, but they are not desktop UIs, and that's not because of the shape of the device.
AndrewKemendo•56m ago
Tell me how that’s not a phone and a desktop:

https://www.instagram.com/reel/DPtvpkSExfA/

calmbonsai•2h ago
For desktops, basically, yes. And that's OK.

Take any other praxis that's reached the 'appliance' stage that you use in your daily life from washing machines, ovens, coffee makers, cars, smartphones, flip-phones, televisions, toilets, vacuums, microwaves, refrigerators, ranges, etc.

It takes ~30 years to optimize the UX to make it "appliance-worthy" and then everything afterwards consists of edge-case features, personalization, or regulatory compliance.

Desktop Computers are no exception.

Hammershaft•2h ago
All of the other examples you gave are products constrained by physical reality with a small set of countable use-cases. I don't think computer operating systems are simply mature, appliance-like products that have been optimized down to their current design. I think there is a lot of potential that hasn't been realized, because the very few players in the operating system space have been hill-climbing towards a local maximum set by path dependence 40 years ago.
calmbonsai•1h ago
To be precise, we're talking about "Desktop Computers" and not the more generic "information appliances".

For example, we're not remotely close to having a standardized "watch form-factor" appliance interface.

Physical reality is always a constraint. In this case, keyboard+display+speaker+mouse+arms-length-proximity+stationary. If you add/remove/alter _any_ of those 6 constraints, then there's plenty of room for innovation, but those constraints _define_ a desktop computer.

pegasus•43m ago
That's just the thing, desktops computers have always been in an important way the antithesis of a specialized appliance, a materialization of Turing's dream of the Universal Machine. It's only in recent years that this universality has come under threat, in the name of safety.
mrob•1h ago
I can think of two big improvements to desktop GUIs:

1. Incremental narrowing for all selection tasks like the Helm [0] extension for Emacs.

Whenever there is a list of choices, all choices should be displayed, and this list should be filterable in real time by typing. This should go further than what Helm provides, e.g. you should be able to filter a partially filtered list in a different way. No matter how complex your filtering, all results should appear within 10 ms or so. This should include things like full text search of all local documents on the machine. This will probably require extensive indexing, so it needs to be tightly integrated with all software so the indexes stay in sync with the data.

2. Pervasive support for mouse gestures.

This effectively increases the number of mouse buttons. Some tasks are fastest with keyboard, and some are fastest with mouse, but switching between the two costs time. Increasing the effective number of buttons increases the number of tasks that are fastest with mouse and reduces need for switching.

[0] https://emacs-helm.github.io/helm/
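The "incremental narrowing" idea above can be sketched in a few lines. This is a hypothetical illustration of the interaction pattern, not Helm's actual implementation; the `narrow` function and all names are invented:

```python
# Sketch of incremental narrowing: each keystroke re-filters the candidate
# list, and an already-filtered list can itself be narrowed again by a
# different query. (Illustrative only; Helm's real matching is richer.)

def narrow(candidates, query):
    """Keep candidates containing every space-separated term, case-insensitively."""
    terms = query.lower().split()
    return [c for c in candidates if all(t in c.lower() for t in terms)]

commands = ["Open File", "Open Recent", "Save File", "Close Window"]

# First pass: the user has typed "open"
step1 = narrow(commands, "open")    # ["Open File", "Open Recent"]

# Second pass narrows the already-filtered list with a different term
step2 = narrow(step1, "recent")     # ["Open Recent"]
```

A real desktop-wide version would need the indexing mrob describes, so that full-text results also land within the ~10 ms budget.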

danans•12m ago
> Take any other praxis that's reached the 'appliance' stage that you use in your daily life from washing machines, ovens, coffee makers, cars ...

I wish the same could be said of car UX these days but clearly that has regressed away from optimal.

sprash•2h ago
Unpopular take: Windows 95 was the peak of Desktop UX.

GUI elements were easily distinguishable from content and there was 100% consistency down to the last little detail (e.g. right click always gave you a meaningful context menu). The innovations after that are tiny in comparison and more opinionated (things like macos making the taskbar obsolete with the introduction of Exposé).

fragmede•1h ago
Heh, given the number of points you've probably gotten for that comment, I don't think it's that unpopular. Win 98 was my jam, though it looks hella dated today. As you said, buttons were clearly marked, but menus were also navigable via keyboard, there was some support for themes and custom coloring, and UIs were designable via a GUI builder in VB or Visual Studio using MFC, which was very resource-friendly compared to using Electron today. Smartphones and tablets, and even today's wide variety of screen sizes, didn't exist yet, so it was a simpler time. I can't believe how much of a step back Electron is for UI creation compared to MFC, though MFC wasn't cross-platform, and elements were usually absolutely positioned instead of the relative, resizable layout that's required today.
mattkevan•2h ago
Really interesting. Going to have to watch in detail.

I’m in the process of designing an os interface that tries to move beyond the current desktop metaphor or the mobile grid of apps.

Instead it’s going to use ‘frames’ of content that are acted on by capabilities that provide functionality. Very much inspired by Newton OS, HyperCard and the early, pre-Web thinking around hypermedia.

A newton-like content soup combined with a persistent LLM intelligence layer, RAG and knowledge graphs could provide a powerful way to create, connect and manage content that breaks out of the standard document model.
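As I read the frames-and-capabilities idea, it could be sketched roughly like this. Everything here is invented for illustration (the class names, the tag scheme, the dispatch rule); it is not from Newton OS or any real design:

```python
# Hypothetical sketch: content lives in untyped "frames" in a soup, and
# "capabilities" attach behaviour to any frame whose content they can handle.

from dataclasses import dataclass, field

@dataclass
class Frame:
    content: str                          # a piece of content in the soup
    tags: set = field(default_factory=set)

class Capability:
    def applies_to(self, frame):
        raise NotImplementedError
    def act(self, frame):
        raise NotImplementedError

class Translate(Capability):
    """Example capability: only offers itself for text-tagged frames."""
    def applies_to(self, frame):
        return "text" in frame.tags
    def act(self, frame):
        return f"translated({frame.content})"

note = Frame("hello world", tags={"text"})
capabilities = [Translate()]

# The system, not the user, matches capabilities to frames
actions = [c.act(note) for c in capabilities if c.applies_to(note)]
```

The point of the sketch is the inversion: instead of an app owning a document, any capability that matches a frame can act on it.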

linguae•2h ago
I enjoyed this talk, and I want to learn more about the concept of “learning loops” for interface design.

Personally, I wish there were a champion of desktop usability like how Apple was in the 1980s and 1990s. I feel that Microsoft, Apple, and Google lost the plot in the 2010s due to two factors: (1) the rise of mobile and Web computing, and (2) the realization that software platforms are excellent platforms for milking users for cash via pushing ads and services upon a captive audience. To elaborate on the first point, UI elements from mobile and Web computing have been applied to desktops even when they are not effective, probably to save development costs, and probably since mobile and Web UI elements are seen as “modern” compared to an “old-fashioned” desktop. The result is a degraded desktop experience in 2025 compared to 2009 when Windows 7 and Snow Leopard were released. It’s hamburger windows, title bars becoming toolbars (making it harder to identify areas to drag windows), hidden scroll bars, and memory-hungry Electron apps galore, plus pushy notifications, nag screens, and ads for services.

I don’t foresee any innovation from Microsoft, Apple, or Google in desktop computing that doesn’t have strings attached for monetization purposes.

The open-source world is better positioned to make productive desktops, but without coordinated efforts, it seems like herding cats, and it seems that one must cobble together a system instead of having a system that works as coherently as the Mac or Windows.

With that said, I won’t be too negative. KDE and GNOME are consistent when sticking to Qt/GTK applications, respectively, and there are good desktop Linux distributions out there.

gtowey•2h ago
It's because companies are no longer run by engineers. The MBAs and accountants are in charge, and they couldn't care less about making good products.

At Microsoft, Satya Nadella has an engineering background, but it seems like he didn't spend much time as an engineer before getting an MBA and playing the management advancement game.

Our industry isn't what it used to be, and I'm not sure it ever could be again.

Normal_gaussian•1h ago
It's great to hear from someone who thinks these people still care! It has rarely been my experience, but I haven't been everywhere yet.
linguae•59m ago
I feel a major shift happened in the 2010s. The tech industry became less about making the world a better place through technology and more about how to best leverage power to make as much money as possible, making the world a better place be damned.

This also came at a time when tech went from being considered a nerdy obsession to being a prestigious career choice, much like how law and medicine are viewed.

Tech went from being a sideshow to the main show. The problem is once tech became the main show, this attracts the money- and career-driven rather than the ones passionate about technology. It’s bad enough working with mercenary coworkers, but when mercenaries become managers and executives, they are now the boss, and if the passionate don’t meet their bosses’ expectations, they are fired.

I left the industry and I am now a tenure-track community college professor, though I do research during my winter and summer breaks. I think there are still niches where a deep love for computing without being overly concerned about “stock line go up” metrics can still lead to good products and sustainable, if small, businesses.

jack_tripper•39m ago
>The tech industry became less about making the world a better place through technology

When the hell was even that?

eek2121•1h ago
For the same reason we don't reinvent the wheel. Or perhaps, the same reason we don't constantly change how something like a vehicle works. It works well, and introducing something new means a learning curve that 99% of folks won't want to deal with, so at that point, you are designing something new for the other 1% of folks willing to tackle it. Unless it's an amazing concept, it won't take off.
ErroneousBosh•52m ago
> Or perhaps, the same reason we don't constantly change things like a vehicle.

Are we stuck with the same brake pedal UX forever?

whatever1•1h ago
Desktop is dead. Gamers will move to consoles and Valve-like platforms. The rest of productivity is done in a single-window browser anyway. LLMs will accelerate this.

Coders are the only ones who should still be interested in desktop UX, but even in that segment many just need a terminal window.

snovv_crash•1h ago
For content consumption sure.

For content creation though, desktop still rules.

immibis•1h ago
Sounds like a dead market. Nobody needs to create content any more now that we have AI.
linguae•1h ago
Is it dead because people don’t want the desktop, or is it dead because Big Tech won’t invest in the desktop beyond what’s necessary for their business?

Whether intentional or not, it seems like the trend is increasingly locked-down devices running locked-down software, and I’m also disturbed by the prospect of Big Tech gobbling up hardware (see the RAM shortage, for example), making it unaffordable for regular people, and then renting this hardware back to us in the form of cloud services.

It’s disturbing and I wish we could stop this.

xnx•33m ago
Desktop is all about collaboration and interaction with other apps. The ideal of every contemporary SaaS is that you can never download your "files" so you stay locked in.
hollerith•1h ago
>productivity is done on a single window browser anyway

When I need to get productive, sometimes I disable the browser to stop myself from wasting time on the web.

whatever1•1h ago
And you likely open the browser that happens to be called VS Code, Figma, etc.
hollerith•54m ago
The point though is that my vscode window does not have an address bar I can use to visit Youtube or Pornhub at any time.

I guess the larger point is that you need a desktop to run vscode or Figma, so the desktop is not dead.

sprash•1h ago
It's not dead. It's being murdered. Microsoft, Apple, GNOME and KDE are making the experience worse with each update. Productive work becomes a chore. And the last thing we need is more experiments. We need more performance, responsiveness and consistency, and less latency. Everything got worse on all four points for every desktop environment, despite hardware getting faster by several orders of magnitude.

This also means that I heavily disagree with one of the presenter's points. We should not use next-gen hardware to develop the future desktop. That is the most nonsensical thing I've heard all day. We need to focus on the basics.

silisili•46m ago
I agree with this. I remember when GNOME 3 came out, there were a lot of legitimate complaints that were handwaved away by the developers as "doesn't work well on a mobile interface", despite GNOME having approximately zero installs on anything mobile. AFAICT that probably hasn't changed, all these years later.
kvemkon•7m ago
> Gnome

I can't imagine what I'd be doing without MATE (GNOME 2 fork ported to GTK+ 3).

Recently I've stumbled upon:

> I suspect that distro maintainers may feel we've lost too many team members so are going with an older known quantity. [1]

This sounds disturbing.

[1] https://github.com/mate-desktop/caja/issues/1863#issuecommen...

migueldeicaza•1h ago
Scrubbed the talk, saw “M$” in a slide, flipped the bozo bit
immibis•1h ago
I don't want to see what any of today's companies would come up with to replace the desktop. Microsoft has tried a few times and they all sucked.
xnx•55m ago
I felt rage baited when he crossed out Jakob Nielsen and promoted Ed Zitron (https://youtu.be/1fZTOjd_bOQt=1852). Bad AI is not good UI, but objecting based on AI being "not ethically trained" and "burning the planet" aren't great reasons.
GaryBluto•47m ago
https://www.youtube.com/watch?v=1fZTOjd_bOQ&t=1852s You're missing the ampersand.

It's really strange how he spins off on this mini-rant about AI ethics towards the end. I clicked on a video about UI design.

xnx•37m ago
Same. AI is absolutely the future of human-computer interaction (exactly the article from Jakob Nielsen that he crossed out). Even the father of WIMP, Douglas Engelbart, thought it was flawed: "Here's the language they're proposing: You point to something and grunt." AI finally gives us the chance to instruct computers the way we instruct humans.
jhhh•44m ago
I understand the desire to fix user pain points. There are plenty to choose from. I think the problem is that most UI changes don't seem to fix any particular issue I have. They are just different, and when some changes do create even more problems, there's never any configuration to disable them. You're trying to create a perfect, coherent system for everyone, absent the ability to configure it to our liking. He even mentioned how unpopular making things configurable is in the UI community.

A perfect pain point example was mentioned in the video: text selection on mobile is trash. But each app seems to have a different solution, even from the same developer. Google Messages doesn't allow any text selection at a granularity finer than an entire message. Some other apps have opted into a 'smart' text select which, when you select text, will guess and randomly group-select adjacent words. And lastly, some apps will only ever select a single word when you double-tap, which seemed to be the standard on mobile for a long time. All of this is inconsistent, and often I'll want to do something like look up a word and realize I can't select the word at all (Google Messages), or the system 'smartly' selected four words instead, or that it did what I wanted and actually just picked one word. Each application designer decided they wanted to make their own change and made the whole system fragmented and worse overall.

DonHopkins•28m ago
Golan Levin quotes Joy Mountford in his "TED Talk, 2009: Art that looks back at you":

>A lot of my work is about trying to get away from this. This a photograph of the desktop of a student of mine. And when I say desktop, I don't just mean the actual desk where his mouse has worn away the surface of the desk. If you look carefully, you can even see a hint of the Apple menu, up here in the upper left, where the virtual world has literally punched through to the physical. So this is, as Joy Mountford once said, "The mouse is probably the narrowest straw you could try to suck all of human expression through." (Laughter)

https://flong.com/archive/texts/lectures/lecture_ted_09/inde...

https://en.wikipedia.org/wiki/Golan_Levin

https://www.flong.com/

https://en.wikipedia.org/wiki/Joy_Mountford

https://www.joymountford.com/

christophilus•20m ago
No, we’re not. Niri + Dank Material Shell is a different and mostly excellent approach.