EDIT: the snow stops, _but only after a few seconds_. I toggled it to find out whether it would turn off the snow, and the immediate feedback was that it doesn't. That's really _really_ bad UX.
fwiw I went through the whole article with the snow effect on and found it just fine too.
And it's actually making a serious point about something I care about, so I actually wanted to read the article (which I did through the snow).
This is such a common failure mode with coders who do their own design. Just because you can do something doesn't mean you should.
The article is good but the choice of distracting snowflakes or radioactive piss burning your retina is not a welcome one.
1. I scrolled through the article getting more and more frustrated with the snow
2. I scrolled all the way back to the top and saw the snowflake icon
3. I clicked the snowflake, saw the hideous yellow, said WTF and clicked again to go back to blue
4. **I never noticed** that the snowflake *does* stop the snow, but *only* stops *new* snow, so the existing snow continues to fall across the screen
5. I clicked several other things, then came here to complain and saw this thread
> although then you get a yellow background
Yes, and this is arguably worse. I ended up using Immersive Reader mode in Edge to read the page. The whole site design undermined the point being made. One of the first things mentioned is not to be distracting. Yet they went out of their way to make their own site distracting. "Do as I say, not as I do."
Edit: I'm not implying it's a good thing.
It's a joke that did not land. The audience is trying to read text, and he's making it annoying to read text.
It's like a stand-up comedian telling a joke in an accent so wild that the audience cannot discern what he's saying.
[1] https://web.archive.org/web/20251207071946/https://tonsky.me...
I'd imagine he thought the blog was mostly going to be about UI. "Haha, a blog talking about UI, with bad UI. Isn't that funny!"
Like there's no way the flashlight UI was done for usability purposes. It has to be a joke.
Funny, I disliked this exact detail. I thought turning it off hadn't worked for a few seconds, and I retoggled it on and off a bunch of times before I got it.
And yes, I did think "this is terrible, there must be a way to change it", clicking the snowflake icon. The background changed to a new colour, but otherwise nothing seemed to change, so I just clicked back.
Because, as you noted, the snowflakes slowly end, which I didn't realize until seeing your comment.
It's fun. Looks neat. It's an extremely poor idea for a site trying to convey textual information.
They already did, as the grandparent comment already explained:
>> TBF, you can turn it off by clicking on the snowflake icon in the top right corner.
TBF, I felt so perfectly trolled with this one I couldn't help but chuckle... :)
Also do check his privacy policy https://tonsky.me/personal-information/
<blink>
Thank goodness for Firefox reader mode. That animation is so incredibly distracting.
I can't stand animations while I'm trying to read something, and this one is particularly egregious.
(TBF, it slowly fades the animation out, probably for aesthetic reasons, to avoid a jarring sudden stop. I do agree, though, that a sudden stop would probably be more appropriate in this context)
Let me introduce you to "neko"...
There's also JS to add a neko to a webpage.
Cute, but also not good when trying to focus on reading.
I’m trying to read but my eyes keep jumping to the movement and I lose my place.
I understand people think it’s “fun” but I think it’s just so disrespectful to the reader.
Yes, it is a bit hypocritical, but you can look at the content of the message and judge it without judging the presentation of the message, even if it talks about usability of interfaces in computer software.
Even as the worst UX expert in the world, you can obviously feel free to criticize others, but you're probably going to lose a lot of people after the first sentence if you're using 2003 MySpace-style blinking text and animated GIFs to make your point.
But if a 4-year-old boy finds "there's more of them with bigger guns", and a general has a personal interest in hurting someone without your knowing it, you'd be unwise not to consider the words of the boy as you prep your military strategy.
Note that you were careful to establish hard-to-prove circumstances ("served in similar circumstances"), which seems to say that you don't want to discount what the non-expert is saying too easily either.
Whether or not the author cares will certainly be influenced by the fact it’s just a personal blog. I wouldn’t expect them to change anything for that reason alone, but the criticism stands nonetheless.
it absolutely would not; it would be more akin to someone wearing a fat suit for a joke and criticizing someone for running with bad form
but you are taking this so seriously I can't quite tell if you're joking anyway
And it got noticeably warm.
That explains my other comment, which speculated that the snow was the cause of my iPhone instantly overheating, followed by screen-dimming throttling.
Also: this is not a plea to stop putting snow/etc on pages. I miss the days of such things in earlier internet. I'd trade back janky plugins and Flash player crashes for the humanizing & personalized touch many sites had back then.
Do you not realize that the stakes are different between that and a whole OS?
That snowflakes were the author’s preference? That’s too much madness for one day.
It's a bit like adding new emojis in an OS release. There have been reports that new emojis are one of the drivers for getting people to upgrade. No one cares about a zero-day security flaw, but everyone wants that new kiss emoji.
But, having worked with users, I've seen first-hand how tons of internal improvements are ignored while one small UI change makes 'everything seem new'.
It was a brave move to spend a major release without adding features. And people were grateful for it, once it happened.
The analogy I use is that no one thinks about plumbing until it's not working. I could stand up and tell people we have the best plumbing ever, it's been improved, is less likely to break, etc... and as long as it works at a surface level it seems the vast majority of users don't care. We actually save little UI tweaks/fixes to point to when doing major behind-the-scenes upgrades so users 'see' we're doing something. It's silly, but /shrug.
Apple has released incremental upgrades to macOS for years, and I've never heard this criticism of them. On the contrary, I often hear people missing the Leopard design, and when the UI has changed I've heard pushback (e.g., when System Settings was renamed and redesigned). On macOS people care about the apps and interactions, not whether the buttons got a new look.
> There have been reports that new emojis are one of the drivers for getting people to upgrade. No one cares about a zero-day security flaw, but everyone wants that new kiss emoji.
I agree with this. New emojis are new functionality; you can now express something you couldn't. A zero-day security flaw brings no new functionality. Equally, updates to apps and interactions bring new functionality. A re-skin of the OS doesn't.
Between any of the big 3 companies putting out major OSes (Apple, Microsoft, Google), Apple is the best for sticking to tried and true designs. It's certainly gotten worse the past couple of years (like the Photos app redesign they immediately changed again in iOS 26) and I hope with their new design lead he can pull things back to somewhere sensible, but compared to Android or Windows it's not even close. I used Android for the better part of a decade and every year they'd completely redesign the notification shade, the settings app, they'd switch the SMS app out for Hangouts, then put you back on Messages, then rename it, then change the logo/branding, then redesign it again, etc. Everything was endless changes for no reason, felt like a constant beta.
If you look at the basic iPhone apps - Messages, Settings, Notes - prior to Liquid Glass it's been pretty much exactly as it was when Jobs showed it off at the iPhone reveal 19 years ago.
Tools' success metric is how much they make your task easier/faster. Ad delivery machines' success metric is how much of your time they waste.
Shitty UI making you spend more time in front of the screen is considered a good thing according to those "designers"' performance metrics.
Microsoft’s desktop dominance was challenged by Google Docs and Facebook apps in parallel, Microsoft had to jump to web tech late, and the last few decades have been them reconciling their stack to the fact they missed the web and mobile despite pioneering key user-facing tech for both. Then they entered catchup-mode for client and search tech, only to later realize maybe they don’t care because the cloud catchup efforts blossoming into MS-consultant orthodoxy for every-darned-thing made their cloud offerings the most profitable bit… Ads and desktop stagnation pulled Windows into a weird OSX-at-home territory everyone kinda hates. And then LLMania took off, entrenching MS cloud and AI strategies into all of it, all the time. Pushing so hard that they’re gonna spend billions distancing their semi-Enterprise office brand from the term ‘Microslop’.
TLDR: the rush to the web/mobile moved the focus off thick clients and desktop affordances, the money is elsewhere, a universal GUI toolkit isn’t obvious for anyone, and SaaS feels better online
For the last ~20 years: designing software for mobile devices
It's chock full of old screenshots from a variety of old desktops.
[edit] I just discovered the snow icon, which does turn off the snow but turns the background into bright yellow. Oh and the other icon which turns your cursor into a ...spotlight? On an otherwise black page? Do I have that right? Which one of those things was a design decision that enhanced usability, or readability, or... anything at all? These choices can best be described as sophomoric. You can disagree with menu icons, but they at least in theory serve a purpose. What purpose is served by any of the gizmos on this site?
It's a personal website, it's fun. Comparing someone's personal website - where the 'fun' things can be turned off - to a $tn company with hundreds of millions of users who rely on the usability of their products is not a great basis for disagreeing with a pretty great write-up with many salient points.
I wonder if JWZ still has the red carpet for HN users. Let’s test: https://www.jwz.org/blog/2026/01/dali-clock-in-the-wild/
EDIT: Yep, still works!
I agree that colors could help.
Don't hesitate to give KDE/Qt a try; from a quick glance, it apparently gets all these things right according to this article: everything is correctly aligned, even when in the same menu some items have both an icon and a checkbox and some don't have anything; icons are mostly meaningful; some icons are colored (most are monochrome though; there's a move toward that); and not all items have icons.
I guess it's the kind of things that are hard to get right for a hobby OS like macOS that lacks professional UX designers. :-)
A key problem is that big US corps have always had a product design mentality that can produce monstrosities like your average cable TV remote and think it is in any way a good solution. That was clearly already an influence on things like XP.
It makes it seem like they’re designing for you until they’re not.
But if you demand sugar in your tea it doesn't matter how good the tea is, right? You are not going to like that restaurant.
> Importantly, that philosophy relies on the result having merit, and working cohesively on its own terms, even if it's not your preference
I am too dumb to understand what this means.
> I thought that was supposed to be Apple’s thing. “We decide how to make it and you decide to buy it or not.”
This was Apple; your customization options were limited, but things were well designed and cohesive. If you were willing to adapt to their design paradigms, you'd benefit from their expertise, and also have to put in less effort tweaking. Plus you could pick up any random new Apple product and be up to speed immediately.
But to extend the metaphor, if the tea sucks, I'll stop going to that restaurant. If Apple makes their UIs both immutable and bad, I'll use something else.
This is an opinion, though. macOS did do certain things better than Windows, but it also did a lot of things markedly worse. The Mac's market share never overtook Windows's; on merit, it was considered the worse product. You or I might think it was a decent system at some point, but the evidence is really just anecdotal.
I agree with the parent comment, Apple's "thing" was their financial skill and not their manufacturing or internal processes. Once the IIc left the mainstream, people stopped appreciating the craftsmanship that went into making the Apple computer. It was (smartly) coopted by flashy marketing and droves of meaningless press releases, documented as the "reality distortion field" even as far back as the 1980s.
The fact Apple makes and sells the only hardware to run macOS does not mean the software is fundamentally different from the rest of the industry. Apple has deprioritized backwards compatibility, not user choice.
¿Por qué no los dos?
I bet you wish that was the case. XServe existed though, and for all of Apple's confidence in the product it was (and is) treated like a second-class citizen that doesn't compete with free alternatives.
There is literally nothing that stops macOS from falling into the same pit of irrelevance besides first-party hubris. How much do you trust Apple to make smart, responsive decisions?
They did desperately need a distraction from the complete bedshitting mess they've made of AI though. Regardless of whether you think AI is good or wildly overpromised, getting on stage and lying through your teeth about what your AI does is what C-tier companies do, not Apple. Except they do now.
I've had the most success with Fedora X11/XWayland, but lately Wayland has been pretty solid.
There's a lot more consistency in the Apple ecosystem.
Don't get me started on the other crap with Linux distros: power management doesn't work, audio barely works, heck even though both Linux and MacOS use CUPS for printing, in MacOS it works way better.
Maybe I'm just scarred from laboring much too hard in the 90s and aughts to get desktop and laptop Linux working, but here is my current take:
- Yes, there is fragmentation. Perhaps there are not hundreds of Linux distros but, off the top of my head: Debian, Ubuntu, Mint, Fedora, RHEL, CentOS, Rocky, Alma, Arch, Manjaro, openSUSE, Kali, PopOS, elementary OS, Zorin, Gentoo, Alpine, NixOS are all viable options. Next, pick a desktop: GNOME, KDE Plasma, Xfce, LXQt, Cinnamon, MATE, Budgie, Pantheon, Deepin, Enlightenment. Each has different UX conventions, configuration systems, and integration quality. There is no single Linux desktop, and it's bewildering.
- Power management now "works" in the sense that, when you close your laptop lid and re-open it, yay! the machine (mostly) comes back to life instead of just crashing. It took us at least 15 years to get to that point. However, PM does not work in the sense that battery life on my M4 MacBook Air is literally 2x what I would get from a comparably priced Linux laptop. Part of that is better hardware, but _a lot_ of that is better power management.
- Audio now mostly works without glitching, just like it did in OS X circa 2002. But God help you if you're not using a well-supported setup and find yourself manually having to dick around with kernel drivers, ALSA, PulseAudio. (Just typing these words gives me PTSD.) Here is a typical "solution" from *within the past year* for audio troubles in Linux: https://www.linux.org/threads/troubleshooting-audio-problems.... There are thousands more threads like this to be found online. For typical, 99%-of-the-time use cases, experiences of this sort are rarely if ever encountered on Mac.
- Printing is arguably the closest because, as previously noted, they are both using the same underlying system. But printing, thanks to AirPrint, is still smoother and more pain-free on Mac than on Linux.
- Don't even get me started on Bluetooth.
It's not that I'm anti-Linux, I wanted sooo bad for Linux on the desktop and laptop to succeed, for a variety of reasons. But Steve J came along 25-30 years ago and completely pulled that rug out from under us.
I've had no problem whatsoever with 2 laptops regarding power management or audio.
Get a major distro and major software if you don't want to wander into problems.
At least half of the complaints in the article concern things that have standard, close-to-universally-agreed icons.
I use both Linux (home) and Mac (work) and I don't see one in Mac either. Also over time Linux has been getting more consistent, and Mac less.
Having no single desktop is a huge bonus. If you don't like one distro, you might like another. "Consistency" is a poor way to restate, "Windows or MacOS might be bad, but at least someone can unilaterally make it worse against your will."
I'd rather choose a drink from a soda fountain than get a more consistent flavor from a urinal. But to each their own.
This is a symptom of no strong leadership capable of enforcing standards, and of Apple's downslide as a whole firm, where departments and people are fighting each other for resources.
This is a bad sign for design at Apple. It suggests a fundamental lack of attention to detail that would have been harder to imagine a few years ago.
What's driving it?
> Burger menu
> User agreement
"User disagrees with the content of this site."
I recommend playing with the top-right buttons, it made me chuckle audibly. Still, my primary OS is Linux, but for laptops I prefer macOS, and it's still in acceptable shape.
However, I'll agree that Tahoe has far more papercuts than its predecessors. It needs a "Snow Tahoe" version.
Another wrong rule I've seen blindly followed is making everything an edge-to-edge canvas, so that the sidebar floats on top. Having a full-window canvas with floating sidebars can make sense for applications where content is expansive and inherently spatial (like, say, Figma) or applications where the sidebar is an actual floating element that can be moved around (like Photoshop once was).
It doesn't make sense in Finder, or Reminders, where the content is ultimately just a list. Forcing the sidebar "to float on top of the content" yields no benefit because the content won't ever scroll under it, and because it can't be moved anyway, but it does lead to wasted space, that ugly "double border", etc.
It seems that Apple has nobody left who has all three at the same time: taste, attention to detail, and authority to demand fixes. Having lots of people who have max two out of these three gives you designs of Microsoft and Glass Apple.
Every app should not be its own little universe as if it was a videogame.
And the ones doing it have no say in how it's done.
Being involved and in the loop is how great software is made. Otherwise you can just outsource and have tickets completed.
No, this is what happens when the people in charge of UI design have no clue what they're doing.
It's like what Miyamoto warned: a delayed UI is eventually good, but a rushed UI is forever bad.
The article starts with this:
> Sequoia → Tahoe
> It’s bad
And I look at the image... And I like it? I agree with the author that it could be better, but most of the icons (new, open recent, close, save, duplicate, print, share, etc.) do make it easier, faster and more pleasant for my brain to parse the menu vs no icons.
Again, I don't disagree that you could do it better, I just disagree with the premise that the 1992 manual is "the authority". Display density has increased dramatically; people use their computers more and have been accustomed to those interfaces, which makes the relationship of the people with the interfaces different. Quoting a 1992 guideline on interfaces in 2026 feels like quoting the Greeks on philosophy while ignoring our understandings of the world since then.
But a file menu is still a file menu, and save is still save. In fact it's remarkable how little that has changed since 1983.
Besides, the idea that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face.
The person I replied to said, "that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face"
Think of computer users at the ages of 10, 20, 30, 40, 50, 60, 70, and 80 in 1992. For each group, estimate their computer knowledge when they sat down at a computer in 1992.
Now do the same exercise for the year 2026.
How is it highly implausible on its face that the average computer user in 2026 understands less than the average computer user in 1992?
I think so.
> The person I replied to said, "that interface designers or even the average computer user understands more than in 1992 is highly implausible on its face"
Yes, I agree with this person.
>How is it highly implausible on its face that the average computer user in 2026 understands less than the average computer user in 1992?
I don't think it is. Particularly with the average user, the bar of understanding is lower now.
Can you explain how this is true given that everyone using a computer today has had a lifetime of computer use whereas in 1992 many people were encountering computers for the first time?
The userbase has been watered down with a larger proportion of individuals who are not highly technical.
Or 1982.
Users are always non-technical.
1. Computer users were generally well-educated, unlike today.
2. UX designers didn’t inherit any mess and could operate from first principles.
3. The “experience” of modern users—phones, tablets, and software that does everything for you—doesn’t translate the way you think. And it explains why Gen Z seems to have regressed in terms of tech knowledge.
And even if human cognition itself were unchanged, our understanding of HCI has evolved significantly since then, well beyond what merely “feels right.”
Most UX researchers today can back up their claims with empirical data.
The article goes on at great length about consistency, yet then insists that text transformations require special treatment, with the HIG example looking outright unreadable.
Menu text should remain stable and not mirror or preview what’s happening to the selected text IMHO.
Also, some redundancy is not necessarily a bad thing in UI design, and not all users, for various reasons, can read with a vocabulary that covers the full breadth of what a system provides.
HCI work in 1992 was very heavily based on user research, famously so at Apple. They definitely had the data.
I find myself questioning that today (like, have these horrible Tahoe icons really been tested properly?) although maybe unfairly, as I'm not an HCI expert. It does feel like there are more bad UIs around today, but that doesn't necessarily mean techniques have regressed. Computers just do a hell of a lot more stuff these days, so maybe it's just impossible to avoid additional complexity.
One thing that has definitely changed is the use of automated A/B testing -- is that the "empirical data" you're thinking of? I do wonder if that mostly provides short-term gains while gradually messing up the overall coherency of the UI.
Also, micro-optimizing via A/B testing can lead to frequent UI churn, which is something that I and many others find very annoying and confusing.
This was all expert-driven at that time, to my knowledge.
Empirical validation did not really take off until the late 00s.
https://hci.stanford.edu/publications/bds/4p-guidelines.html
Don had an explicit expert-knowledge-first stance in 2006 and 2011; nothing inherently wrong with that, but it's definitely not research-driven.
"Always be researching. Always be acting."
https://jnd.org/act-first-do-the-research-later/
Tognazzini and Norman already criticized Apple about this a decade ago. While they have many good points, I cannot shake the feeling that they simply feel like they were used to just brand Apple as user-friendly in the 90s and that Apple never actually adopted their principles and just used them as it fit the company's marketing.
https://www.fastcompany.com/3053406/how-apple-is-giving-desi...
there are a bunch of discussions on this
https://news.ycombinator.com/item?id=10559387 [2015] https://news.ycombinator.com/item?id=19887519 [2019]
> Empirical validation did not really take off until the late 00s.
> https://hci.stanford.edu/publications/bds/4p-guidelines.html
Hmmm, I don't quite see where that supports "Apple didn't do empirical validation"? Is it just that it doesn't mention empirical validation at all, instead focusing on designer-imposed UI consistency?
ISTR hearing a lot about how the Mac team did user research back in the 1980s, though I don't have a citation handy. Specific aspects like the one-button mouse and the menu bar at the top of the screen were derived by watching users try out different variations.
I take that to be "empirical validation", but maybe you have a different / stricter meaning in mind?
Admittedly the Apple designers tried to extract general principles from the user studies (like "UI elements should look and behave consistently across different contexts") and then imposed those as top-down design rules. But it's hard to see how you could realistically test those principles. What's the optimal level of consistency vs inconsistency across an entire OS? And is anyone actually testing that sort of thing today?
> I cannot shake the feeling that they simply feel like they were used to just brand Apple as user-friendly in the 90s and that Apple never actually adopted their principles and just used them as it fit the company's marketing.
I personally think Apple did follow their own guidelines pretty closely in the 90s, but in the OS X era they've been gradually eroded. iOS 7 in particular was probably a big inflexion point -- I think that's when many formerly-crucial principles like borders around buttons were dropped.
You have state management for debugging purposes already, so why not expose it to the user.
As an example, in Photoshop no non-professional users care about non-destructive workflows; these things have to be learned as a skill.
Undo is nice to have in most situations, but you can really only trust your own saves and version management with anything serious.
Something as simple as a clipboard history is still nowhere to be found as a built-in feature in macOS, yet it somehow made its way into Windows.
Sure, we can debate about the general points.
Yet, we can't refute my subjective evaluation that the opening image looks better (for me), reads better (for me), and is easier (for me) to parse. Either I don't fit the general guidelines, or the general guidelines need a revision; that's my point overall.
But answer me this. You say "but most [but not all? - interesting] of the icons do make it easier, faster and more pleasant for my brain to parse the menu vs no icons."
How does a list of icons that are used inconsistently, duplicated, used in other places, sometimes used and sometimes not used, not to mention illegible, positioned inconsistently, go directly against the broad (reasoned) rules of the Apple HIG, help 'make it easier' as you say?
This is literally what half the article is explaining and you are just saying - no it's easier to not be able to tell an icon apart, and it's easier to have the icons sometimes be the same or move locations, be so small as to be illegible.
How many did you get when the menu text was removed? I just don't believe it makes it easier. But who am I to argue against someone's 'subjective opinion evaluation'? I'm just a guy on the internet.
ps I assume by the opening image, you mean the first screenshot supplied by the author of the article - the Sequoia to Tahoe menu comparison, which he brilliantly posted below a shot of the HIG, which literally explains the exact same thing and why not to do it the Tahoe way. That in itself is confusing.
It makes no sense why Apple chose to do that with Tahoe.
I'll add a general comment - one of the reasons I use Apple systems was they had the UI stuff nailed down. Stuff was consistent. It looked and behaved in proper ways. It felt like a properly designed, holistic approach to UI design. Lately it's just a mess. This article touches the surface of the issues. My current beef is this stupid 'class of window' that appears now and again which is half-way between a dialog and a window. Best place to see it is immediately after a screenshot - click the thumbnail that appears. This window type doesn't behave like any other window. Z-order, closing, focus, actions that occur when you click certain things, are all different and inconsistent. But it does look a little like iOS though.
I have never daily-driven an Apple device, so I can't comment on this; but from what I've seen I do agree that Apple UI has not been as consistent lately.
> ps I assume by the opening image, you mean the first screenshot supplied by the author of the article
Yeah, sorry about that; that's correct, that's what I'm referring to. To remove ambiguity: https://tonsky.me/blog/tahoe-icons/sequoia_tahoe_textedit@2x...
> How does a list of icons that are used inconsistently, duplicated, used in other places, sometimes used and sometimes not used, not to mention illegible, positioned inconsistently, go directly against the broad (reasoned) rules of the Apple HIG, help 'make it easier' as you say?
Sure! First of all, I'm only commenting on the FIRST image of the blog. There are no duplicated icons in it. The icons appear consistently used in that image (maybe export to PDF looks a bit off, but this is a pattern that I have seen repeated in other apps, so I'm used to it). I'm not sure how the icons would look on the actual display, but they look alright on my 4K display as shown on the blog. I also can't comment on them being used "inconsistently" across other parts because I don't use Apple devices.
I'm making a very narrow claim: On the first image, if I compare the menu on the left, with the menu on the right, I prefer the menu on the right. I have tried to "find X" on a menu on the left and then repeat a similar exercise on the right; I am faster on the right and I am more confident on the right. My brain seems to be using the icons as a "fast lookup" and the text to verify the action.
Now, does this translate to all other menus? No! The "File" example he shows is super confusing. Also, it's possible I would prefer the less cluttered version with fewer icons. But for me (all icons) > (no icons) on that specific example.
I have not put enough mental energy into agreeing with the author on all of his individual suggestions across the article, but they look overall fine in the individual examples he provides. I just find the first example... not particularly compelling.
> Well - that's just your --- opinion, man
Well... Yes. But unless we objectively measure how I use the computer, that's the best we have got to evaluate my preference.
All my classes on human-computer interaction and design have always been about "listen to your users".
So not only did teams have to rush incorporating this new design, they also didn't get time to think things through and polish. It sounds like they didn't have time to push back either, and so there was no feedback/iteration loop where they could work out with whomever is in charge of wider system design what works and what doesn't in what apps.
It's not Microsoft-add-all-the-bloatware-and-adverts-we-can worse, but it's 20-year-old-operating-systems-were-better-designed worse.
We're getting to a point where I think Linux windowing systems like KDE are better designed. And it seems that all they had to do was not change much over the span of a few decades.
Or am I out of touch? It feels like I could use a computer better back when XP was the mainstay.
Going to continue to get worse as time goes on.
Even the hardware will go at some point I'm betting.
That’s not what enshittification means.
Somehow its usage has quickly devolved from a very specific pattern companies used to squeeze “value” out of their customers over time into just “something getting worse in any way” or even just “something I don’t like now but I did before” which is not at all the point.
All these give someone a reason to replace their otherwise perfectly-working iPhone sooner.
I think a lot of key lessons that were studied and learned back in the day weren't adequately transferred to the new generation.
This is the core rotting value in so much of big tech. So much of your bonus, performance review, promotion package, etc is hinging on "delivering impact" (ie: doing the flashy stuff). Imagine a world where some internal R&D team took a risk on liquid design but then thought it was okay to not ship it because it didn't work out.
> When a measure becomes a target, it ceases to be a good measure.
All this data driven BS about impact is really just that. So much of it is just gaming the system.
To prove that, you need some data to compare before/after. Hm, how about how much time people spend in the software? Seems like a decent proxy. Well, plenty of people are very unhappily addicted to social media, and yet that's what companies and investors frequently look at.
It’s very hard to come up with an incentive where just keeping things the same is acceptable. I mean it’s basically an admission that you as a company cannot innovate or invent better ways for people to interact with a computer.
Today, both software products are treated like monopolies. macOS is satisfied being an insular underdog, and Microsoft has no motivation to compete if Apple won't get off their ass.
My personal theory is that making the menu bar transparent by default (and shifting/rotating backgrounds) on Tahoe is preparing for OLED laptops and displays, which maybe will get under-screen cameras or just a nicer cutout.
Software however especially from UX point of view, is more likely to be more or less ready at some point. Any improvements are marginal and subjective. What are the large UX teams at Apple going to do if not redesigns for the sake of redesigning? I wish it would happen, but it’s hard to imagine Apple shipping an annual OS release without noticeable visual changes.
Maybe someday they'll completely remove the text to just keep the icons like the Office ribbon did, and I'm still confused with it.
While v4 was pretty much text-based, in early v5 every item and action had an icon, often only an icon. The manual read something like this:
"To do ⌘ you navigate from the ⌙ page to the ⌟, while holding the middle mouse button. The⌇will open and you will see the ⌆."
I think they did that with good intentions. CATIA being a French product sold all over Europe and beyond, localization must have been a significant line item. The result was a nightmare though, and they toned down the reliance on symbols in subsequent versions.
"The Ancient and The Ultimate"
https://archive.org/details/Fantasy_Science_Fiction_v044n01_...
This is because incentives have changed such that being good at your job in your average mega-corp has very little to do with the outcome: To climb their ladder, you optimise for impact and move before you have to deal with the consequences.
The default is that it’s not the case, and it requires eternal vigilance to do otherwise. Everything good is a fluke.
It's not so different from the rationale for many consumer electronics products: novelty for marketing's sake rather than functionality. Similarly, notice the ridiculous trend of removing most physical buttons from car dashboards, started by Tesla and mindlessly aped by the other carmakers.
To be fair, the bar is really really low in terms of mobile hardware. It’s just really hard (and expensive) for new players to design, build and manufacture hardware with competitive processing power.
People used to build hackintoshes to get Mac OS X without paying Apple's RAM tax or suffering mid-range laptop GPUs in top-of-the-line desktops.
Apple's outstanding success with their ARM chips is more of an exception than the rule.
I think the key issue with macOS is that they don't seem to have someone who is looking at the whole ecosystem holistically to make sure that there's consistency and integration across experiences. You probably have development silos for different applications and they don't really integrate with each other. There should really be a role like Integration Emperor that exists outside of the traditional corporate hierarchy who can go to different teams to push for increased consistency.
That person died in 2011.
UI was simple, maybe too simple, but it made sense. The only complaint from that era was the thin scroll bars and the flipped scroll direction, both of which are forgotten about and accepted nowadays.
But it had taste and attention to detail. It followed Apple's own HIG (design guidelines). The UI had some flashy details, but they had a purpose or at least didn't get in the way. I don't feel any of this in the current Apple designs.
The snow? Something else?
No one uses menus. So why get so upset over the Mac implementation of a dead paradigm? Because ironically the Macintosh bakes a "menu" into the screen space for you, and has since 1984. Giving up menus on a Mac requires that you give up one of the things that make a Mac a Mac. And that's hard, for marketers even more so than designers. So it persists and festers.
But in the rest of the world, we walked away and never looked back. The icons aren't the problem here.
[1] "Big" apps for experts still get value out of putting their actions into tightly packed text. Photoshop too, etc... But these are increasingly the exception and not the rule, and even there the next generation of big tools (c.f. Cursor) don't have them.
[2] Even worse there, because the tab bar it does have actually looks like a menu but isn't.
I much prefer text to inscrutable icons.
Chrome’s unique buried menu breaks user expectations. Casual users have trouble finding it.
I repeat: the menu bar is a dying abstraction, preserved in a consistent form only on the Macintosh (even iOS has no equivalent), because it's presence is unavoidable. Users of modern apps don't see it used consistently, so it's absolutely not surprising that Apple's designers aren't doing it either.
Clearly Apple are doing their best to destroy them.
Top menus are good because of:
- Infinite size (Fitts's law; see the formula below).
- Discoverability: I can look in a menu and see what options are available to me.
- You can tell people either verbally or textually what to do, e.g. "Choose File > Export as PDF" instead of something like "See the small icon that looks like a document with a folded corner, no not that one" yadda.
- Menus are direct: I don't need to open another menu to get to my menus (hamburger menu).
I use menus all the time. I totally agree with the article. You say 'no one uses menus' Not true.
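For reference, the standard Shannon formulation of Fitts's law (not spelled out in the thread, added here for context): with D the distance to the target, W the target's width along the axis of motion, and a, b empirically fitted constants,

    MT = a + b \log_2\left(1 + \frac{D}{W}\right)

A target pinned to the screen edge is effectively infinitely deep, since the cursor cannot overshoot it: as W grows, the log term collapses toward zero and movement time approaches the constant a. That is the "infinite size" point in the list above.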
> Infinite size (Fitts's law)
Applies only to an increasingly obscure input device. No one talks about Fitts's Law on phones, because there it's fundamentally wrong. The "size" of a control assumes you have a mouse or joystick or something trying to find it. Fingers don't do that; no one worries about moving to a control.
> Discoverability
Equally true of a hamburger menu.
> I use menus all the time
I do too! Because we're dinosaurs.
> You say 'no one uses menus' Not true.
It was hyperbole that I genuinely thought was clear from the text. You know exactly what I meant. Menus are secondary devices at this point only seriously used on one platform, so it's 100% unsurprising that design paradigms for using them are changing ("decaying", I suspect you and the other dinosaurs would say) to reflect patterns used elsewhere in the industry.
I'm not talking about Fitts's law on phones though; I don't think anyone is. This is about Tahoe/macOS on Macs.
But I'd hardly call a mouse or a trackpad 'increasingly obscure'. I use the fact that I have an infinite target area with my trackpad every day. It's one of those things that a user might not consciously notice until it's no longer there. Much like a lot of what we are seeing with Tahoe in general: people like me, you, and the author of this article are pointing out UI issues that are suddenly, immediately apparent. The bit that is really astounding us all is how on earth Apple, supposed bastion of UI detail and polish, is making such an almighty meal of all these things that used to work but now just don't. And not even just "don't work": the 'new ways' are objectively worse.
The only criticism of the top menu bar interface is that big hi-res monitors (which are relatively recent in terms of the Mac OS desktop) sometimes leave the menu bar 'miles away' from the actual app, which may be in a smaller window. But this is where Fitts's law comes in -- the choice is shorter distance but a smaller, more precision-demanding target (Windows), or larger distance but an infinite target (Mac). I prefer the latter.
It's probably in some ways almost a direct holdover from the time of 9" screens and single tasking -- apps were usually taking up most of the screen, so having a menu bar at the 'top' wasn't that weird. Having multiple menu bars would have been really weird and taken a lot of space.
I mean, Apple could offer an option like I used to see on GNOME where you could have a global menu bar or a per-app menu bar. That would be more useful than Stage Manager ever has been, I'd bet.
I was, in the comment you responded to!
I'm saying that phone-centric UI paradigms completely dominate in the modern world. And by extension, arguments like the linked article dancing on pins over minutiae of the menu bar, are missing the point. Good User Interface design in the coming decades simply is not going to use menu bars, it's not. It exists for dinosaurs like us, not the coming generation of tool users who will be directing AI or whatever and not digging through text.
Apple has long been rumoured to be on the path to merging iOS and macOS (whatever that means). MS already tried it and failed.
I for one, usually avoid using phone apps and tend to use websites (for example) instead since the phone apps usually have arbitrary limitations enforced by piss poor UI.
The most obvious and egregious example is the number of apps that don't let you open more than one view - e.g. Amazon. I can't look at 2 products in the Amazon app at the same time. It's just horrendous. The next obvious one is YouTube: the app is just terrible. Using a browser is a better experience in every way.
_edit_: I'd also like to point out that I know it can be disabled, the question(s) I then have are 1) why is it enabled by default 2) black text on yellow background is yet another obvious mistake
Btw, I'm using a desktop PC with a large browser window. Maybe on a small mobile screen the problem is less apparent.
It's painful to see the decay, update after update, into a more confusing, cluttered, and tacky experience.
Jobs had his own flaws, but he was definitely a huge part of why Apple's UI design (and product design in general) has historically been as good as it has.
People just don't like new things that change what they are used to.
I like this article because it points out how undeniably awful some of these decisions were in a "this signifies something is seriously, fundamentally wrong with Apple design" way. I really hope Apple listens and does a major course correction.
Sorry, who decided this? Which people exactly?
This was so obvious to me. The damage done to Apple by losing Jobs as their most vicious editor was almost instantly noticeable.
As a complete outsider, my impression is that this started slow because they had to politically overpower Apple’s actual UI group. Liquid Glass probably managed that with a unified look across all devices pitch which should’ve weighted the relative impact on the popular platforms much higher than the niche Vision Pro.
Jobs liked to talk this side of the business up because creatives were the substantial part of the business. Now they sell to everybody it doesn't matter so much. The average person won't even notice the complaints in the original article. They aren't sensitive to it in the way that creative people are.
With computers such a huge part of almost everyone's lives now, it's a travesty for one of the largest companies in the world to inflict something so subpar on so many old-style users.
I had to help him "get his bookmarks back" meaning see his bookmarks toolbar in Firefox. He must have hit a keyboard combo on accident. Since Firefox hides the menus by default, I had to tell him to tap Alt to see the menu, after which he was easily guided to View > Toolbars > Bookmarks Toolbar.
Bad UI design for novices is felt, if not conveyed outright.
I had to add my signature and write in the date so it looks like it was handwritten. So the plan was to just draw the date with a pencil tool and if that failed use the text tool to write the date.
First I instinctively clicked the pencil icon which turned out to be a highlighter. That's a great example where if they had added color for the tip and line it would have clearly looked like a highlighter. After that failed, I clicked that "i" icon because it looks like it's for inserting text. Honestly I was in such a rush I didn't even see the info pane popping up and was dang confused when nothing was happening.
I'm very familiar with info icons and have used them in my own apps, yet I had never seen one without the circle around it.
To use a famous movie quote: "Your scientists were so preoccupied with whether or not they could, that they didn't stop to think if they should"
Just because you have HDPI and opacity, doesn't mean that you have to use it by default, everywhere.
Commercial software coding glorifies denying that anything older than 10 years exists outside of museums, let alone learning anything from it. The same has merely infected the design world.
Now they just promote the youngsters that say the word AI a lot instead of those of us who actually care about the craft.
Is this known to be true or speculated? I don't know how this process is handled at Apple specifically, however, generally decision-makers are highly detached from UX. One would think that, especially for an overhaul initiative, "important" people would daily-drive dev/nightly builds to wear off the cool factor and experience the not so pleasant annoyances, but generally they shield themselves from that and mostly look at the "cool demos".
Regardless, as far as I am aware Apple has a tight product release cadence and ties feature gates to that. Obviously hardware readiness gates are much earlier than software, but I can easily imagine situations where "yes men" report "good enough" at gates relevant for marketing, the feature gets greenlit, but then gets half-assed for the actual release. Recall iPhones crashing at the initial release demo? Might as well be history rhyming.
Liquid Glass - with its wobbling jiggling jerking, shimmering and flashing, blurry and difficult-to-read, shifting and unpredictable design, and battery-demolishing performance - is so much worse. It's mindblowing how bad it is.
And rightfully so. Tahoe is not just a step back, but it throws away so many good design elements that have been there for ages - and for no good reason.
I really hope they revert most of the design changes in macOS 27. I don't mind the Liquid Glass - the other changes they made to expose/highlight Liquid Glass are the real issue.
IMO we reached peak design in 2013 with Mavericks.
Apple abandoned enforcing HIG for app developers around 2012 (Facebook tiled menu, modal abuse, and hamburger) but now seems to have given up on standards entirely.
The wall to wall interaction pattern is terrible too. Every time my hand brushes my phone some unexpected (and sometimes unknown) interaction occurs. Classic example is changing orientation while watching YT where accidental contact with the bottom-left (becoming top-right) part of the screen as you move the phone selects a new video. It’s becoming slop.
The experience right now is bad. It’s frustrating and there is no overarching vision or clear focus on the user.
Steve Jobs had good taste in many areas, but he also approved the puck mouse.
But when you see mistakes made consistently, year after year, you know the problem is systemic.
Starting to use Mac ~3 years ago, I often encountered giant blocks of text in right-click menus and while pleasing aesthetically, those were a chore to actually parse. For someone who daily drives macOS for, I assume, multiple years more than me, it probably comes down to memorisation, and how it looks becomes more important (with Tahoe breaking habits), but I find the inconsistencies and icons something that actually helps me find my ground.
Granted, the execution leaves a lot to be improved, I won't argue against it. Tahoe in some places feels downright amateurish. Despite that, I'll still take what Tahoe added over no icons at all... I feel like color icons + using them more sparingly would certainly be better though.
I guess a justification for Tahoe icons on my end is - those help me navigate the UI despite all their shortcomings (and ugliness they bring in many places).
I couldn't quite formulate the exact reason why I disliked the Tahoe menus compared to previous versions, but this article nailed it.
The classic Windows menu from the article is a good example: https://tonsky.me/blog/tahoe-icons/word@2x.webp?t=1767611340
Though even the visual clutter of everything having icons I find faster to scan with my eyes on the first "visit", as those usually suggest the functionality I might be looking for at least. Even if not perfectly distinct, I find iconography faster to parse and guide me towards what I seek. I don't even particularly mind reused ones, as those usually mark a "section" of their own.
However, I understand that some people would probably take no icons at all rather than every option having one... or whatever Apple decided to do in a given menu, considering Tahoe's inconsistency all over the place.
Apple (software) has lost its way and needs to return to whatever made them great and different.
Try harder.
Logitech’s software is also stuck in a loop denying it has Bluetooth access (which it has). And with the added graphical glitches (Apple likes to call them Liquid Glass) and weird window artifacts (for some reason, all my windows had a black, rectangular border one day), it’s honestly less reliable than my macOS-style Linux rice from 2015. But I'm still stuck with macOS since I NEED Adobe Lightroom for my work and there is still no way to run that with GPU acceleration on Linux. But if there was, there would be no device running Windows/macOS left in my household.
I've also recently come upon this talk by an ex-apple UI/UX engineer: https://youtu.be/1fZTOjd_bOQ I think what he's talking about is precisely what got lost at apple.
Edit: In case someone stumbles upon this after experiencing the same problem with ableton, here is the command I executed:
sqlite3 ~/Library/Application\ Support/com.apple.TCC/TCC.db "INSERT OR REPLACE INTO access VALUES('kTCCServiceMicrophone','com.ableton.live',0,2,4,1,NULL,NULL,0,'UNUSED',NULL,0,1725000000,NULL,NULL,'default',0);"
Disclaimer: I have absolutely no idea what it does, as it was generated by Gemini. I do not have anything super important on this computer so I just executed it, but please don't touch obscure system files if you have data to lose.
I remember being really excited for Liquid Glass, because it felt like a return to the good old days of Skeuomorphism, at least in some spirit. In reality, it was a botched delivery, I suspect for two reasons:
1. Trying to unify all of their design (in one year no less) against one style -- developed primarily on Apple Watch & the now defunct Vision Pro -- was a colossal undertaking.
2. There's so much goddamn software packed into each OS that you're going to inevitably be stuck with bloated menus. Imagine Apple releasing OS 27 this year and saying "we're stripping you down to the bare bones. It's going to feel like Snow Leopard, but we're going to give you customization menus to alter that experience." I would lose my mind with joy. I'd be so excited to be able to operate my fucking phone again.
But how would they do that without scrapping the whole version?
Their marketing for this year heavily relies on liquid glass but if they remove the shiny stuff, it’s not very pretty, it’s just functional. Functional is what people with work to do appreciate, marketing people will want the shiny back now that it was introduced.
I like the look of Liquid Glass and I'm generally for it. It just needs to be organized better.
> just functional
This is ultimately what I disagree with. I think iOS/macOS have become entirely dysfunctional. Software is broken, webpages are broken simply because they're running from OS 26. Alarms and calendar events either run randomly or not at all. The system preferences menu is hardly navigable. I could go on. Maybe I'm just getting old and crusty, and yearn for the days when Steve Jobs was running the ship.
They just pack needless software in and do nothing to keep it organized/usable.
There is a way for them to fix this while saving face. You see, Liquid Glass™ was just the first of their incredible new Material Design paradigm. Now introducing Apple Stone™, Apple Paper™, Apple Linen™ and Apple Brushed Metal™. All just as realistic as Liquid Glass™.
The fact that no one (in power) saw a problem with Liquid Glass shows that Jobs was right, that letting the MBAs take the power never works out. And he was wrong for appointing Cook. Remember that Jobs made MacBooks "expensive" (no more expensive or even cheaper than a Vaio or Portege) because he wanted to make great devices with a great UX and UI, which needed a certain level of investment. Jobs loved his users. Cook only loves his shareholders.
I have heard Liquid Glass was in development for two years, so I see no hope of them spending all that money over again. Never mind all the developers who have redesigned apps for iOS 26.
They could just re-release iOS 18, but that would piss me off as a developer.
This is why I left the Apple Ecosystem last month, I see no hope.
The off-ramp for liquid glass is what they’ve already been doing: repeatedly nerf it until it’s just tinted glass again.
The outstanding example is Spotlight. For years since the M1 release I have been able to bring the bar up and type C-H-R then Enter, in ~500ms I guess, and I would always get Chrome.
Now, I have to wait another 1-2 seconds for it to think. If I hit enter before it has finished thinking, Spotlight goes away and nothing else happens!
The UI and Spotlight are still fine on my 2020 M1 Air running Sequoia.
What's even worse is I have disabled all search categories for Spotlight except Apps and System Settings. It's searching a list of 40 apps and idk maybe 200 settings, and it still sucks.
Claude Code wrote a single-file AppKit program that uses 30 MB of memory (avoiding SwiftUI).
Extended it with dictionary, quick translations, and more. Since I use it every day, I keep polishing it without writing code.
iOS 26 broke my device, and what recourse do I have? None.
Buy a Pixel 9a for $399, flash GrapheneOS.
And back to your point, a large reason people buy Apple in the first place is so they never have to read this sentence/try this solution!
What usually works is fully resetting permissions for an app:
tccutil reset All <APP BUNDLE ID HERE>
To find the bundle ID for an app: mdls -raw -name kMDItemCFBundleIdentifier <APP PATH HERE>
It does not "forget"; the permissions are limited to 30 days. So you have to re-grant them periodically.
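To make the two commands above concrete, a hypothetical run against Ableton Live (the com.ableton.live bundle ID matches the sqlite3 command upthread; the install path is a guess):

    # Print the bundle identifier from the app bundle's metadata
    mdls -raw -name kMDItemCFBundleIdentifier "/Applications/Ableton Live 12 Suite.app"
    # => com.ableton.live

    # Reset every TCC permission for that bundle ID; macOS should
    # prompt again the next time the app requests access
    tccutil reset All com.ableton.live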
Oh, and if you miss the popup (because sometimes it pops _under_ other windows) then you just start getting silent failures. With absolutely NO indication in the UI that something is wrong. There is literally no way to find that out without looking directly into the TCC database.
Bane of my existence. I have wasted so much time on various apps not having some access they need.
I am literally writing a bug report right now for 26.2, because on my Mac, for whatever reason, running tccutil reset All on com.apple.Terminal isn't removing Full Disk Access. It removes everything else (screen recording, specific folder domain access, but not FDA).
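A hypothetical narrower workaround while the bug stands (untested here; assumes Full Disk Access still maps to the SystemPolicyAllFiles TCC service, as it has historically):

    # Reset only the Full Disk Access grant for Terminal
    tccutil reset SystemPolicyAllFiles com.apple.Terminal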
A UI/UX dev has two choices in 2026:
1. Try to execute their own vision and get shot down into burnout by management.
2. Just make everything look shiny and modern, create demos that look great, and get promoted.
You shouldn't have to do any of that. Even if Live couldn't trigger the permission prompt, you should be able to give it a microphone permission in Settings.
Apple has been rudderless on the interaction design front for over a decade now. The windowing mess is evidence of it. We now have the cmd+tab (app switcher), Spaces, Mission Control, (full screen) split screen, Stage Manager, and now tiled window control. None of those interaction metaphors have been expanded upon since their initial launch.
I'm a "mac guy". I understood why Apple initially eschewed windows style alt-tab, given the emphasis on app-centricism. But now, they've created a thousand different ways to switch windows without giving us a proper window switcher. There are apps that bring alt-tab to Mac, but they are all bad because Apple doesn't give developers access to the low-level APIs to create performant and fully featured window management.
Before, Apple had an endless well of great ideas to tap. That's how we got the term "Sherlocked". However, now that they've locked down macOS so much, they've suffocated themselves of new ideas.
Other pet peeves: the smallest window thumbnails are enormous; enabling mouse hover to switch windows causes me to switch to the wrong window; and thumbnails are often stale.
Here's an example: panes, tabs, windows, apps, spaces. These things all fight against each other, have their own little silos, get treated differently by every single app, etc, etc.
"Let's throw every away and start from scratch" is a tempting idea, but it rarely works. Even taking your example of panes, windows, tabs, apps, and spaces, each of those have a separate and identifiable use case that, IMO, is valid. At least in my mind, I have a mental model around where panes, windows, tabs, and apps are appropriate, and I personally rarely use spaces (though I certainly understand people who like them), and they've never bother me because I can safely ignore them.
And when you look at the issues identified in the article, they all seem very fixable to me. Fixable starting with Apple getting new design leadership, and given the guy responsible for Liquid Glass jumped to Meta, sounds like it was a good thing for Apple.
Disagree. The fact that these are independent is a huge organizational boon for those who know how to apply some combination of them.
Example: I set up spaces (virtual desktops) to have themes: one for general web, one for iOS dev, one for Android dev, etc. Within that, it's useful to be able to further organize with windows and tabs: windows can represent projects, for example, and tabs can represent files. These work together to prevent a Cambrian explosion of windows that would be impossible to manage no matter how your environment works.
It’s useful for apps to be distinct from windows, too. This allows things like moving all windows belonging to a particular app between screens, or minimizing/maximizing them all, hiding them all, etc. Panes in most circumstances are an entirely different beast than tabs and windows… it wouldn’t be useful to turn an inspector palette into a tab.
They work together as long as one’s mental model isn’t overly simplistic.
What I want is an OS that treats things like tabs as first-class citizens, not a byproduct that each app implements in its own way.
Problem is that most third party apps don’t opt in and instead reimplement tabs themselves. It’s mostly native Mac apps made by small boutique developers that use native tabs.
I only very recently changed my System Settings > Desktop & Dock > Windows > Prefer tabs when opening documents to "Always". I'm pretty sure the default is "In Full Screen".
Now, something like TextEdit creates new files in new tabs rather than new windows. It's great! But by default, everything on macOS seems to use windows rather than tabs. I don't even think most people know about the "Prefer tabs" option at all.
When you create a new document, does it open in a new tab or new window by default?
I say this only half in jest, as it's what I'm using on my personal desktop. It's far from perfect though; for some reason the taskbar/launchpad just feels awkwardly shaped, and some of the keyboard-based navigation gets a little wonky sometimes. And I feel that there are a few areas where the split brain between docking regions and windows isn't quite as polished as it needs to be.
On the plus side, it performs better than GNOME, and is slightly more consistent than a lot of Linux desktop options. In the end, I do think that OSes need to expose a bit more detail in a somewhat consistent way, as cross-platform apps and even Electron-like options are going to become more commonplace as people jump between and need to support Windows, Linux, and macOS targets, with the variances they all have.
Personally, there are a lot of things I like and dislike between all of them. I spent a large amount of time tweaking Budgie to be almost exactly what I wanted at one point, and an update borked my setup... I just switched to Pop/COSMIC and dealt with it from there.
There's a veritable zoo to try out. KDE, COSMIC, XFCE, GNOME and its many derivatives (like Pantheon or the previous COSMIC), Unity, etc. Also interesting are the old XMotif based ones like CDE (which has the delightful "chiseled marble" look of 90s-era DEs).
Windows also has its own share of alternative shells, but AFAIK only Cairo DE is actively supported and developed.
And if it is an OS, it's also just not one where I can be very productive.
I’d actually forgotten about the menu icons in Tahoe (I tried Tahoe during the beta period and lasted less than a day) and at first thought this would just be a pretty short piece on Tahoe’s new application icons.
Going back to the article though: another amusing detail is that the one menu where having icons is actually helpful (the Move & Resize menu) actually had those icons before Tahoe.
This thread feels like classic HN bikeshedding. The article itself is nearly 5,000 words about menu icons. Menu icons! Yes, the inconsistencies exist. Yes, they could be better. But the level of outrage here doesn’t match reality. People are acting like Apple has completely lost their way over… inconsistent icon styling in dropdown menus that most users don’t even notice.
It’s been inherent in The Macintosh Way from the beginning.
On iOS it's totally fine, but on macOS it's a disaster. I've only updated one machine so far and will keep all others on Sequoia until this mess is resolved.
But bring that to desktop, where your UI is windowed and appears alongside (or overlapping) other windows, and you end up with chaos.
The floating sidebars are a prime example of this. Why should I have to expend mental energy to differentiate what's an actual window vs what's just a novelty round-rect with a shadow (oh, and window controls for the parent round-rect, which is a window)?
Good for you; unless Apple changes its mind, my MBPro from 2019 will be stuck on Tahoe forever.
On an Intel-based Mac:
- If you used Option-Command-R to start up from Internet Recovery, you might get the latest macOS that is compatible with your Mac.
- If you used Shift-Option-Command-R to start up from Internet Recovery, you might get the macOS that came with your Mac, or the closest version still available.
Instead they were all on stage praising it.
Yeah, but I doubt that would change much; the amount of damage done would be difficult to roll back. What do you think Apple is going to do for the next macOS: "Look we told you to design all these extra icons last year. Guess what, this year we want you to remove them."
I just can't imagine that happening. This is the fundamental thing that is wrong with this. They had the OSes in beta for a few months, barely listened to feedback, and now we're stuck with the damage. For how many more OS iterations?
I really wish they had at least macOS in a different cycle than iOS (and with the idiotic year version names, they've brazenly signed themselves up for the yearly schedules.) I really couldn't care less about what damage they do to iOS after iOS 7, but I still haven't upgraded to Tahoe and I won't do so until they roll this design back entirely...which I don't see happening.
Maybe I'm just pessimistic about Apple at this point but I feel like no amount of criticism is going to change their design trajectory now, unless it affects their bottom-line.
- framework laptop or similar Linux machine
- graphene os phone
- ditch the apple watch, go back to a watch-watch
That’s what happens when you put arrogant industrial and print designers in charge who think UX and UI are inferior forms of design.
In other words, I'm not using icons to find a single action. I'm using icons to quickly understand the available options to me. Meaning, they're only there to help me compare and contrast options within the same visual context. It's fine for the same action to have different icons in different places. When I'm looking at the "File" menu, I'm not really concerned what the action looks like on the toolbar (let alone a different app).
My point is that none of that matters. They just need to represent something within the context they're being used.
And it was clear enough that you would trust the stand-alone save icon to perform the action you expected. Icons were also selectively chosen for common actions.
If icons are considered non-deterministic and non-canonical, then they can't [confidently] be used stand-alone.
Also, I'll meet your data point of 1 with mine. I have an iPhone and find it far easier to pick an app from the list of names than pick an icon. I frequently mix them up (and I only use a dozen routinely)
It might sound flippant but I'm asking in good faith: do you have a reading disability (like dyslexia)? I (and I think this is generally the case) don't consciously read words character by character or say them in my head, it feels more like my brain pattern-recognizes whole words and I understand them "at a glance".
Tim Cook, if you do one thing before you retire: promote the right person to fix both macOS and iOS. They're both in need of a lot of work.
More specifically, to the user it means they’ll still have the opportunity to back out of the operation; that it won’t take effect immediately upon clicking the menu item. It allows the user to “explore” the command without committing to its execution yet. For that reason, IMO the ellipses in “Attach Files…” and “Add Link…” are appropriate.
What I’m seeing more often in practice is that menu items that should have an ellipsis don’t. They make you wonder what the immediate effect of the command would possibly be.
It's just a small nitpick though. The article generally seems to do a good job of highlighting some frankly shocking (and worryingly amateurish) problems in these menus.
Explorer also "promotes" copy/paste commands to the top of the context menu as just icons with no text label, which can be confusing - your instinct is to just look for a "Copy" or "Paste" item in the menu, but no - for some commands you must learn the icon and the fact that it doesn't appear in the menu proper.
Also the context menu populates slowly with dynamic items depending on the right-clicked file which causes items to dodge out of the way of your cursor, but I don't want to get too deep into a wider discussion of the awfulness of Windows 11 Explorer...
Maybe Apple or ex-Apple people can comment?
But lately, over the past 5 or 10 years, it seems to me that perfection in UI is just as arbitrary and mutable as people's tastes and preferences.
It's hard to admit it to myself, but I think my love for the early Mac OS and Windows 9x UIs was mere puppy love at first sight, and now is simply nostalgia.
To me, it seems very related to the idea of how to fall in love with a person. There seems to be nothing you can measure it against. You simply either do or do not feel a connection with the person, an inexplicable infatuation. And if you do, then that love cools and settles into something more subtle but just as real over decades, until you're holding hands on your deathbeds. Yet I can't for the life of me figure out how it begins, or what its metrics are, or where its catalysts come from. I suppose this is what Randall Munroe wondered all those years ago when he came up with his blog's subtitle. If only I could ask him, perhaps I would have the answers to everything.
Consistency is always better. App A and App B should use the same save icon, without any caveats, regardless of what it actually is. If you offer me a consistent experience, I'll take it whether it's Windows 95 or Tahoe.
No. Perfection exists, but it's not how something works or looks. Perfection is stability. A bad solution that you know inside and out is preferable to a "better" solution that you have to spend time to think about every time you use it.
One issue of the recent redesigns is that they degrade rather than improve “how it works”.
Yes, it is. UI is not about stable form alone.
> Otherwise any old solution that does not change would be good.
UI design is nuanced, regardless of the large number of people who dismiss it as borderline irrelevant. E.g., the most commonly used elements should be the easiest to recognize and access. Culture driving change is the most useful, while random association is less useful and statistically worse for user experience.
Apple has taken a great leap in misinterpreting that if the most common functions benefit from iconography, there must be a benefit the icons convey, as opposed to the other way around. It's disturbing.
Also there are certainly better UIs than others, otherwise people wouldn't complain about software having bad UIs (see GIMP, older versions of Blender, etc).
For example, I also am old enough to have used DOS before Windows 3.1 came out, which was my first GUI. When Windows 95 came out, it was a clear improvement, but retained the same principles that Windows 3.1 established. Windows 98 iterated on it, and Windows 2k perfected it in my eyes.
So that when Windows XP came out, it abandoned the principles, going for a look that felt cartoonish and childish to me, but for others was perhaps casual and inviting. It was during this time that I discovered Compiz and other Linux eye candy, and although it abandoned the fundamental principles that Win 3.1 planted, it almost admitted this with pride, submitting a new set of principles altogether.
So when Windows Vista came out, clearly trying to compete in the arena of that new set of principles of beauty, I was ambivalent but mostly impressed. That's when I found out about Mac OS X, which Compiz et al. were clumsily imitating, and I fell in love with it, realizing it had perfected those principles before Microsoft and Linux even began to imitate them.
It almost seems like the same concept as the original purpose of MMA (mixed martial arts). There is a perfection particular to a set of principles. You can be the best at boxing, or the best at Brazilian Jiu-Jitsu, and it's almost comparing apples to oranges because they're so fundamentally different that they don't actually mix well (the current UFC being proof that the experiment has failed and created a monster).
It's the same reason movies like Home Front exist: it's that age-old question, "who would win if...", in this case London gangsters vs Southern American gangsters. Or Freddy vs Jason, or Alien vs Predator. I wish I could remember more, because those are some of the most interesting types of movies, with different real human cultures being pitted against each other. Like David and Goliath, the top champions of two cultures facing off for the world to see.
I think MacPaint is one of the most beautifully designed GUIs ever created.
By that logic, you shouldn't have any particular preference for newer UIs (apart from how similar they are to your "primed" UIs) and you shouldn't ever be able to discover new UI patterns you particularly like. You should also be unable to articulate why certain UIs are good or bad, or your reasons should be wildly inconsistent.
I don't think this is generally the case. There are lots of articles like this one, and usually the takeaways are similar: UIs should be predictable and consistent and allow the user to reliably find actions and elements. Ideally they should also have markers and "fast paths" to allow more experienced users to find and do an action quickly. They should not overwhelm the user with too many irrelevant things.
That's the very high-level gist of it. There is actually lots of research in the details of it, which you can get a glimpse of in books like "The Design of Everyday Things".
They just don't seem to have a very high priority in current tech development for some reason.
That a good part of the industry seems to have essentially given up on GUIs, left them to the fashionistas and engagement maximizers and retreated to the command line and TUIs instead also doesn't bode well.
I do agree that Steve was somewhat revolutionary in this, bringing his personal quest for beauty into early GUIs through his (overly) perfectionistic command. I remember a decade or more ago being struck by just how genius it was for him to bring professional typography into personal computers, complete with kerning and everything.
GUIs are tools first and art second.
Any GUI that looks good, but gets in the way, or fails to make the task at hand easy is not a good GUI.
The books and research are about making GUIs functional in an effective way. No one is claiming that a GUI that follows all the principles will look good. However it will make the task at hand easier to perform than it otherwise would be.
The problem is when people in charge of GUIs focus exclusively on Graphic Design instead of UI design. They are different fields, with different goals that only sometimes overlap.
The Macintosh Human Interface Guidelines, especially the old ones, were seriously well thought out. Apple just doesn't follow them anymore, and that leads to demonstrably poor design choices. "Consistency" is still one of the three banner concepts (see https://developer.apple.com/design/human-interface-guideline...), and Apple just fails wildly at paying attention to itself.
Windows 2000 and Mac OS 9.x.x were contemporary with each other, are both considered to be strong examples of principled interface execution, and they were very different from each other in both behavior and appearance. They were also notably both the last generations of iterative executions of principled designs before both companies started veering off into the wilderness.
Based on the measurement of these outcomes you can absolutely say whether an interface is better or worse than another.
Anyone in software design in the 90s/early 00s was quite familiar with this subfield of human factors. Very few people making software today are. As vibe coded UIs become more and more widespread, it's only going to get worse.
Admittedly it would be "easier" if Apple gave you the choice of which version to upgrade to in macOS, but that's generally not how software is provided.
Open the Mac App Store and search for Sequoia.
edit: worked after second attempt (via macappstores://apps.apple.com/app/macos-sequoia/id6596773750?mt=12 )
This is the key point for me. I think I go further than the OP, though; I would almost force apps to use the stock menu items. Declare that your app has to save stuff, and the OS can take care of supplying a 'File -> Save' menu item, a Save toolbar icon, and a Save keyboard shortcut.
I guess the more 'liberal' way of doing this would be to make it so easy to do the above that you would really have to go very far out of your way to purposefully deliver a worse experience. But you're free to do that if you're so inclined.
Another example is customized extensions of Markdown syntax. To my mind, constructs like `[link title](link_address)` already stretch things, and the only justification for having brackets plus parentheses stand in for link syntax is their ubiquity. One of the downsides of this terse syntax is its resistance to extension. For example, when you prefix it with an exclamation mark, as in `![link title](link_address)`, all of a sudden your code is understood as an `<img>` tag, not an `<a>` tag. What if you need extra attributes? There have been multiple suggestions for how to extend `![…](…)` to include desired image dimensions; none have become standard. You probably should fall back to inserting HTML, which I am fine with.
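To illustrate that fallback (file name and dimensions invented by me): plain Markdown gives you no portable way to state the image size, while the raw-HTML escape hatch does:

    ![A chart](chart.png)
    <img src="chart.png" alt="A chart" width="640" height="480">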
Lastly, when designing a user interface, I have found that having to choose icons is a significant burden, and often one without a satisfactory solution. One of my solutions in the past was to fall back to Apollo-era text-only buttons; sure, the texts would have to be localized, but then the entire application is subject to localization anyway. A plus is that each button with a short text instead of a picture already provided the mnemonic, the identifier for that action, something an icon does not do for you.
Apple software used to be that elevated experience for the average person.
Given the lack of basic consistency though, it’s evident that there are no leaders at Apple that care about UX enough to thoroughly design and test the whole software experience anymore. Just a bunch of random teams doing whatever.
I wonder why every large company seems to fall off in the same way?
Imagine wanting to save and accidentally clicking close....
> Or two-letter As that only slightly differ in the font size:
In that one case having a large A for bigger and smaller A for smaller makes sense.
Oh, but Google hasn't, so Chrome's icons are still all over the place. Apparently, that's not for the OS to sort out, but every single individual app.
Oh, while some of the menus in Preview are fixed (File, Edit), others aren't (View, Tools). So it's not just down to each app to manage its own icon alignment, but each menu!
This entire UI refresh strikes me as completely unnecessary. I didn't even notice the menu icons. Thanks for that. Just another thing to be annoyed about.
But really, the glaringly obvious ones are already in your face.
1. There is no setting to get rid of the ridiculously over-rounded corners.
2. The dock, which I put on the left, now has about 10 extra pixels between it and the edge of the screen. 10 pixels that now will never, ever, be usable again.
3. All the icons have been forced into a rounded corner box. As it turns out, the human brain is really good at recognizing silhouettes. This just made that part of my brain useless. It retroactively restyled applications' icons that I've used for over a decade.
I'm sure I'll find others, but it's clear that Apple does not care about users. This all about power. They didn't even include settings to turn any of this off...just "take it like we wanna give it to you, plebes".
Infuriating.
And none of these things matter. Literally none of them are core to how an operating system works, just how it looks. I just don't understand UX people, and at this point I'm starting to hate them.
If you're a designer at a top 10 S&P 500 company making 6 figures, you owe it to yourself to have some love for your craft. If a PM tells you to shove a UI style meant for an unsuccessful VR device onto desktop and mobile platforms, say no. Get your colleagues to say no. Make that PM read everything the Nielsen Norman Group has ever written. Read it too.
"Icons that look like shit!"
and
"Notification summaries that may not be correct!"
In general I feel as if Apple's software feels buggier and less solid lately across my iPhone and my computers. Won't be upgrading the personal computer for as long as possible
Agreed. Rendering is very flaky. Input events are dropped.
Blinky. Laggy. Two of the Seven Dwarves of Liquid Glass.
Also what happened to their filters? I get daily spam from Apple email addresses now.
For instance, you can "hide your e-mail" by using Apple's relay, but if you do so... your payments using Apple Pay will fail unless you fill all the information in manually, because the e-mail addresses don't match.
It's ridiculous how poorly tested everything is, and that, combined with their recent foray into the world of politics, has nearly destroyed three decades of steady Apple use for me. I'll be actively considering other options, not upgrading, and looking elsewhere for products in spaces they're in.
I've never really liked macOS but it feels like someone at Apple was hired just to make it even less likable for me personally lol
Edit: typo
I've been using a Mac for years, and to this day I don't know how it's possible to navigate directories using Finder. It only has shortcuts for a few folders by default (photos, documents...) and doesn't have a button to navigate to the parent folder. I have literally no idea how to get to my home directory; I need to use the CLI.
Command + Up Arrow, which is also visible if you click on the "Go" menu. There is also a toolbar button that shows the entire set of enclosing directories; offhand I can't remember whether this is visible by default. There is also "View -> Show Path Bar" which shows all this information at the bottom of the window.
> I have literally no idea how to get to my home directory
Go -> Home, which shows a shortcut key for this, Command-Shift-H.
I get it’s supposed to be easy to use but so much functionality is hidden behind non-obvious shortcuts. The end result is you either need to memorise a dozen secret handshakes just to perform basic operations, or you give up and revert to 70s technology in the command line.
[GNOME enters the chat]: "That's nothing, I'm way worse!"
However, in my personal opinion, Nautilus’s breadcrumb picker does give it an edge over Finder.
So I stand by my comment that Finder is the worst.
That used to be a preference, and last I used it, it was not. It is forced on because that’s how the GNOME developers thought you should use it… “Our way or the highway!” — GNOME devs.
Finder wins based on that alone. Finder wins so completely because of that one single thing that I’ll never voluntarily use GNOME again.
Finder is genuinely horrible. It’s obvious no one at Apple cares about files anymore nor anyone working with them.
We’re all supposed to consume cloud these days or so it seems.
With Apple's focus on cloud services, fixing the bugs that prevent the user from working with their local network storage runs contrary to their financial incentives.
Why on Earth is this a requirement? When you're navigating through Finder using the keyboard, it's very inconvenient to use two keypresses to perform a very basic operation. Using Enter to open a file is how every file manager on every operating system works except Finder. Why would the Enter key be hardcoded to a file rename operation instead?
It is a typical Apple behaviour of doing things differently from the rest of the world just for the sake of it, even when it's detrimental to the user experience.
Actually, I just checked and it's not; technically you can create key equivalents without modifiers as well [1]. For Finder this doesn't work though, because Enter seems to be specifically handled before menu-level key equivalent processing. (Note that it's not guaranteed to work in other apps either; based on [2], it seems key equivalents are only dispatched if modifier keys exist. But that might be out of date, since it worked for the people in the SE post.)
Option+Enter is the next closest thing.
I agree that their implementation here is not good. In fact there's already a "Rename" menu item, which isn't actually wired to the Enter hotkey (this is very un-Mac-like, because it means there is no easy way to discover it). The "Rename" menu item is actually a fairly recent addition to the Mac (I think maybe 10.11), while Finder itself is ancient (it was one of the last few apps to be migrated to Cocoa, and even today still has lots of legacy warts), and possibly no one bothered cleaning things up.
[1] https://apple.stackexchange.com/questions/132984/keyboard-sh...
[2] https://developer.apple.com/library/archive/documentation/Co...
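For illustration, the modifier-free key equivalent that [1] describes looks like this in AppKit (a sketch; the item and its wiring are hypothetical, and as noted above Finder handles Return itself before this machinery runs):

    import AppKit

    // "\r" is the Return key; an empty modifier mask means no Command/
    // Option/Control/Shift is required. Whether the item ever fires still
    // depends on how the app dispatches key events (see [2]).
    let rename = NSMenuItem(title: "Rename", action: nil, keyEquivalent: "\r")
    rename.keyEquivalentModifierMask = []   // bare Return, no modifiers
    // In a real app you would set `action` to the rename handler and add
    // the item to a menu in the menu bar.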
Just add it to the sidebar. Finder > Settings > Sidebar > Locations. Or drag it into Favorites.
> doesn't have a button to navigate to the parent folder
View > Show Path Bar. You can also right click on the directory name at the top of the window and it’ll give you the same options.
The major reason to stay on macOS is stability. Hopefully they stop breaking things on the Mac front.
I would really appreciate it if the next macOS would be about stability instead of some fancy features barely anyone asked for.
Just that usually the force-fed initiatives have to do with corporate profits for shareholders, or trends like shoving AI into everything. Imagine saying no to that!
Even at the supra-corporate and supra-national level, if the organizing principle is competition, no actor not even a CEO or a corporate board or a government can afford to stop racing towards disaster. There is a simple mantra: “If we don’t achieve AGI first, China will and then they’ll dominate.”
Once in a while, the world comes together to successfully ban eg chemical weapons or CFCs, and repair the hole in the ozone layer. Cooperation and restraint takes effort.
Judging by the way we’ve drained all the aquifers, overfished the last fish, destroyed the kelp forests, cut down the rainforests, bleached the corals, and polluted the world with plastic, I don’t think there is much hope of stopping.
Insects and pollinators are way down, and many larger species are practically extinct; the vast majority of the world’s mammal biomass is humans and their livestock, and people still pretend environmental catastrophe is all about a few degrees of temperature.
PS: Yes, that escalated quickly. In the real world, it has taken only 80 years… :-/
I reckon it's more that some Apple VP has to justify their million dollar equity package by creating work for their org, because otherwise why should you still have a job?
> People got canned for resisting the corporate overlords. That’s capitalism
Being told to do things by your boss is a problem as old as time. Except with capitalism you can change bosses — a luxury which has not existed throughout history.
Why is capitalist competition worse than any other form of competition? Wouldn’t wartime competition over land and sovereignty be far worse? Didn’t the Soviet Union have extreme forms of political competition?
Why should someone that disagrees with you on whether capitalism is uniquely responsible for bad icon design now be forced to defend it for every sin / shortcoming ranging from the social inequity to ecological collapse?
Corporate structure is driven by exploiting and using value for and by a de facto nobility (the C-suite).
Finance “capitalism” seeks to extract value, be it short or long term.
Engineers are motivated by building and creating value.
Creatives are driven by change for change’s sake, to remain at or get a seat at the table.
The uncomfortable reality is that these are inherently conflicting interests that are pulling and pushing each other, but mostly top down.
It’s essentially the “colonialist” exploitative model of existence, using creators to leverage rather than extract natural resources, a system that is increasingly unsuitable for the modern, technological, commoditized world. AI is a good example of that; it arguably diminishes the value of both engineers and creatives to the nth degree, while also leaving the “nobility” and their neo-aristocratic corporate system out in the open, exposed not only as having no clothes on, but as utterly abusive, useless, and downright evil. And no, that’s across the whole political spectrum, not just the opposite of your silly system-approved political sports team.
https://www.youtube.com/watch?v=YSzjcVZXolc
https://tjkelly.com/blog/ios-7-sucks/
And he also takes credit for the Dynamic Island. It is an assault on my senses to see everything constantly moving around on my screen.
I have been working with Macs since 1995, but this year is my first using a Pixel with GrapheneOS; that is how done I am with Apple. Unfortunately I know the UI will not change for years, and I just could not take it.
Cook doesn't seem to have any taste for product design; isn't he a logistics guy?
But he clearly falls afoul of Steve Jobs' warning about leaders with no taste.
It’s not a stretch to say that Tim Cook created the whole Shenzhen microelectronics industry. The thousands of specialist component vendors and large integrators that assemble products trace to his instigation at Compaq and then Apple. The iPod, Macs, and iPhone copied the Swiss watch model of vast redundant networks of component competitors working as an ecosystem to drive down costs.
This created the skill and machinery base that made it possible for other western design companies (such as Android vendors that were not Samsung or Japanese) to make clones of the iPhone quickly and easily. (Let’s be real, every smartphone is an iPhone 1 clone)
China owes a lot to this work.
How does that benefit anyone?
I don't think that's fully accurate, unless you have a link that confirms it? That Dye designed it, I mean, not that it was horrible...
Jony Ive was the head of design at that point (both hardware and UI). Wikipedia says Dye "contributed greatly to the design language of iOS 7" but Ive would have had final say. Certainly at the time as I recall it, iOS 7 was seen as Ive's baby.
Also, I'm not defending iOS 7, but I reckon its visual design was a lot more influential than it gets credit for. Think of those ubiquitous Prime bottles, with their bright saturated color gradients; the first place I remember seeing that style was iOS 7. I bet they picked that style for Prime because kids liked it, and kids liked it because kids like iPhones.
Edit to add: "bright saturated colors" goes back a long way to Fisher Price toys and the like, of course, but it's the gradients specifically that I think iOS 7 popularized.
Dye was the senior rep of the Design org present and commenting on all our software progress. I never once encountered Ive.
It’s just stupid people doing stupid things.
A lot of us felt at the time that surely laptops and tablets would converge. Otherwise, what a waste of hardware.
But it hasn't really happened. From a hardware perspective, things have gotten closer with the iPad's magnetic keyboard. But I still find the iPad as a laptop replacement to be a compromise that I may tolerate for travel but don't love for a lot of laptop work.
That's a gigantic market segment, and Apple has to be very careful to not make those devices complicated or vulnerable.
Not sure I'll buy another iPad given my current lifestyle.
I think you've unintentionally illustrated the root of the problem here.
People motivated by profit are not incentivized to produce high-quality results. Rather, people motivated by profit are only incentivized to do the least effort that they can get away with.
People motivated by pride are those who are incentivized to produce good results, because the result reflects on them personally.
Which is all to say, pride motives produce a race to the top, whereas profit motives produce a race to the bottom. It's no wonder our modern economy can only produce slop.
Many top bars have become a group of bubbles over the content, which we’ve been conditioned to see as floating notifications for years. Things shine and move when they don’t require attention, just because.
The end result is that my OS feels like a browser without ad blocker. As much as people hated flat design, at least it didn’t grab your attention with tacky casino tricks.
Genuinely believe Apple’s design team are rudderless or have unintentionally been forced to produce something to justify someone’s career, because this whole thing is disastrous.
This is the curse of being a UI designer for a long lived product. Once a thing has been created and future work consists of 99% code and 1% UI, your UI designer job has evaporated. And so we see that everything changes every major release of an operating system, so the UI people can justify their pay checks.
These changes in design are intended to appeal to our magpie brain of wanting the latest, shiniest, things.
You have to understand the vanity of consumers. If every new product looked the same, then a lot of people wouldn’t bother buying the latest gizmo because there’s no magpie appeal. So when the market stagnates, you need to redesign the product to convince consumers to throw away a perfectly good, working device.
And it usually works as a sales strategy too.
So designers then get told they have to come up with something that looks newer and more futuristic than the current designs. Regardless of how much those designers might love or hate those current designs.
They come up with this shit not to justify their jobs but because they’re hired exactly to come up with this shit.
UX folks usually have no understanding of the impact of moving a common control and/or keyboard shortcut.
If Company X didn’t reinvigorate their product line then consumers might switch to Company Ys products because they look shiny and new. Which is literally why people switched from BlackBerry et al to iPhones in the previous decade.
Consumers are fickle and want that dopamine hit when they spend money. I know this and even I find myself chasing shiny things. So there’s no way we can change that kind of consumer behaviour.
To be clear, I’m not saying it’s right that companies do this, but I do think they’d go out of business if they didn’t because consumer trends will continue like this regardless of how ethical companies tried to be.
So the problem here isn’t that Apple tried to refresh its operating system look. It’s that they completely jumped the shark and created something that was too focused on aesthetics while failing in literally every other metric.
Can't you swipe past the end on the tab bar (along the bottom by default) to create a new tab?
It’s relatively recent in iOS history that Safari’s address bar is at the bottom. There’s a setting to move it back to the top. This specific example is probably as innocent as a default getting accidentally changed during the development process.
There are some things that are nice. The dock looks nice. The transparent menu bar is nice enough too and there is a toggle off switch if it doesn't work for you. Spotlight looks fine. But the rest is so bad that I just cannot fathom how someone at Apple did not stop it before release. I would be throwing a fit to stop it from being released if I was in Apple and had any sway at all. I assume the executive team was all looking at it and using it before release. So how did this happen? The new side bar and the new tool bars are abominations. I cringe every time I have to use the finder; it is just a blob of various shades of white or, if you prefer, dark mode, grey.
My hope is that if nothing else they roll back the sidebar and the tool bar changes or do a complete rethink on how they are implemented. If they rolled back the extra rounded corners I wouldn't complain either.
Even Apple's own marketing material had screenshots where text was near impossible to read, even for someone with good eyesight: grey text on top of highly transparent glass... what were they thinking!?
It’s dreadful; it still blows my mind that out of Windows, macOS and Linux, my Linux desktop with KDE has the most premium experience now.
My first rebuttal was going to be Windows 8, but that was actually a lot better.
Windows 11 is, I think, worse than MacOS these days, half for still dragging the past along with it, and half for introducing a second start menu just for ads.
Keyword “looks”. Because considering behavior, there’s tons of delay introduced and results change under your finger as you’re selecting them, causing you to get the wrong thing.
More than likely designers are making up work to justify their jobs. Not good for your career if you admit the desktop interface was perfected in ~1995.
Edit: On Linux, you have desktop environments like LXQt for this. Unfortunately, last time I checked, Wayland was not supported.
Acrobat reader still performs like a lead balloon though, even a miracle can't fix that one.
Contrast this with the "OS" of my LG OLED monitor. It seriously takes 5 seconds to open the settings menu.
I'm not sure what they use these days, but 10-15 years ago the MCU in a monitor was likely to be a ~10MHz 8051.
Not to mention the fact that first, you have to get to a point where AR wearables are commercially viable, and we don't seem to have hit that point yet.
It has incrementally improved, and gotten cheaper, to the point that I now see them everywhere. When they first came out, they were pretty expensive. Remember the $17,000 gold Watch (which is now obsolete)? The ceramic ones were over a couple of grand.
But the dream of selling Watch apps seems to have died. I think most folks just use the built-in apps.
I think the current rumor is that development of a cheaper XR headset has been shelved in favor of working on something to compete with Meta's AI glasses.
If it were around the $500 point I’d pick one up in a heartbeat. Maybe even $1000. But $3500 is nuts for how little they’re offering. It seems like a toy for the ultra rich.
I assumed the price would eventually come down. But it seems like they’ll just cancel the project entirely. Pity.
It's going to take a revolution on miniaturization AND component pricing for XR to be feasible even for enterprise use cases, it seems.
I'd buy one if I could use it with my Linux (KDE) workstation, but there's no chance I'm going to be using it via a mac.
But being tied to Apple's ecosystem, not being really useful for PC connection, and the fact that at least at the time developers were not making any groundbreaking apps for it all makes it a failure in my book.
If Valve can get 60% of that and be wirelessly PC tied for VR gaming then even if they charge $1800 for their headset it will likely be worth it.
All of them immediately hate that it’s bulky, it’s heavy, it messes with your hair, messes with your makeup, doesn’t play well with your glasses, it feels hot and sweaty. Everyone wants to take it off after 5-10 minutes at most, and never asks to try it again (even tho the more impressive 3D content does get a “that’s kinda cool” acknowledgment).
The headset form factor is just a complete dud, and it’s 100% clear that Apple knew that but pushed it anyway to show that they were doing “something”.
Wearable products, outside of headphones, have a decade-long dismal sales record and even more abysmal user retention story. No board is going to approve significant investment in the space unless there's a viable story. 4x resolution and battery life alone is not enough to resuscitate VR/AR for mass adoption.
Outside of headphones and watches
I would see nine Garmins for every one Apple Watch, for instance, and many more people wearing cheap Casios or no watch at all.
I must admit I don't understand the point of a smartwatch when most people have their smartphone in their hand a significant amount of the day, and smartphone screen sizes have been increasing over the years because people want to be able to doomscroll pictures and videos and interact with WhatsApp all day. I don't know how you can do that from a tiny screen on a watch.
Those like me who don't subscribe to that way of living don't want distractions and notifications, so they use regular watches and would see a device that needs to be charged every few days as a regression.
Some people said payments, but I see people paying with their smartphone all the time; since they have it at hand or in a pocket very close by at any time, having it in a watch doesn't look like a significant improvement. I'd be curious to see a chart of smartwatch adoption by country.
I do agree though, anecdotal experiences will vary depending on the kind of people you hang out with. For the people I know heavily into running and cycling, brands like Garmin are over represented. Meanwhile lots of other consumers practically don't even know these are options.
[1] https://www.mordorintelligence.com/industry-reports/europe-s...
In recent weeks, I’ve been getting push notifications about VP.
They hired Alex Lindsay for a position in Developer Relations.
And there’s the M5 update.
Just remember, it’s a lot cheaper than the original Mac (inflation-adjusted). Give it 40 years... hell, given the speed of change in tech these days, it won’t even take 10.
(Apologies to @cyberge99 if my tone comes off intense, this is not to come at you but rather is just me venting my frustrations with Apple. I think you are correct in your assessment of the idea here.)
All the people I know describe this use case first: “It will be awesome when it replaces my 2x34" screens”. I described it to the salesman when he asked me why I wanted to try it. He never showed it. I gave him 0/5; he complained; I explained this is specifically what I asked for. You can emulate one screen in Vision Pro, but it’s absolutely obnoxious about making everything about apps and iPhotos 3D whatever. Users desire it. Apple is hell-bent on not addressing that use case, and on addressing family use cases first.
Imagine they found a proper UI to visualize an infinite TypeScript file. Something like: flinch and you find yourself in a method; look elsewhere and you immediately see the other method. Make it viral by making it possible to write files in a way that is not practical for normal-screen users, like the old “max 1 screen height” limit. View your team in the corners of your vision. THE tool for remote working.
Workplaces would look futuristic. Your experience at the workplace would be renewed thanks to Apple.
And then, reuse the desktop’s UI on VisionPro instead of the desktop using VP’s concepts.
But no, Apple prefers killing off Vision Pro and imposing Liquid Glass on everyone. (Now waiting for my threat letter from Steve Jobs for suggesting ideas.)
Ummm, you know he died yeah?
It seems much more likely that the driver here was to produce a UI that was resource intensive and hard to replicate unless you control the processors that go into your devices as well as the entire graphics processing stack that sits above that as well. It seems created to flaunt the concept of "go ahead and try to copy this" to Google and Microsoft.
Even if Apple is right, why shoehorn the future into the present on devices unsuitable for its new paradigms? The iOSification also only worsened the macOS UX. It's one of the reasons I moved to Linux with KDE which I can configure as I like.
If they want to make the AR OS of the future, then make it on the Vision Pro where it belongs.
No, this is the fault of a company and industry with way too much money and not knowing what to do with it.
So they hired a bunch of artists who would otherwise be carving wood in a decrepit loft somewhere after taking half a mushroom cap. These people now decide how computers should operate.
I remember watching a documentary from the 80s where Susan Kare explained how every pixel in the Macintosh UI was deliberately put there to help the user in some way. One lady did the whole thing, the whole OS.
Now we have entire teams of people trying to turn your computer into an avant-garde art piece.
…brother, you’ve just described the history of the personal computer and the Internet. It’s not the hippie artists causing this problem, I promise you that.
https://www.mondo2000.com/the-inspiration-for-hypercard/
The last decade or so of Apple designers have been as out of their minds on ego and cocaine as Donald Trump Jr.
VisionPro was meant to literally overlay its interface over your field of vision. That's a very different context and interaction paradigm. Trying to shoehorn the adaptations they made for it into their other, far more popular interfaces for the sake of consistency? It's absurd.
Things like “human interface guidelines” get written by nerds who dive deep into user studies to make graphs about how target size correlates to error rate when clicking an item on screen.
Things like Liquid Glass get designed by people who salivate over a button rendering a shader backed gradient while muttering to themselves “did I cook bro???”
They’re just two very orthogonal cultures. The latter is what passes for interface design in software these days.
Apple looked at innovations in hardware form factor and, rather than trying to out-innovate in that sphere, said, instead: how do we make something in software that nobody would ever try to imitate, and thus position ourselves as the innovators once again?
And the monkey's paw curled and said: Liquid Glass is a set of translucency interactions that are technically near-impossible to imitate, sure, but the real reason nobody will try to imitate is because they are user-hostile to a brand-breaking extent.
And Apple had nobody willing to see this trainwreck happening and press the brakes.
iOS 7 relied heavily on blurring effects, a flex at the time due to the efficient graphics pipeline they had vs Android. This was coming off the heels of Samsung xeroxing their designs, and they wanted a design that would be too expensive for competitors to emulate without an expensive battery hit. Liquid Glass is following in this tradition.
And similarly to iOS 7, the move to flat design was predicated on the introduction of new form factors and screens. Flat design lent itself well to new incoming screen sizes and ratios. Prior, there were really only one or two sizes, and that was it: easy to target pixel-perfect designs against. As Apple moves to foldables and more, this design flexibility is once again required.
As for no one trying to emulate it, I'm not so sure; OriginOS 6 ripped it off back in October.
It also contributes to obsolescing older hardware.
Where what we really needed was a stable release version (now a year late from the original promised date) so we can build out UI components for the content editors to use that don't require constant design tweaks.
You know the designers are:
a) Just fucking around having fun
b) Making busy work to drag it out as long as possible
As it's now 4 years since they began working on the "design system", there's a good chance it will get canned as there's some more modern design they will want to use.
This has been solved with a button that switches the layout between the two designs, when I'm making changes it is sometimes necessary to flip back and forth between the two mid-change.
The anti-design bias in this forum is genuinely unhinged. I see some saying the entire destruction of the natural world stems from design lol.
"Behind every great fortune is an equally great crime."
https://www.britannica.com/biography/Honore-de-Balzac/La-Com...
It is 2026 and UIs are still abysmally slow in many cases. How is that even remotely possible? Now, with that in mind, consider (just for a moment) why people might think that UX people don't know what they're doing.
Because UI/X teams were separated from engineering. (Same thing happened with modern building architecture)
It's fundamentally impossible to optimize if you're unaware of physical constraints.
We need to get rid of the "It's okay to be a UI/UX designer who doesn't code" cult. (Looking at you, Adobe and Figma...)
We have convinced ourselves as an industry that this is not true, but it is true.
I don’t think designers who don’t code are really a problem. They just need to dogfood, and be lead by someone who cares (and dogfoods as well).
Switching windows between two non-Apple monitors after waking from sleep is wildly unpredictable, and has insane UX like windows resizing themselves after a drag.
My CarPlay always starts something playing on my car speakers, even when I wasn't listening to anything before connecting. It's so off it's comical.
The iPhone alarm will go off like normal, loudly from the speaker, even if you're currently on the phone and have it up to your ear. This has been a problem since my very first iPhone.
There has been a bug where plugged-in physical headphones are sometimes unrecognized after waking from sleep, even if they worked fine going into sleep. I checked once, in probably 2014, and Apple's official response was that it literally wasn't physically possible, despite all of us experiencing it. The bug was ancient even at that time, and more than ten years later my M4 MacBook Pro STILL DOES IT.
Apple and Apple fanboys seem to take the stance that these are all user error on my part (remember the "you just aren't a Mac person" era?). I bet some of these are configurable with settings deep in some menu somewhere, so from a certain perspective that's right, but it also underscores my point about the limitations of myopic dogfooding.
As a fun aside, the UX for turning on the "VoiceOver" tutorial is the worst thing I've ever experienced on an Apple device. I was laughing out loud trying to figure out how to get out of it instead of finishing the unknown remaining steps. I feel bad for folks who need that accessibility in order to be effective.
Yes. Yes, it has. I'm currently in the midst of a building project that's ten months behind schedule (and I do not know how many millions of dollars over budget), and I'd blame every one of the problems on that. I - the IT guy - was involved in the design stage, and now in construction (as in, actually doing physical labor on-site), and I'm the only person who straddles the divide.
It's utterly bizarre, because everyone gets things wrong - architects and engineers don't appreciate physical constraints; construction crews don't understand functional or design considerations - so the only way to get things right is for someone to understand both, but (apart from me, in my area - which is why I make sure to participate at both stages) literally no one on the project does.
Seen from a perspective of incentives I guess I can understand how we got here: the architects and engineers don't have to leave their offices, and are more "productive" in that they can work on more projects per year, and the construction crews can keep on cashing their sweet overtime checks. Holy shit, though, is it dispiriting to watch from a somewhat detached perspective.
I don't think anyone seriously believes Uber, Airbnb and Robinhood won because of "beautiful apps".
Unfortunately, most of the SW industry isn't even aware of the difference:
For beauty you hire a graphic designer
For usability you hire a PhD in cognitive psychology
RH made a lot of investment tools accessible to people whose attitude is "I just want to buy stock in some company". I used tastytrade for a while; their mobile app has all the functionality, but realistically you will just use it to glance at your portfolio overview.
Things got pretty bad. More than 95% of all employees (and I'm guessing 99% of designers) were using iPhones at the time. There would be rough edges all over the Android app, but as one of our designers said "people with taste don't use Android".
Imagine knowing that most of your new users were getting a subpar experience, and that not being enough motivation to expense a flagship Android and drive it daily.
But the new users kept coming, and despite mostly being Android users, they still used the product. Turns out that legacy taxis are themselves an ugly interface, and ugliness is relative.
Probably true at the time.
Probably the vast majority of profitable Uber users were still on iOS, though, like most apps?
> but as one of our designers said "people with taste don't use Android".
Based lol
In my opinion, this article had very clear and direct criticisms; they were hardly "anti-design bias". The increase in visual clutter is, for sure, a net loss for macOS Tahoe.
It's slow, bloated, buggy and ugly. Probably one of the worst apps running on my phone.
But there was a time when their app was native and was actually quite good.
Though if we could get the newer settings panel of macOS a few versions back, before they inexplicably ruined the best OS GUI settings interface I’ve ever used, that’d be great.
But since then, each new version of Windows has made me more and more grateful for not having to deal with that dumpster fire on my personal devices.
The saddest part to me is that I have the strong impression it wouldn't take that much work to turn Windows into a much better system. But for whatever reason, Microsoft is not interested in making that happen. Maybe they are incapable of doing so. But the system itself is not the reason.
User interfaces are not art.
Do UI designers think that way?
I imagine some see it as engineering - make things work efficiently for the users. Others see it as art. The outcome will depend on which group gains the upper hand.
1. "Picasso, that's the wrong way to depict a human nose."
2. "Picasso, that's the wrong material, that vibrant paint is poisonous and will turn to black flakes within the year and the frame will become warped."
I interpret parent-poster's "interfaces are not art" as meaning they're mostly craftsmanship.
It may not be quantifiable enough to be labeled "engineering", but it's still much less-subjective and more goal-oriented than the "pure art" portion. All these interfaces need to be useful to tasks in the long term. (Or at least long enough for an unscrupulous vendor to take the money and run.)
- Arguably the dock, though it's probably contentious
- Ubiquitous instant search (e.g. Spotlight)
- Gesture-based automatic tiling of windows to the left/right side of the screen, tiling presets
- Smooth scrolling, either via scroll wheel or trackpad
- Gesture-based multitasking, etc.
- Virtual desktops/multiple workspaces
- Autosave
- Folder stacks, grouping of items in file lists
- Tabbed windows
- Full-screen mode
- Separate system-wide light and dark modes
- Enhanced IME input for non-Latin languages
- App stores, automatic updating
- Automatic backup, file versioning
- Compositing window managers (Quartz, Compiz, DWM, modern Wayland compositors...)
- The "sources bar" UI pattern
- Centralized notification centers
- Stack-view-controller-style navigation for settings (back/forward buttons)
- Multi-device clipboard synchronization
- Other handoff features
- Many accessibility features
- The many iterations of widgets
- Installable web apps
- Virtual printers ("print to PDF")
- Autocomplete/autocorrect
- PIP video playback
- Tags/labels
- File proxies/"representations"
- Built-in clipboard management
- Wiggle the mouse to find the pointer
None of these can be said to be at their final/"perfect" form today, and there are hundreds if not thousands of papercuts and refinements that can be made.
The real issue is probably due to management misunderstanding designer's jobs, and allocating them incorrectly. The focus should be more on the interactions and behaviors than necessarily on the visuals.
The Dock came from NeXTSTEP circa 1989. It had square edges and no Happy Mac. (So did Mail.app, TextEdit, some of the OS X Finder, and a whole bunch of other things.)
To the untrained eye it looks like an Apple innovation, because most people couldn't afford NeXT computers unless they worked in a university or research lab.
I don't need or want art, eye candy, or animations. I need to get work done and the rest of the OS to stay tf out of my way.
Designers at Balenciaga don't have to justify their jobs when they make oversized t-shirts, neither do the ones at Apple.
In actual tools, the form and function are strongly connected. Tools of competing brands look pretty much the same, except for color accents, because they can't look any different without sacrificing functionality, performance and safety characteristics.
You don't see power tool vendors trying to differentiate their impact drivers by replacing rubber handles with flat glass because it's more "modern", because it would compromise safety and make the tool unsuitable for most jobs its competitors fulfill. This happens in software mostly because the tools aren't doing much of anything substantial - they're not powerful enough for design to actually matter.
> No project manager ever got promoted for saying "let's keep things the same".
Maybe stakeholders were calling the shots and everyone was like, "Fine. If you want us to reuse the same icon for different purposes, you're the boss. We are done trying to explain why this is a bad idea."
... your career requires constantly chasing after what amounts to fashion trends every few years, otherwise it's a solved problem and probably does not provide much of a career
- UI/UX pros who understand this stuff: “I hate it” - everyone else: “I didn’t notice until you pointed it out”
I bet they’ve sold approx as many as they thought they would.
This product is a placeholder for a cheaper and lighter one in the future.
A few days ago I booted a very old device running High Sierra, and the UI and old Dock looked so clean.
That desktop was peak for me, and the age starts showing a bit in Finder, but it's still more usable than today's versions.
Also absurd is that tabs and menus are not attached to their elements.
If you run your own design agency, your company's reputation and your own are on the line, so be as opinionated as you find necessary. But if you're just an employee without an inordinate amount of clear authority within the scope of your discipline at a large company (you know whether you have it or not), don't try to create a mutiny. It will more than likely be a childish assumption of personal risk on your part, far more than it costs the company and far more than anyone else needs to care about, all because someone on a forum told you to be passionate about round rects or small icons or whatever. If you need to tell your boss "google NN group", you probably don't have the trust or experience to be successful with such a play.
It's okay to have a personal hatred of it and do what you can to steer the work appropriately, but when you're tasked with a dumbass plan, let it be the decision-makers' dumbass plan, unless it's your decision to make. Let it be the project that was tried and didn't (and couldn't have) worked out, which sometimes happens; you learn, and then leave if it's pervasive and you have other options.
It would be remarkably stupid to single yourself out as the person who thinks of themselves as the reincarnation of Steve Jobs and risk your livelihood to save Apple's reputation. The unlikely upside is that you get your way and that can boost your confidence, but the downside is that you fumble your best shot at financial security for the rest of your life because you thought you'd be received well.
That's not to say you should never say no, or shouldn't have love for your craft; just don't pretend it's your job to, unless it is, which it probably isn't. Disagree and let it be a failure if it's going to be; feel vindicated if it is, but the money is there for you if not. The people who worked on the Vision Pro aren't responsible for it being a dud product, and they can be proud of what they did design-wise and technologically despite that.
In the long run, no, you don't want to set that much of your taste or expertise aside forever, but you shouldn't have to; that comes with all the things I said: trust and agency.
The only UI change that I've found useful since Yosemite was Mojave's introduction of a dark mode. They made the fonts look worse on non-Retina displays, threw out the Preferences panes in favor of a weird list that can only be resized vertically, added transparent everything, and banned any icon that's not in a squircle. Such UI, many differentiation, much insanely great, wow!
Anyway, I bought a ThinkPad.
Soon, I'll get my hands on one of those fancy AMD AI Max's and go Linux everywhere.
Edit: Oh there is an icon to disable it, but still.
Not to even mention hardware support, as I had a lot of issues with Realtek external USB network devices randomly disconnecting (and they are in many USB-C hubs, including inside USB-C monitors), with no such issues under Ubuntu.
I imagine there is some history around MacOS being similarly much better in the past, but I've never seen anything great about MacOS UI/UX in comparison to GNOME.
I do like their performance and battery life, but the "shells" the hardware is stuck in also suck (until recently, only glossy screens; shallow keyboards with sharp palmrest edges; either heavy or passively cooled; no touchpad buttons...). Putting some of this hardware into a new ThinkPad X1 Carbon case would be amazing, though I'd want to run Linux on it.
But on the other hand, I think 95% of the icons in the first menu in this article are clear and probably help most people navigate faster.
I wait and hope that at some point Apple will revert certain changes, or at least give more options for disabling some of this stuff.
That said, Apple's Liquid Glass is really poor UI. It works okay on my MacBook, but it feels like it has basically broken a couple of features on my second-gen iPhone SE, which is kind of untenable imo. Apple also clearly seems to design for larger devices now, which I get, but... am I any less of a customer because I use an older device? Why should I be de-prioritized?
Lastly, speaking of UI/UX - this blog's website was really bad! Ironic that a blog on UI/UX would have bubbles floating down the screen interfering with text readability and no way to turn them off!
I do prefer this approach because it makes the symbols generally more useful for everything else than menu/toolbar icons. However, as the article makes very obvious, unless a consistent scheme is put in place, program developers will choose whatever they want to represent common actions.
Citation needed. The Open icon is generally a folder with an arrow, or a file, as a metaphor for taking a file out of a folder. I just tested with MS Windows XP, MS Windows 7, MS Windows 10, and MATE/GNOME 2.
Including the hotkeys in the menu is good for similar reasons. Does it help me find and click the menu item? No. But does it help me use that action next time without going through the menu? Yes. Icons are same.
Imho the best layout for menu bars was Windows Phone 7's. In WP7, a toolbar of action-button icons was shown along the bottom of the screen, along with a kebab button. Clicking the kebab button would expand the bottom bar into a menu showing the icons in the same order as in the toolbar, along with a text description. Below the toolbar icons would be all other non-toolbar commands.
It made it clear that the toolbar and the menu were the same thing, just the toolbar is an abbreviated form of the menu for the sake of economy of screen real-estate.
Putting icons throughout menus is kind of a cruder version of same. I like that.
I consider myself quite tolerant of UX quirks; iPhones are still pleasant to use, particularly if you select "Reduce Motion" in the accessibility settings.
Tahoe though, bugs aside, is just genuinely unpleasant to use and interact with. By far the most offensive thing to me is the pointless rounded rectangle thing. It delivers absolutely no value at all to the user and defies any form of justification. How in any form is this a decision designed to improve things for the user?
The multiple other weirdnesses commented on elsewhere, while unpleasant, are more liveable with, but I honestly never found a single change that improved my interactions with the computer. How on earth can you have spent a whole year on this, and why didn't anyone have the authority to pull the plug?
I would no longer recommend a new Mac to anyone. A second-hand model running a previous operating system makes far more sense.
It all seems a bit needlessly tricky. Frankly though - for me at least - it's worth the trouble.
A brand new Apple iPhone 17 Pro.
Constantly lagging and locking up in preparation for another transparent animation that absolutely no one asked for.
Feeling like Apple just mugged me and stole $1,000.
PS— Edward Tufte has some interesting perspectives on data visualization, but the reason he's so popular with the engineering crowd is that he was an engineer: he makes cut-and-dried rules about things that are easy to understand without any design education, and explains them in a way that appeals to engineers. Reading that book is better than nothing, but it gives laypeople about as much understanding of design as a "[language] cookbook" gives laypeople an understanding of programming.
The first thing I did with Tahoe was go into the System Preferences to try and turn off as much as possible of the new UI because it's the biggest regression in Mac OS history (at least since I've been using Macs).
I used to mock Windows for the Explorer UI and general GUI experience, now I think I prefer using my Windows 10 PC over my Mac. It's just such a fucking mess, so inconsistent, shitty performance (even on brand new macs and phones), is actively harder to grok, much harder to use for my elderly parents, and doesn't even look "cooler" or "better" in any way. It's just worse on every possible metric, and made me start wondering about an Android phone, which has never happened since I bought an original iPhone.
I am and have been the ultimate Apple fanboy since 1992, but this release fucking sucks balls. I hope you're listening Apple.
I would wear the t-shirt with Reeder screenshot to work if I worked at Apple, and would observe who notices it.
If it can't be explained in words here on this site, could you please tell me what I'm looking at/for in the screenshot?
Looking at it for 30s, I still don't understand what Apple was trying to do. What am I supposed to believe happens to the table as it goes under that floating menu? It clearly doesn't seem to continue all the way to the left edge of the window. Why not? If not, why bother with that whole floating menu concept if the underlying content arbitrarily stops at the menu?
The most surprising part to me is how people keep calling that nonsense "skeuomorphic" when it doesn't replicate any kind of physical intuition known to mankind. It's just made up physics that looks dumb.
It seems like the preview for not-fans-of-Elon is also missing a screenshot?
Are there some perverse incentives to having the OS upgrades be free? Is that what is causing this? Do they simply have no taste?
Same name, same menu location, same shortcut key.
Much easier to train yourself or your employees - learn one app and you'd be familiar with the other apps in the Microsoft Office product line.
Other companies that tried to create Office-like suites didn't/couldn't create the consistency amongst the apps since they had to acquire the missing apps from other software makers to complete their own Office-like suite.
But this consistency was tempered by common sense.
The goal of shortcut keys was to make common actions quickly accessible to users. But for consistency, the same shortcut key should be used across apps.
When Mail/Outlook was introduced, users found that CTRL-F was bound to the Find command. Makes sense on first thought. But what's the most common command in an email app: is it "Find" or is it "Forward email"? Especially when the prevailing standard in email apps was CTRL-F for Forward.
When Bill Gates angrily complained that he was always invoking the Find command by mistake, Program Managers were willing to make exceptions to dogma.
It'd be interesting to compare the menus in Apple's iWork suite (Pages/Numbers/Keynote) to see if their menu items/shortcut keys are consistent or unique.
Note that this Microsoft Office suite consistency didn't necessarily extend to other Microsoft apps. There's no one managing menu item consistency company-wide at Microsoft, just within Office.
Apple Human Interface Guidelines were foundational to Win 3.1. At the same time, doing UI on Unix I was always asked to solve problems by “doing it the Apple Way.” My secret was the MITRE Corp UI book which surveyed best practices from all platforms and underlined the reasoning.
I wish that OS developers would provide the option to retain the bulk of the old UI when a new one comes out, implementing the UI like a swappable "theme." People who prefer consistency could keep most of their old UI, and those who prefer the newer UI can have it.
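On the web, at least, that kind of swappable skin is cheap to build. A minimal sketch using CSS custom properties (the attribute selector and palette values here are made up for illustration, not any vendor's actual theme system):

    /* Derive every colour from custom properties, then let a single
       attribute flip the whole skin without touching the markup. */
    :root {
      --bg: #ffffff;
      --fg: #1a1a1a;
      --accent: #0066cc;
    }
    [data-theme="classic"] {
      --bg: #c0c0c0;   /* hypothetical "retain the old UI" palette */
      --fg: #000000;
      --accent: #000080;
    }
    body {
      background: var(--bg);
      color: var(--fg);
    }

That OS vendors can't (or won't) offer the equivalent of a one-attribute theme switch is exactly the frustration here.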
But where are they? Because they’re not leaving their imprint on any of the big tech companies in recent years.
This new system of icons is something I was entirely unaware of until now, since I haven't updated or had to use an updated MacOS device. I've in the past been a defender of MacOS, but it really does feel like the decision making is completely off the rails and has no consideration of the most basic principles of design. Baffling.
The exact reason/s for this to happen is hard to figure out. Leadership changes, trends, getting too comfortable, lack of competition, the list goes on...
There are always bad reactions to change, but eventually they fade away, because the product turns out to be good and just needed some getting used to. But this time, that's not the case. Liquid Glass sucks, and so does the UX that came with it.
Apple will eventually fix this mess, they have all the resources in the world to do so.
That the icons exist is not necessarily a problem, since they can help teach users which buttons in the UI do which actions. (menu bar for discovery, app UI for less mouse travel + contextual options). But that requires consistency, which the current implementation lacks...
The reasons for all this change are simple.
The first reason is planned obsolescence. The GUI has to change enough such that it looks like there’s constant progress so users think the company is moving forward when in reality GUI design plateaued decades ago.
The second reason is designers need to stay employed. So they change inconsequential things and make up reasoning to justify it. Liquid Glass is one of these things.
I also want to note that the most usable GUI is not the prettiest GUI. The current Mac GUI looks more modern and better than the one in the OP's guideline example. So GUI design isn't just about usability; it's about manipulating consumer psychology.
As consumers ourselves, there are two traps here that people fall for. The first is the aforementioned one: it looks better and feels flashier (like Liquid Glass) but isn't rationally or logically better (in fact it can be worse). Most HNers don't fall for this trap.
The second trap is to think these changes actually matter. Liquid Glass barely changed anything. More icons barely changed anything. This entire blog post is making it out to be a bigger deal than it is when in actual reality the difference is so minor it’s negligible. Every HNer falls for this trap.
Working at BigTech, this is the answer. ICs need to find their own impact. That's how you get things like Material Design 3 which talked about how "Bold" it made a brand look - "Boldness" is something you can measure with user tests, and designers need something they can point to and call success; even if everyone knows it's stupid.
It is as much of an actual business strategy as it is a method used to stay relevant in the company.
For example: "Look how much faster you can find Save or Share in the right variant..."
But each variant took me the same amount of time. Or so I think. But that demonstrates the issue: is any of this being measured and analyzed?
My opinion is that much of design is just "convincing opinion wins" (where convincing-ness is often not at all based on measurement of some kind), leading to crappy stuff like ultra flat design and Corporate Memphis.
Perhaps my biggest gripe is that many of these terrible UI/UX patterns are built in at such a low level, it is near impossible for developers to override them in the software they build. For example, I really dislike flat UI and particularly flat scrollbars. But it is near impossible to add scrollbars that look like these in any Windows or Mac app I build: https://flowingdata.com/wp-content/uploads/2024/02/evolution...
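On the web you can at least get part of the way there. A rough sketch of a classic beveled scrollbar using the WebKit/Blink scrollbar pseudo-elements (Firefox exposes only scrollbar-width and scrollbar-color, and native Windows/Mac toolkits offer nothing comparable, which is the complaint):

    ::-webkit-scrollbar {
      width: 16px;
    }
    ::-webkit-scrollbar-track {
      background: #c0c0c0;               /* the classic gray trough */
    }
    ::-webkit-scrollbar-thumb {
      background: #dcdcdc;
      border: 1px solid #808080;
      /* fake the raised, beveled look of the old thumbs */
      box-shadow: inset 1px 1px 0 #ffffff, inset -1px -1px 0 #808080;
    }
    ::-webkit-scrollbar-button {
      height: 16px;                      /* bring back the arrow buttons */
      background: #dcdcdc;
      border: 1px solid #808080;
    }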
Usability has become theatre across Apple products. The sad part is that since Microsoft just seems to copy Apple, over time Windows usability has also degraded severely. I am so frustrated by what Apple, Microsoft and Google have done.
Why has this happened? Everything is too easy. You used to have to have pretty good intelligence to break into UI design and software development. Now, anyone can do it with 30 minutes of "e-learning", and, therefore, the average IQ of a UI designer/or software engineer has decreased, dramatically.
Exhibit A: In Safari I had to "share" this page to use the "Find on Page" feature to search whether anyone had mentioned the share button yet. Bonkers.
You can also long-press on the address bar to see a whole slew of even more hidden functionality
(seriously though, thanks).
Feels like this could be a riff on Android's 'Share' functionality, which is actually the user-friendly name for "send an Intent". And that means that any inter-app handling ends up stuffed into a "Share" menu that pretends only social media apps exist. So you do "Share -> Edit with Photoshop" or similar.
Or type your search in the address bar, scroll to the bottom to the "On this page Find ..." functionality.
A rare example of two different ways to do the same thing on iPhone
I'd argue it's not comparable to the 1992 standards, because the hotkey labels on the right don't read as clutter once they're dimmed. Those guidelines were written only slightly into the Mac's colour era, with an extensive install base of monochrome Macintoshes where you could only depict dimming with hard-to-read dithering. Now that colour is ubiquitous, designers have the option to fade or tint UI items to make them less distracting, or to deemphasise them.
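For what it's worth, the dimming itself is trivial once colour is available. A sketch of the idea in CSS (the class names are hypothetical):

    /* Keep the shortcut column legible but visually secondary,
       so the item labels dominate the scan. */
    .menu-item .label    { color: #1a1a1a; }
    .menu-item .shortcut { color: rgba(26, 26, 26, 0.45); }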
Even without a designer's detail-aware eyes, I couldn't stop facepalming with what I saw. I'd be embarrassed to ship that and call it an improvement! Apple icons may look cool and consistent when browsing them all together as part of SF Symbols, but all that disappears when used incorrectly.
Reading that was hard: an animation of snow completely blocks my ability to read.
Additional thought: It's always interesting to see a website linked from HN with a method for user input and a means to see what other users have posted. Just a funny juxtaposition between the buttoned-up ego (not in the pejorative sense) of the HN poster vs the screaming id of that same user on a different platform.
.icon {
    text-align: justify;
    display: none;
}
It's the software equivalent of fast fashion. Just avoid it and stay with tried-and-true staples instead.
The past wasn't as rosy: while the left column is purposefully confusing (the icons don't match the description), the lack of keyboard shortcuts is just bad design "for the sake of emptiness". It degrades, rather than enhances, the "usability of the interface"
And re. icons: while this is correct
> The main function of an icon is to help you find what you are looking for faster.
The following isn't as straight-forward
> Perhaps counter-intuitively, adding an icon to everything is exactly the wrong thing to do. To stand out, things need to be different. But if everything has an icon, nothing stands out.
Not really; there is plenty of difference: length, the presence of an ellipsis (...), the presence of a keyboard shortcut, the icon itself. This combo gives visual cues without reading, so it improves the ease of finding. Yes, it would be better to have colors here and there, but then you have the following fundamental issue:
> Look how much faster you can find Save or Share in the right variant:
But "Save" is something I do NOT want to find fast, I never use that menu item! So I'd prefer the slighly worse (but not bad) busy version rather than a highlight of useless menu items and no icons for the menus I'd actually use!
Of course, there is an easy way out - user customization to match user needs (maybe you never use shortcuts, fine, remove the "noise" then; maybe you don't care about "Save", fine, remove the icon there), but that was anathema even in 1992.
(but otherwise very good criticism of the basic design fails like tiny size, inconsistency, lack of vertical alignment, bad metaphors, etc.)
QGIS is free software, so it can be somewhat excused vs a billion dollar company. But they could really benefit from some UX expertise...
But what's interesting is why such hypocrisy persists - in particular, why is it so bad now?
Yes, designers might need to make work for themselves; yes, a new OS has to seem new, to justify upgrades and convince younglings that this isn't the oldsters' ride; but haven't these always been true?
What's different is Apple's slide into disorganization, as it spreads work around the globe in exchange for market access, and internal leaders coast in their mutual non-aggression pacts.
What remains of the center can issue global orders (adopt the liquid glass aesthetic; put icons on every action) and the periphery can comply - nominally, imperfectly, and inconsistently. Quality issues come to be tolerated like chronic inflammation, and even deployed in passive-aggressive turf battles.
"Back in the day" everyone would be pulled into a rock-tumbler room and grind it out. That's neither possible nor wanted today (as game theory effaced the requisite obliviousness).
What to do? Many YC companies have bonding time, where scattered teams join up for intense periods to restore alignment. Otherwise, Apple might be ripe for a round of organizational consolidation.
Personally, I think internal competition with some misses and inconsistencies are a good thing long term. Inflammation is not cancer, and there are better ways to tamp it down.
Apparently I'm qualified to be a designer for Apple.
links for those interested in downgrading:
https://github.com/LukeZGD/Legacy-iOS-Kit/wiki/How-to-Use
https://github.com/LukeZGD/Legacy-iOS-Kit/wiki/Restore-32-bi...
But also do provide 15 different themes for the OS, and make sharing themes trivial and built-in, so that you can upvote a theme you like (or even a specific icon you like), downvote one you don't, and install the most popular theme in a few clicks.
"It shouldn't look cluttered" --> "Apply ever increasing amounts of padding/margin everywhere"
"keep it simple" --> "monochrome is the happy place", etc
etc
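Caricatured as a stylesheet (nobody ships exactly this, but squint and you'll recognise it):

    * {
      margin: 24px;      /* "it shouldn't look cluttered" */
      padding: 24px;
      color: #888;       /* "monochrome is the happy place" */
      background: #fff;
      border: none;      /* affordances are clutter too, apparently */
    }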
Those are all hardware upgrades that Apple profits from. What incentive does Apple have to make the App Store better, or improve the visual clarity in the iOS and macOS interface? Shouldn't we be seeing downward pressure there too, if innovation can be generalized to software?
Most users don't have a say in the matter, and Apple has exploited their ambivalence for decades. If you're the sort of person who cares, you're not Apple's target audience.
This is only an opinion, but it feels like UX in general is moving towards making things cute rather than usable. Liquid glass is a case in point.
This is useful up to a point, in the same sense that one might value beauty, but when it's all you care about, it becomes a problem.
One small nitpick: the ellipsis reuse in item #8 - that’s actually valid and would have been back in the 1990s as well. A menu item that is followed by an ellipsis indicates that selecting that menu option will open a dialog box. Inconsistently applied, but that’s always been the meaning.
He has a knack for putting words to the vague frustrations I feel but can't quite articulate. How does he find so many perfect examples that nail exactly what's wrong?
Then something changed.
The Touch Bar was a miss. Launchpad was a miss. I don't see a use for "Stage Manager". iPad has gone to shit. Widgets came back, on Mac... for no reason.
The new Spotlight search is one of the better features of Tahoe, but I'm fine with Alfred. By and large, there are more annoyances in Tahoe than improvements.
For the iPhone and Liquid Glass, I am convinced that it was done to force people to upgrade to a new device. There *has* to be a reason to upgrade, and when hardware and software features plateau, then we enter the planned-obsolescence era.
With the potential to set off the installation flow with the wrong click (when it's being shown over and over again), it makes me anxious and feel like I'm not even in control of my own computer anymore.
For the time being, I've installed a management profile to defer updates, disabled the Settings options for automatic updates, and used "Quiet You!" to try and keep the notifications at bay.
But the maximum deferral time for profiles is 90 days, so if anyone knows of a better solution or work-around, please let me know
I agree the icons look cluttered, but they are likely addressing the fact that most users may not comprehend the meaning of those actions.
New design, new features, new programming language, new products, or even a new sector. It is like Apple without Steve Jobs all over again. None of it was done because it is better for customers; it was done because it is better for paid promotion, justifies the existing department budget (or increases it), and is simply good for profits.
In many ways Apple is still best of the pack, but they are no longer the same.
To all the people also whining about the snow, as if it invalidates the opinions written in the article, again, PLEASE click the buttons at the top of the page.
The author is clearly going for a fun, whimsical, playful vibe that works perfectly fine on a personal blog. Expecting something different from a workstation operating system made by one of the Fortune 10, especially one who heavily markets their design "prowess", is perfectly reasonable.
Things have become very... amateurish. Think of the way these apps got to where they are: who decided to put all those icons there? Probably someone without a good understanding of usability who, lacking domain knowledge, looked at other apps and thought the icons looked pretty, while having only a shallow understanding of their purpose.
Why is that happening? I have theories... hypotheses. Maybe too many managerial types are calling the shots. Maybe we needed more workers than we were able to educate, and the average skill dropped (a lot). Maybe companies realized that poor quality doesn't matter, because either customers don't have other choices, or the choices that exist are just as bad.
Apple's effort to maintain some semblance of consistency across this incredible array is laudable. (Which is not the same as letting the grievances highlighted in this article slide; I agree with the author 100%.) We all want consistency (probably to a degree greater than Apple is capable of delivering) simply so that we can use the metaphors we're familiar with.
I imagine Apple has dozens of design teams, each of which cannot talk to more than a sliver of the others, with probably not a single person aware of exactly how many design teams exist at once. There was probably a period in Apple's history – and probably not that long ago – when a single employee could assess the iconography across the entire suite. Those days are over.
My question: beyond preventing the obvious and severe transgressions (Liquid Glass), what systemic solutions are available on a scale like Apple's to maintain high-quality and strong consistency?
(I appreciate that Apple does generally one design refresh per year, in contrast to the continuous zero-utility tinkering observable in Google's products, for example.)
This thinking is the fatal poison of the tech industry. The further you repeat it, the faster the industry dies. Watch:
"We all want privacy, probably to a greater degree than Facebook is capable of providing."
"We all want browser competition, probably to a greater degree than Microsoft is willing to provide."
"We all want advertisement options, probably to a greater degree than Google can tolerate."
See what's happening here? You're not making a concession, you're flat-out accepting their failure. Apple can provide consistency, they're a trillion-dollar business that has every incentive to compete on their own merits. Instead they carve out arbitrary and harmful rules for each platform and then steelman it when any authority of any kind suggests that they're wrong.
This isn't a "perfect being the enemy of good" situation, it's degraded into "good being the enemy of intolerable defaults" instead.
I'm neither complacent (as you seemed to imply) nor magically hand-waving a "just do it" notion (as you seem to exemplify). I'm seriously interested in what it takes to effectively manage complexity at this scale.
"On the upside: it’s not that hard anymore to design better than Apple!"
However I guess the real pain is that there's still nowhere to go. Switching to Windows or Linux means giving up the efficiency of M-series chips, losing key apps, and losing the consistent menu-bar. It feels like an abyss.
I really hope the recent changes at Apple mean this will get completely overhauled and they'll return to their roots as design leaders. It will be such a shame if this mess is allowed to continue
It's hilarious that it's a great article about clutter, and yet, the post is on a theme that is so badly cluttered it would have been funny... if I could have read the article...
But yeah, maybe it helps that I have the scripting turned off.