I have the same impression. Frankly, I believe the whole purpose of "Liquid Glass" is to create an exaggerated version of the GUI Apple intends to use later on in AR glasses, which will then be toned back in later releases to match what is feasible to implement on the glasses.
The expected migration path seems to be to force all applications now to become blander and less distinguishable from the OS (and Apple services), so that at the end of the journey (in a future AR product) Apple can #1 render those apps consistently without disrupting their UX and #2 present itself as the user-facing service provider for all the value created by those apps (with the app developer being responsible for integration and UX compliance).
It's a dream-scenario. Need a ride-hailing service? Let "the Apple glasses" do it for you. Under the hood the apps are then hopefully so streamlined already that service-providers will compete to be the fulfilment entity for this task.
Probably going to be against the hivemind on this one, but I for one welcome this. Public transport, taxis, flights, hotels, to a degree even restaurants are fungible. I want to get from A to B, I want to have a bed for a few nights in some other city, I want to get some specific food.
That's what I need. What I do not want is to waste time on bullshit to get what I need.
That's why I love the "evil B" of hospitality - I see all available hotels for my travel, the associated price and pictures. I select an offer (usually the cheapest one that still has decent reviews), I click on "book", I confirm my data, that's it. I don't need to wade through dozens of websites, enter my data a dozen times, and then finally enter payment data on yet another shady website that's probably gonna get hacked sooner or later.
I don't want to waste my time researching the phone number of the local taxi company, so taxi.de it is: I just select where I need to go, and a few minutes later a taxi shows up. No need to call someone and spell out my street name to a person who can barely understand me because Germany's phone service is dogshit.
I don't care about which specific Chinese restaurant I want, so I go on Lieferando, and half an hour later a nice load of fried rice with greens shows up at my door. And every time I have to go to a specific place (say for an anniversary) I know exactly why I despise the old way - everyone does seat reservations differently, no integration with anything.
What still irks me is flight booking, because while Google and a fair few other resellers/brokers do at least compare available options of different fulfilment providers, the actual booking I have to do is still on each airline's different web page. And rail travel is similarly annoying once you try to leave Germany.
But it's reasonable that the merchants offering in the marketplace of that ecosystem start to observe how their opportunities to become the next "evil B of X" are increasingly diminished, in favor of being the fulfilment entity for the "benevolent A".
I neither need an "evil B" nor a "benevolent A", and substituting one for the other is not a solution either...
This started with the separation of iPadOS (and the release of Catalyst and SwiftUI 1.0) in 2019. iPadOS is effectively the mother platform - an iPad app can be adapted down to target iPhone, or to target macOS and visionOS (with both of those platforms also supporting running iPadOS apps natively without modification). The L&F changes to macOS in 2020 (with release 11) were heavily about making iPadOS and macOS more visually similar.
It doesn't surprise me at all that a team at Apple tasked with making a consistent HIG and L&F across all products is borrowing heavily from the most recent HIG/L&F they worked on (visionOS).
Aside from the criticism of icons, every complaint in the article just came across as nit-picks.
Developers (and users) are citizens of that ecosystem, serving other citizens and contributing to its economy. It is their right to judge and criticize directions being taken.
The owner of that ecosystem must endure and acknowledge this (especially when it continuously works to make it harder to LEAVE that ecosystem), and other citizens should not take any offense at this at all.
Do they need to acknowledge it? Ecosystems aren't countries, they're markets, and citizenship doesn't exist here in the same sense – only participation in the ecosystem. Maybe there's some EU chicanery that makes it illegal for American companies to ship a UI that's displeasing to European tastes, but if we pretend that Apple is strictly an American company, would they need to acknowledge this at all if it didn't affect sales?
The rest of your comment I don't understand, sorry.
We do, but what I'm suggesting is that Apple might exercise a third choice, which would be ignoring the criticism and pretending it doesn't exist if they don't think it will affect their sales.
Towards developers they will apply "acknowledge and ignore" as the communication strategy; towards users they will continue to project the usual confidence of knowing best what they need, because that's what their users like and that's the only thing they can do now anyway.
In the grander scheme of things, Apple needs to prepare a transition of their locked-in userbase to AR, because there is a risk that AR replaces the smartphone and their users then move away from Apple. So they have to transfer the stickiness of iOS to AR before that happens, to have an ecosystem head start against all competitors.
The actual interests of users and developers are secondary; Apple needs to smoothly transfer them to a new world without alienating them too much. The only way is forward.
However there is one thing that I wanted to comment on here:
> I’ve said this before, but Apple is forcing third party devs to be in service of Apple. The guidelines and rules are meant to sublimate the brands of the third party, and replace it with Apple.
Personally, one of the things that drives me insane is when an app tries to be special and have its own design language (looking at you, Google) on my iOS device. The OS has an established design language, and it really should be used for most applications.
I understand wanting to have a brand identity, but too many apps take it so far that it just leads to a clunky experience.
This is a valid point.
The other side of the story is that the iOS ecosystem is a marketplace where merchants offer goods and services, including Apple themselves.
Apple increasingly wants to decide how you present your brand, to the point that the only brand-language Apple allows on its devices is its own.
I think it's reasonable that merchants in this marketplace feel increased pressure to work less on creating and refining their own identity and more on normalizing their offer so it looks like it could come directly from Apple, at their own expense and financial risk.
But in either case, ignoring the platform's established design language and UI conventions is still wrong, and not taking advantage of the user's preexisting knowledge about how to use their device is a wasted opportunity at best, insulting at worst. If the only reason for doing so is that you are placing your "brand identity" over actual usability and insist that your app look and feel the same on any device regardless of context, that's at the insulting hubris end of the spectrum. Given how widespread that problem is, it seems entirely appropriate and deserved for app developers to feel pressure from Apple (or any other OS vendor) to put more effort into conforming. We as users shouldn't want any app ecosystem to fragment into the mid-2000s WWW full of Flash UIs with zero accessibility.
What valuable preexisting knowledge is ignored if you make your button text readable instead of being blurred with the glassy background?
Obviously, if Apple's committed to taking their UI in the direction of illegibility again, then deviating from their new recommendations may be worthwhile. But the UI design should start by complying with platform conventions, and only break those to the smallest extent necessary, with good reason (which doesn't usually include anything about your app's brand identity). And hopefully, Apple can speed-run the kind of changes they did over the first several years of OS X as they toned down the initial excesses of the Aqua design.
Using non-standard labelling _and_ putting buttons in their non-standard order is a real issue.
Not to mention that on some platforms the position of the buttons may be specific to localization, e.g. they may switch order if the user's language is set to Hebrew.
On Mac, the coloring indicates which action is triggered by default with the keyboard - one of the first things to go if you stylize the UI yourself.
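To illustrate that coupling (a minimal AppKit sketch; the buttons are hypothetical, not from any app discussed here): the default button is simply whichever one claims the Return key, and AppKit renders it with the accent color for free, a coupling that custom-drawn buttons lose.

    import AppKit

    // Hypothetical dialog buttons, for illustration only.
    let ok = NSButton(title: "OK", target: nil, action: nil)
    let cancel = NSButton(title: "Cancel", target: nil, action: nil)

    // Claiming the Return key makes a button the default action;
    // AppKit then draws it with the system accent color.
    ok.keyEquivalent = "\r"
    // Escape conventionally triggers Cancel.
    cancel.keyEquivalent = "\u{1b}"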
These and all the other inconsistencies add up until you have something like the iOS YouTube app, where IMHO the UI is just complete nonsense. Just because you are full screen doesn't mean you can redefine fundamentals about how information is laid out or what taps are supposed to do.
No, it doesn't, since you don't know whether the app is following that convention, so you need to read the label. Besides, I don't think it's even possible to avoid reading a short label while looking at it.
> Consider ... how to arrange the "OK"
This isn't the topic, though, this is:
> Apple's committed to taking their UI in the direction of illegibility
> which doesn't usually include anything about your app's brand identity
Do you have a good example here? To me, identity in design is mostly about a color theme and maybe a few shape tweaks, which has almost no usability impact.
A company like Netflix may have its own experience and expectations around UX for the service it offers, and it might be equally valid for them to want a consistent UX across all devices carrying Netflix as it is for Apple to want it across all of Apple's devices.
But here Apple attempts to define the UX for its devices AND the services offered there. This might be useful and even helpful for the ecosystem when the guidelines are reasonable and supportive of the ecosystem.
But when Apple suddenly decrees "everything should be frosted glass!" and their design language no longer intends to "get out of the way" but to "emphasize Apple above all", it becomes branding, creating a collision of interests...
Are you primarily a Windows user and just want the app on macOS to look and feel like the Windows environment you're used to—essentially wishing there weren't different platforms to begin with, but otherwise dodging the issue? How would you feel if you encountered a macOS-style open/save dialog while working on Windows? Or if an application tried to attach its menus to the top of the screen instead of the top of each of its windows? Or if it responded to Ctrl-C by cancelling an operation instead of copying what you had selected? Or if the window close button was in the wrong corner?
You're just prejudging the answer. Who wouldn't want reasonable and appropriate? The issue is that a lot of these platform-specific defaults are bad, unergonomic legacy.
Case in point: "Or if the window close button was in the wrong corner?" That would be great to have on any platform, since it's a common UI mistake to clump categorically different buttons (non-destructive minimize and destructive close) together, thus raising the cost of a misclick.
That's the main issue with your argument - yes, familiarity is a UI benefit, but the net benefit of following a UI convention depends entirely on said convention.
But the actual reality is that the OS does not matter that much. The application is actually what's useful and exactly why we are even using the computer in the first place. The OS is just a middle man that we can't really cut out but should make itself as transparent as possible.
It follows then that the UI should be designed around the needs of the application and that it should be translated/transposed/copied across platforms regardless of the primary look of the platform. And this is exactly the problem with Apple stuff, focusing too much on superficial aesthetics at the expense of more substantial usefulness.
For example, if Apple hadn't decided to completely re-skin their whole office suite to make it "mobile ready", maybe they could have worked on the actual feature set and performance (awful on both counts compared to Microsoft's offering).
Ironically, when Apple ported iTunes to Windows, they did a perfect copy-paste, going through the trouble of creating their own UI framework specifically for Windows. So it seems that their requirements/demands only apply to others, never themselves.
"Everything meshes with my chosen wallpaper" is a common aesthetic interest, that the article author dismisses because they don't care about it and mostly don't notice their own wallpaper, but if you look at certain subreddits and "Life Hacks" forums you'll find lots of people with heavily customized Android icon themes or deeply complex configuration of iOS shortcuts to aesthetically align everything they want on their home screen.
Sure, it's not earth shattering and the very definition of a nice-to-have that isn't hurting anyone in its absence, but it's also the sort of thing that enough people want to do the hard way that it seems nice to add an easy way to do it, too (and maybe more people will appreciate it than will take the hard way to it).
Not if it's a bad design language.
> just lead to a clunky experience.
So no different than following the bad defaults.
Good defaults are extremely important.
However, I find this a bit irksome:
> The majority of people with vision impairments aren't people who are used to thinking of themselves as disabled and know their way around an a11y menu
That should change, and to the extent that something like this makes a small push in that direction, so much the better. Disability is a spectrum, everyone will deal with disability to some extent in their lifetime, and accessibility features are for everyone.
Separating out "serious" accessibility considerations from those of people with "normal" accessibility needs is just likely to make accessibility features even more second-class. If the main problem is the problem you describe, it should be fixed by better surfacing the existing accessibility features.
I'm reminded of a meme, from around when Facebook had its IPO:
"Why is Facebook going public?
They couldn't figure out their privacy settings either!"
I am really liking this upgrade. There is something appealing about seeing _less_ of the OS and more of your content. I know that sounds like lip service, but the focus here really is content.
For example, the lock screen: before you even unlock your phone, your wallpaper is more visible than before. I've never enjoyed having my wallpaper cycle more than I do with this update.
This is an update people are going to absolutely freaking fall in love with. It's good stuff.
An entity like Apple introduces UI "enhancement" to attract prospective users and persuade existing users that the functionality of the product is new, efficient, or otherwise "good."
UI is generally fashion and trend that seeks the "new" at great cost.
This is why there is a lack of internal consistency or rigor with respect to some UI direction: consistency, functionality, etc. are not the point.
The incentives simply aren't aligned to support a long-term strategy of not constantly messing with your UI.
It's backwards. It's for the UI designers to justify the money spent on them. They can't just sit there and do nothing. Designing is their job! It's the same with every position.
Meanwhile I get regular connection issues on Facetime on my iPhone 15 Pro Max, even though bandwidth isn't a problem here.
I see many angry comments because it's a change without a practical reason, and it's meant to make things more "new" or "fresh" at the cost of CPU and GPU resources. That's a valid complaint since making old devices obsolete is a design choice.
However, it's good to see it from a humane perspective. Fashion trends change because they are associated with identity, novelty, status, self-expression, etc. Companies make fashion changes to appeal to those things. For example, nobody complains if Nike changes a model just for fashion; however, everybody uses the same phone every day, just as they do with a pair of shoes. For us, working on programming or software design, the phone is just a tool, but for most people, the phone is a form of self-expression (like using single or double quotes in code, or tabs vs spaces). And every few years, tech companies undergo a fashion refresh.
So, even if Apple fires all the visual designers and keeps the same design for many years, people will likely grow bored with their UIs, which will push them toward competitors offering more stylish options.
Changing UI layout obviously breaks muscle memory, but even just reskinning the same layout with a new color scheme that changes the relative visual prominence of different UI elements brings usability penalties. It's rare that any UI change is purely beneficial or has no effect on usability. Unless proven otherwise, any UI change should be assumed to impose some usability harm on existing users, and the potential usability benefits of the change need to be weighed against that harm.
Don't pretend that the downsides of messing with an existing UI aren't real.
We are increasingly making everything digital, and it's leaving the older generations, who cannot constantly adapt and relearn as their memory gets worse, in trouble.
Wow, you are barking up the wrong tree.
In my comment, I didn't say that the UI style changes were good or bad for usability. The comment thread had two postures: either these design changes have a business reason, or they are there to justify the designer's salary.
I wanted to highlight a third plausible reason: people crave experiencing something new, and businesses reflect that. For example, I love the classic Mac OS B&W style, but if Apple had maintained the same style from 1984 to the present, many people would perceive it as outdated. (In fact, I remember reading Mac OS 9 reviews pre-Aqua in which many comments highlighted how dated Mac OS looked compared to Windows 95.)
I'm not defending poor style changes or implying that they don't affect usability.
If you consider the model proposed by Donald Norman in Emotional Design, the experience occurs on three levels: visceral, behavioral, and reflective.
The behavioral level is crucial. I'm not minimizing its importance. But all three levels contribute to the user experience.
Companies like Apple, Google, and Microsoft redesign their UI because they want to maintain the perception that their products are modern, which justifies their UI updates.
> UI overhauls need a more substantial justification than mere fashion.
In the evolution of most UIs, many of the UI changes were driven by "fashion".
Most differences in behavior between Windows 3.1 and Windows 11 could have been achieved using the same visual elements of Windows 3.1. For example, Windows 95 moved the "Close" button outside the window control menu to make it easier to click; however, the same change could have been made while keeping the style of Windows 3.1. The switch from the "Program Manager" to the "Start" menu and taskbar could have been implemented with the Windows 3.1 style, without modifying its behavior.
When Apple launched OSX, its core was based on NeXT Step, and the goal was to replace Mac OS 9. They could have chosen to keep the Mac OS 9 style (which they initially did in the beta versions) to avoid disrupting the experience for existing users. The shift to Aqua was a fashion statement, conveying not only that Apple was modern but also highlighting the integration of their hardware and software, as the Aqua style matched the design of the iMac at the time.
I agree with you that disrupting behavior and usability is not good. Many "fashion" changes could be incremental, allowing users to choose whether to retain the old version. But, like it or not, the "visceral" level of the user experience plays an important role.
I don't want my OS to have bell-bottoms and afro one year, a mohawk and ripped jeans the next, and a raver neon top with tracksuits and glow sticks the next.
A basic tee and jeans haven't gone out of style since Marlon Brando and James Dean. Or a good shirt and a jacket. Or, if that's your thing, a Perfecto/Bomber/Motorcycle leather jacket. Or a sundress for women.
> For example, nobody complains if Nike changes a model just for fashion;
That's because shoes are part of clothes fashion. And even so, if established, long available models are changed or removed altogether (e.g. Doc Martens 1460, Converse Chuck Taylors, Timberland classic boot, Levis 501, etc.), these companies would get an earful from customers too.
Besides my OS is not the place for fashion to begin with. Especially when it messes with utility. It can look stylish or even "lickable".
But it absolutely doesn't need to change for fashion's sake.
I wish it were like that, then I could be merely unfashionable, without someone coming into my house at night and replacing the clothing I bought and am used to with whatever is in style next.
It doesn't matter what their justification is for making something "better".
The industry has reached saturation; there no longer exists any justification for making a UI somehow "better" to invite in more users. Changes ONLY create frustration and anger among existing users, which is essentially everyone at this point.
Changing the way a UI operates is like swapping the position of the accelerator and brake pedals in an automobile, moving the windshield wiper controls to the center console and the heater controls to a steering-wheel stalk, and then claiming it is a "New Fresh User Interface!!". Of course people CAN adapt, but they will not like it. And automakers are already discovering how hated moving features from tactile knobs & buttons to a center touchscreen is, and are going back to what people know and like.
It is past time for the software industry to get the message.
https://en.wikipedia.org/wiki/Mac_os#Timeline_of_releases
macOS 13 supports hardware back to 2017.
https://en.wikipedia.org/wiki/MacOS_Ventura#Supported_hardwa...
Looks like only two versions of Windows are supported: the current one is Windows 11, and Windows 10 support ends in October. Of course, Windows 10 was released in 2015, so comparing version support isn't a fair comparison.
If only! I wouldn't install macOS or iOS with that GUI if they paid me for it.
It's time we separate sales from UI progress.
If they want to "attract prospective users and persuade existing users that the functionality of the product is new, efficient, or otherwise "good." they can make substantial changes, of which there are numerous areas currently lacking.
Which is what they did in the first 10 or so years of OS X.
Folks aren't going to be able to simply pound out a bunch of new icons to make their non-Apple-toolkit apps look "native". There's far too much compositing and such going on here.
I doubt Google is going to rush out and try to mimic the L&F either.
This MAY give native iOS apps a leg up over web and other "portable" toolkit apps.
It's such a stupid argument. Also wrong, because the purpose for everything that company does is profit.
(I'm aware this is partly cultural desensitization; I remember the memes from back then of people looking like shrimps staring at their phones, etc.)
So far nothing has been able to replace it (headsets, watches, other wearables), because while different ways of interacting exist, users have always needed a screen for (media) consumption.
An AR device could suddenly tick all the boxes, making a user buy such glasses as a companion device and slowly transition away from their iPhone. So Apple needs to prepare and expand its ecosystem stickiness to AR.
Not saying it's a good bet for Apple or for users, but it seems that's the bet they are making.
That's absolutely obvious. What people are arguing is that this is a terrible move for many reasons. 0.1% of iPhone users have a Vision Pro, and Apple just degraded the experience for all of us.
They do this because there might soon be a disruptive AR product (e.g. some next-gen Meta Ray-Bans) which gets adopted as a companion device by the iPhone userbase and then gradually shifts their usage away from the iPhone to that other product.
So Apple needs to expand their ecosystem with all its stickiness to AR, as this ensures that even if competing AR products are more appealing, they will all be inferior, because only Apple's AR product will let you mirror your whole iOS experience with all your apps and content.
I got my first taste of computers on early 90's Macs and was enchanted. Within a year I discovered DOS, Windows - and freedom. I did tech support for both Mac and Windows computers for several years. It's always been abundantly clear that the Empire of Mac exists for its own glorification (and obscene profits) - customers, developers, partners - they are all in service of Apple. It's unfortunate that Apple does some things with unparalleled quality and maintains a loyal following, because the apple has always been rotten at its core.
Apple keeps good or great things as they are: "Apple is losing its edge, look how innovative the others are."
Apple takes some risks and goes bold on things: "Why change things?"
In the end, everyone else complaining would just follow and copy anyway.
Rinse and repeat, bring the eyeballs and the money to my b*tching site or profile.
Yes, of course, but it looks bold and innovative and designers can waste years tweaking various details across many apps!
> reduces the amount of information displayed on screen, and you’ll have to scroll more as a consequence. ... You’re just injecting white space everywhere.
Sure, but that's a common scourge in all modern design, why would an innovative design company stay behind?
I downloaded the beta and the more I use it the less I like it. The icons are blurry, washed out and look terrible overall. I have a difficult time using the buttons on the lock screen to activate the flashlight and camera. Most of the time, I push them and the lock screen customization screen comes up instead of the flashlight turning on. I don't know if they changed the geometry of the buttons or what, but I can't reliably use them anymore. There are other instances of low contrast text, weird blurry artifacts and janky animations.
I hope these are all things that get worked out during the beta period. Overall, the whole thing looks unimpressive so far. I keep telling myself that OS X had the same kind of jank during its first betas and it will all work out. I want to roll back to iOS 18, but I can't do that without using iTunes, which isn't possible because I only have Linux machines.
I don't know about now, but about 20 years ago iTunes worked under Wine to connect with my iPod and perform backups.
https://support.apple.com/guide/iphone/use-the-camera-contro...
I use the action button on the other side to immediately start recording a video from the lock screen via a shortcut.
Sony Xperias have had a shutter button since the Symbian days.
Typed from an Xperia 5III.
On the iPhone 16, there's also a customizable action button on the other side that can be mapped to all sorts of things.
I believe that's exactly what Apple wants. This new design direction appears to be a strategy to unify all UI for VR as well.
If all controls are designed to be translucent, they (Apple) have freedom to put the control anywhere on the user's field of view on VR and allow "focus on the underlying content" (which in the case of VR, is the real world).
Time will tell if this approach makes sense for 2D screens.
On the other hand, Apple optimized iOS for a phone without unifying with MacOS and was very successful.
Optimizing phones for VR seems a really bad idea.
The correct way to build a tablet OS is to start with a desktop environment and optimize it - including third-party software - for fingers. We see this with iPadOS, which keeps getting hand-me-down features from macOS, implemented almost exactly the same as they are on macOS but with bigger tap targets.
In contrast, Windows 8 saw Microsoft taking the contemporary state of the iPad - single window, everything full screen, etc - and treating this as the future. Hell, I'm surprised they even shipped split-screen on it. They even locked down the app runtime to signed Store apps only[0]. My guess is that management saw dollar signs from how much Apple made from the iOS App Store and thought turning Windows into an "iPad Killer"[1] would replicate the same success.
Ironically, Windows 7 was already built to be a touch-friendly desktop, they just didn't actually finish making it touch-friendly.
[0] Which created a fun bifurcation between widget toolkits in the Microsoft ecosystem that persists to this day.
[1] Any time a company describes a product as a "killer" product, i.e. something intended to outcompete another product, they've already lost.
What MS SHOULD have done is let desktop Windows 8 be a lightly reskinned Windows 7 and only trigger the tablet UI in tablet mode on supported devices. But no, they had to make a bad mouse experience, which soured everyone on 8's UI, forcing them to backpedal.
Had the Surface RT launched at the same price with pen support I think it could have had serious legs as a device for students even with all the limitations. But no, another missed opportunity.
The whole point of having different platforms in the first place is to cater to different needs, contexts, and user experiences. If they could be unified, they wouldn't be different platforms in the first place.
As the years have gone on, it feels like computers are slowly losing every ounce of personality they once had. Software should be delightful to use! Computers felt fun! I'm hoping we eventually get through this minimal/bland era of UI design and come back around to design with a little creativity.
It feels like Apple is trying to subtly introduce the concept of spatial UI for the future, where the norm will be to have controls and content on separate "layers" - but I don't think it should come at a cost of sacrificing the interfaces we already have.
Houses used to be for families; they were often quirky or strange or emergent, with weird layouts or materials. They may have had garish wallpapers or floor-to-ceiling wood panelling. But these touches were reflective of the personalities of the owners. They met the needs of the specific people who inhabited them.
Nowadays, as houses are more of a commodity, they must be generic. All flat white interiors, straight corners, no cornicing or architraves or plasterwork or anything to give the home a unique character. Instead it must be a blank canvas such that any inhabitant can put his own things inside it to make it his.
Computers are the same; what was once a niche product for enthusiasts and businesses has now become an instrumental part of nearly every moment of nearly everyone's lives. Thus they also must be generic and same-y, with limited avenues for superficial customisation, so that they can be interchanged or upgraded without jarring the user against the new version or device.
Personally I prefer radical customisation and quirkiness. I find it charming. But it seems that those who are designing (or perhaps only selling) the things disagree with me.
Not to mention, there are still billions of people needing housing, and with the climate situation we’re already in, building billions of unique homes will make the problem a LOT worse.
Again, I don't really care much about the issue, but I just think it's worthwhile to remind people that the American way of life (which developing nations aspire to) is absolutely untenable as far as all modern and currently-feasible technology is concerned. Maybe we could live with not being expressive just on the outside of our houses specifically?
But many, many more are because people have too much of an eye on resale value, and if your house is different from all the rest, you reduce your buyer pool.
It costs nearly nothing to make kitchen cabinet heights comfortable for the main user; almost nobody does this even on full custom builds.
Comparing this to my own city of Melbourne, Australia: high-density dwellings are generally constrained to innercity suburbs and are still seen as undesirable compared to free-standing homes or semi-detached houses. Councils restrict the development of new high-density or mixed-use buildings for what amounts to NIMBYism. Inadequate public transport in the growth areas of the Northern and Western suburbs increases dependence on roads and freeways.
There are options to support affordable living in cities that don't involve covering our farmland and wildlife reserves with uniform white plaster cubes.
I contend other aspects of your ideas are not bad but need some work.
> The solution here is to build more higher-density housing options.
and
> undesirable compared to free-standing homes or semi-detached houses.
Any good idea for housing won't please everyone. In this case, when you see anything about the rich and famous, are they likely to live in "high density" the way developers think of it?
Space is desirable. Space you control (rent vs. own... another can of worms) even more so! High-density housing may help - any bloody action at all would be nice - but it isn't what people desire.
As for "covering our farmland and wildlife reserves"... Australia is a huge country with a comparatively tiny population as yet. There is a looooong way to go before a significant area of the country is covered. However, I would argue that we shouldn't try to have a continuously expanding population - which would also help with housing costs.
I have mixed feelings on "NIMBYism" too. On the one hand, we need solutions for people. On the other hand, the general idea of "people chasing happiness" means they should be free to oppose actions too. You can characterise it as a class battle of the rich opposing solutions to homelessness, but each such situation is usually not clear-cut, often muddied by developer profiteering too.
To throw another idea in there: why is it that all the infrastructure monies are being spent in our capital cities? We have a crap ton of towns in the countryside, many of which are dying or barely holding steady. Why can't they grow at a similar % to Melbourne? Where are the jobs there? After COVID they got a shot in the arm, but it wasn't sustained.
There are an awful lot of exceptionally wealthy people living in buildings in Manhattan with hundreds of apartments. Their apartments themselves are larger than average, but given how much they cost per square foot there’s clearly a lot of demand to live in that environment.
Basically, money = space. In the city, you need more money. In the suburbs you need less. There also other concerns like commute and facilities but that varies person to person.
For many people, the tradeoff to live in the suburb is the right decision because the other factors don't matter so much and so to get more space for their $ they choose suburb.
Does that mean high-density housing is bad? Absolutely not! If there are people who want to live in X space for Y money, then go for it. But that applies to suburbs too. Once you involve money there are developers/builders and rent/own issues, but my general take is that higher-density buildings are impeded by rules and regulations more than by a lack of demand. I have nothing to really back that up, though.
I thought being early to the low-birth-rate party, culturally valuing new construction more than "old bones" or whatever (preventing sitting on real estate), and a low-growth economy over the last ~100 years were much more relevant contributing factors than the type of construction they've prioritized.
I suppose this is a big point. I used to spend hours... days really... setting up a new PC. Partly because it would take ages just to get everything off the various floppy disks and CD-ROMs and installed onto the HDD, but also because everything was quirky.
Nowadays I hew to the default install of Ubuntu (or Windows + WSL2), and replacing my device (or SSD) or upgrading the OS is basically a seamless experience. I have some .bashrc/git config/etc. stuff I can grab quickly and then I'm basically good to go.
Fascinating observation! Speculation of course, but I think you nailed it, and this really helps me understand why they've done this
Not mentioned is the peculiar response by the Apple ecosystem pundits. They were oddly supportive of these maladaptive design system changes on the basis of excitement (they can’t afford to be left out of the media momentum…). I believe their collusive tendencies mask what would otherwise be perceived as a major flop or severe error.
Also not mentioned in the piece, I remain curious as to the internal Apple team/culture changes that resulted in this design system failure. What on earth happened?
On MacOS, though, I really do worry it's going to take several iterations before things make a lot of sense. God forbid this new UI layer hurts performance, too, on these exceptionally fast machines.
Good news is things are still in beta. Some ideas can always be walked back.
Gentle reminder that the average Mac user is nontechnical, especially among the younger generation. IMO this is Apple slapping a formalized coat of paint on the simplification of iOS 7, and doubling down on iconography (not app icons).
Now all of that said — there are major usability concerns in iOS 26 and the liquid glass design language. The file picker’s previous “Done” has been replaced with a single checkmark. Significant meaning is lost in a few places, and there are super-odd double-X icons (in mobile safari while entering a URL, for example). Safari’s new tab button is now out of reach, while the previous new tab gesture is now new tab group. Context menus expand now, instead of swipe, meaning what used to feel more natural now takes extra taps/muscle memory updates.
That all said, as iOS 7 improved over time and was nailed down, so will this.
To me it's given the iPhone - an incredibly boring platform/device 18 years in - new life. The new Lock Screen Photo Shuffle is incredibly personal and downright beautiful. From my understanding, a lot of work went into pre-composing app icons, many of which are objectively beautiful.
I’ve found users can’t find buttons “under the thumb” so I’m curious how the dedicated tab bar search will work in practice.
Overall I think for the goals of liquid glass as a design system, it’s something only Apple could do - in a good way.
All of these things people keep telling me are "worse" are often things I thought were poorly designed in the first place. Being able to find controls easily seems to be the biggest complaint, for instance the buttons in Safari. I've always thought the buttons in Safari were unintuitive. This actually feels better to me.
This idea that things are harder to find because of the visual changes, that goes away in like a day. Just like every major visual upgrade before this your eyes train to it _extremely_ quickly.
I do feel like they "borrowed" the shape of text boxes from Google, though, with the rounded edges. I was never really a fan of that shape.
Apple has consistently balanced beauty and function, from their hardware to stores to product packaging. They must have an army of experienced UX designers and well thought out user testing processes. Given all that, how in the world did this liquid glass idea even get past the preliminary mockup stage? Were/are they simply betting that the cool factor of the glass effect will outweigh the usability issues? And that Windows Vista’s glass UI went out of style simply because it wasn’t realistic enough?
They do, which means they have an army's worth of officers all trying to jockey for influence and power and position with little pet projects and "suggestions" and so on.
Apple's best years in UX were well before the army arrived, when they had a single man with great taste ultimately green-lighting every decision.
Apple's air of "design excellence" is stale, and is purely marketing now, i.e., a lie.
Couldn't agree more. Leopard, Snow Leopard, Windows Vista, and 7 were by far the best-designed OSs to date.
Hiring became run by recruiters and HR, who filter for surface over substance.
The core issue is that Apple et al. have their UX/UI in-house, which means on the payroll, which means they have to produce something new to justify their continued employment. Unlike devs, who always have some bugs to fix or some old cruft to refactor, those opportunities are rare for UX/UI, mostly because you'd also need to employ proper support teams that aggregate customer complaints and improvement ideas (plus the legal FUD around "copyright on suggestions": LWT famously bans their employees from even reading r/LastWeekTonight to avoid being held liable if they use an idea someone suggested there) and derive improvements from them.
It's either "boom (every 3-4 years) or bust".
If you are writing all your apps in SwiftUI, then that's probably OK. However, the vast majority of apps use UIKit (or use a library that is ultimately based on UIKit), with AutoLayout. In fact, I'll bet that most of the really big apps are still ObjC/NonAutoLayout.
AutoLayout/UIKit is a big fat pain in the ass, but it works extremely well. It has had over a decade, to buff out the rough spots and corner cases. I can't think of a single UI conundrum that I've had, in the last ten years, that I couldn't figure out how to address with AL. With SwiftUI, I'm constantly running into "You can't get there, from here."
Most (probably all, but I have a couple of old ones) of my apps work well in any display. That means resized (usually narrow vertical) iPad windows, rotated iPhones (where a lot of the UI scrolls off the bottom, and you need to scroll for it), and resized Mac windows.
With SwiftUI, you tend to get support for that pretty much automatically, but with AL, you usually need to specifically design for it, so when the phone gets rotated, things flow into the right places, etc.
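To make that concrete, here's a minimal UIKit sketch (the view controller and constraint arrays are hypothetical, purely illustrative) of the kind of explicit adaptation AL requires on rotation, which SwiftUI mostly gives you for free:

    import UIKit

    class AdaptiveViewController: UIViewController {
        // Two pre-built constraint sets, one per orientation (illustrative).
        var portraitConstraints: [NSLayoutConstraint] = []
        var landscapeConstraints: [NSLayoutConstraint] = []

        override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
            super.traitCollectionDidChange(previousTraitCollection)
            // A compact vertical size class is a reasonable proxy for a rotated iPhone.
            let isLandscape = traitCollection.verticalSizeClass == .compact
            NSLayoutConstraint.deactivate(isLandscape ? portraitConstraints : landscapeConstraints)
            NSLayoutConstraint.activate(isLandscape ? landscapeConstraints : portraitConstraints)
        }
    }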
Adding all the LG borders around things like tab bar items and navbar items changes the layout, sometimes fairly substantially.
Right now, I'm just tossing a UIDesignRequiresCompatibility into the Info.plist, but that feels like a kludge, and I'll need to adapt, but that may take a while.
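For anyone looking for it, that kludge is just a Boolean in the Info.plist source (my understanding is that Apple treats it as a temporary compatibility escape hatch, so don't count on it sticking around):

    <!-- Opt-out key named above; reportedly honored only temporarily. -->
    <key>UIDesignRequiresCompatibility</key>
    <true/>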
> Ensure that you clearly separate your content from navigation elements, like tab bars and sidebars, to establish a distinct functional layer above the content layer.
This makes perfect sense. Separate content and controls. In this variant, controls go on top, like a toolbar, and then there’s a horizontal division (maybe an actual line, maybe just blank space), then content.
Then I looked at the pictures again and did a double take. Apple doesn’t have a clear separation at all. And the controls are at the bottom, not the top.
The punchline, of course, is that Apple transposed the Y and Z axes! The functional layer isn’t toward the top of the screen — it’s toward the user! You are supposed to separate content and controls along X or Y so that content is next to controls but separated a bit but, for some reason, Apple is floating controls over content, in the Z axis. The separation is a plane, parallel to the screen, that you can’t see because the controls are in the way.
Apple, turn off your fancy shaders and go back to the drawing board. You have so many more pixels than any 90s designer, and yet you can fit almost no unobscured content on the screen. And your poor users can’t see the divisions between content and everything else because you built the technology to rotate the whole UI 90 degrees about the X axis!
This is something Apple has been consistently great at. I do not want a 3rd Party App to look and feel massively different from a first party app. In fact I'd like all apps to look and feel very uniform. Whenever some third rate designer thinks their design is somehow better than the entire rest of the systems design you get ugly mismatches and clashing styles. Even if the 3rd Party Design IS superior, it won't look any good being completely out of place.
> Apple Design-Guide: "Ensure that you clearly separate your content from navigation elements [..]"
Honest auto-complete:
"[..] the OS will then use the GPU to draw all attention away from the content to the navigation elements"