(Perhaps charismatic leaders neglect it a bit on purpose because they enjoy being portrayed as irreplaceable.)
The people in power don't have a burning yes.
First, I quickly tap on the first button that has the picture of the credit card and its name. As a result I find myself in a menu that shows me the billing address (go figure)! So, I have to click back, and use the button below that simply states “Change the credit card” or something to that effect.
Why, for the love of god, does Apple use the picture of the credit card for the billing-address info? And why is the billing address even the first option!?
So, multiple clicks where proper design could have avoided them (I think in the past the picture button was the one that changed credit cards, but I don't know if I am misremembering).
Every time it happens I think about Steve Jobs.
However, it seems that under Tim Cook, Apple has gradually lost many of its traditional values when it comes to usability and UI/UX perfectionism. I suspect that the company has not passed on "The Apple Way" to people who joined the company after Steve Jobs' passing. Not only that, there doesn't seem to be an "Apple Way" anymore.
Come to think of it, the old Apple had figures like Bruce Tognazzini who wrote about "The Apple Way"; I have a copy of Tog on Interface that distills many of the UI/UX principles of the classic Mac. I can't think of any figures like Tog in the modern era.
Gradually the Apple software ecosystem is losing its distinctiveness in a world filled with janky software. It's still better than Windows to me, but I'd be happier with Snow Leopard with a modern Web browser and security updates.
It's sad; the classic Mac and Jobs-era Mac OS X were wonderful platforms with rich ecosystems of software that conformed to the Apple Human Interface Guidelines of those eras. I wish a new company or a community open-source project would pick up from where Apple left off when Jobs passed away.
There's the Hello System[0]... not sure if it counts.
Instead, they should have stayed on the Straight and Narrow of Quality - where they were for many years - where you move up to computing paradise by having fewer features but more time spent perfecting them.
We see what you did there!
Yep. The attention to detail is still there; it has just shifted from polishing and curating details to creating a lot of small, unpolished, uncalled-for, and thus very annoying details. From an MBA's POV there isn't much difference, and the latter even produces better KPIs.
Either everyone is worried about the consequences of failing to produce high quality work (including at the VP level, given they can allocate additional time/resources for feature baking) or optimizing whatever OKR/KPI the CEO is on about this quarter becomes a more reliable career move.
And once that happens (spiced with scale), the company is lost in the Forest of Trying to Design Effective OKRs.
I generally see complaints about advancement aimed at the hardware. Some are unreasonable standards, some are backlash to the idea of continuing to buy a new iPhone every year or two as the differences shrink, but either way software feature spam is a bad response.
That's the bed they made for themselves, and they lie in it willingly.
No one is forcing them to do huge yearly releases. No one is forcing them to do yearly releases. No one is forcing them to tie new features which are all software anyway to yearly releases (and in recent years actual features are shipping later and later after the announcement, so they are not really tied to releases either anymore).
The stock market can easily be taught anything. And Jobs didn't even care about the stock market, or stockholders (Apple famously didn't even pay dividends for a very long time), or investors (regularly ignoring any and all calls and advice from the largest investors).
You need political will and taste to say a thousand nos to every yes. None of the senior citizens in charge of Apple have that.
They could easily wait longer between releasing devices. An M1 MacBook is still, in 2025, a massive upgrade for anybody switching from a PC - five years after release.
If Apple included fully fledged apps for photo editing and video editing, and maybe small business tools like invoicing, there would be no reason for any consumer in any segment to purchase anything other than a Mac.
They could, but then they wouldn't be a trillion dollar company. They'd be a mere $800bn company, at best. ;)
Not many consumers go out to buy an Apple device because the new one has been released. They go out to buy a new phone or new computer because their old one gave out and will just take the Apple device that is for sale.
That's also why Apple bothers to do the silent little spec-bump releases: it gives Business Leasing corporate buyers a new SKU to use to justify staying on the upgrade treadmill for their 10k devices for another cycle (rather than holding off for even a single cycle because "it's the same SKU.")
1. They've stopped starting small and instead started unrealistically large. Apple Intelligence is a great recent example.
2. They've stopped iterating with small improvements and features, and instead decided that "iterating" just means "pile on more features and change things".
I mean, some people are just impossible to please!
https://media.ycharts.com/charts/441687ba735392d10a1a8058120...
“The companies forget how to make great products. The product sensibility and product genius that brought them to this monopolistic position gets rotted out by people running these companies who have no conception of a good product vs. a bad product. They have no conception of the craftsmanship that’s required to take a good idea and turn it into a good product. And they really have no feeling in their hearts about wanting to help the customers.”
- Steve Jobs - https://en.wikipedia.org/wiki/Steve_Jobs:_The_Lost_Interview
That said, Jobs lived through Apple's transformation, but not its peak phase, when Apple was simply printing money year after year after year. I do wonder whether Jobs in 2016 would have been able to keep the organization performing at such a high caliber.
Even he seemed to make unforced errors at times, like the "you're holding it wrong" fiasco, but it's hard to say, since he didn't live through Apple 2013-2019, when it became an ever-increasing money-printing machine.
In the age of AI, COVID-19, etc., I wonder how Jobs would have handled things post-2020.
When I interviewed at a smaller company, someone high up interviewed me last. I passed everything on paper afaik, but he didn't think I was the right person for some reason. Which is fine for a small company.
I was in a (tech) meetup last week. We meet regularly, we are somewhere between acquaintances and friends. One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
I believe that does more good than harm, even to the person it is being directed to. It is a nice covenant to have, "we'll call you on your bs whenever you bring it in", that's what a good friend would do. Embracing high standards in a community makes everyone in it better.
The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
I wish he'd bless a certain Linux distro for PCs so we can have some default. Current default is kinda Ubuntu, but they've made some weird decisions in the past. Seems like he'd make reasonable choices and not freak out over pointless differences like systemd.
You can tell someone their idea is substandard without implying they're stupid, which is generally taken to be an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
I've had plenty of tough conversations with colleagues where it was paramount to filter through ideas and determine which were viable. Not once did anyone have to punch at someone's intelligence to make the point. Even a simple "That's a bad idea" is better than that.
>whenever the situation warrants it
Which will of course be up to interpretation by just about everyone. That's the problem with so-called "honest"[0] conversation. By using better language you can avoid this problem entirely without demeaning someone. Communication is a skill that can be learned.
>The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
Linus took a sabbatical in 2018 to work on his communication and lack of emotional empathy. He had to make changes, or he absolutely risked losing the respect of his peers and of others he respected, and he has worked on improving how he communicates.
To follow Linus as an example would be to work on communication and emotional empathy, not to disregard your peers.
[0]: Most often, I find people who are adamant about this line of thinking tend to want an excuse to be rude without accountability.
In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap, not much usually gets done. This is how you get design-by-committee lowest-common-denominator slop.
And even if you don't agree with what I'm saying here, "avoid criticising people" quickly turns into "avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket and the product would be greatly improved.
I prefer the former by a lot, but of course you're free to spend your time in the latter.
What's wrong with calling an idea stupid? A smart person can have stupid ideas. (Or, more trivially, the person delivering a stupid idea might just be a messenger, rather than the person who originally thought of the idea.)
Though, to be clear, saying that an idea is stupid does carry the implication that someone who often thinks of such ideas is, themselves, likely to be stupid. An idea is not itself a mind that can have (a lack of) intelligence; so "that's stupid" does stand for a longer thought — something like "that is the sort of idea that only a stupid person would think of."
But saying that an idea is stupid does not carry the implication that someone is stupid just for providing that one idea. Any more than calling something you do "rude" when you fail to observe some kind of common etiquette of the society you grew up in, implies that you are yourself a "rude person". One is a one-time judgement of an action; the other is a judgement of a persistent trait. The action-judgements can add up as inductive evidence of the persistent trait; but a single action-judgement does not a trait-judgement make.
---
A philosophical tangent:
But what both of those things do — calling an idea stupid, or an action rude — is to attach a certain amount of social approbation or shame to the action/idea, beyond just the amount you'd feel when you hear all the objective reasons the action/idea is bad. Where the intended response to that "communication of shame" is for the shame to be internalized, and to backpropagate and downweight whatever thinking process produced the action/idea within the person. It's intended as a lever for social operant conditioning.
Now, that being said, some people externalize blame — i.e. they experience "shaming messaging" not by feeling shame, but by feeling enraged that someone would attempt to shame them. The social-operant-conditioning lever of shame does not work on these people. Insofar as such people exist in a group, this destabilizes the usefulness of shame as a tool in such a group.
(A personal hypothesis I have is that internalization of blame is something that largely correlates with a belief in an objective morality — and especially, an objective morality that can potentially be better-known/understood by others than oneself. And therefore, as Western society has become decreasingly religious, shame as a social tool has "burned out" in how reliably it can be employed in Western society in arbitrary social contexts. Yet Western society has not adapted fully to this shift yet; which is why so many institutions that expect shame to "work" as a tool — e.g. the democratic system, re: motivating people to vote; or e.g. the school system, re: bullying — are crashing and burning.)
That’s a great way to make the other person defensive and ensure everything stays the same or worse. It’s not that difficult to be tactful in communication in a way which allows you to get your point across in the same number of words and makes the other person thankful for the correction.
Plus, it saves face. It’s not that rare for someone who blatantly says something is stupid to then be proven wrong. If you’re polite and reasonable about it, when you are wrong it won’t be a big deal.
One thing I noticed about people who pride themselves in being “brutally honest” is that more often than not they get more satisfaction from being brutal than from being honest, and are incredibly thin-skinned when the “honest brutality” is directed at them.
> The Linux kernel would be absolutely trash if Linus were not allowed to be Linus.
I don’t understand why people keep using Torvalds as an example/excuse to be rude. Linus realised he had been a jerk all those years and that that was the wrong attitude. He apologised and vowed to do better, and the sky hasn’t fallen nor has Linux turned to garbage.
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
This is a huge misunderstanding at best and a malicious re-framing of serious issues within portions of the tech industry at worst.
In what context is virtually everyone pushing to hire demonstrably unqualified people?
I'm sceptical. I've never seen what you describe outside of toxic "culture war clickbait videos"; what I have seen is nepotism, class privilege, and sprint culture pushed by investors - you know, the exact opposite of what you describe.
If you take a hardline attitude on keeping the gates up, you're just going to end up with a monoculture that stagnates.
Sure, they lack wisdom, but that doesn't mean they aren't smart, it just means they're young.
Gatekeeping doesn't have to mean "Don't hire anyone under 35" it means "Don't hire people who are bozos" and "don't hire people who don't give a shit"
If Apple was made up of only top-end engineers led by a quality-obsessed maniac, would they put out better or worse products?
Of course, not everyone can follow this philosophy, but they don't have to, and most don't want to anyway.
My takeaway is that diversity at a global level, and in some specific contexts, is a great thing. But diversity in some other specific contexts is entirely destructive and analogous to rot or decomposition.
When we rely on a core societal function (firefighting, accounting, waterworks maintenance, property rights, etc.) the people responsible for maintaining these functions need to maintain in themselves a set of core characteristics (values as patterns of action), and there is room to play outside of those cores, but those cores shouldn't be jeopardized as a tradeoff for diversity and inclusion.
For example, if the constructive core values of a railroad system are consistency and reliability, then these shouldn't be diminished in the name of diversity and inclusion; but if diversity and inclusion can be achieved secondarily without a tradeoff (or even in a way that somehow further amplifies the core values), then it is constructive. One has to thoughtfully weigh the tradeoffs in each context, and ensure that the values most important for maintaining the relevant function are treated as most important. The universe seems to favor pragmatism over ideology, at least in the long run.
So in a company, if the core values that make it successful are diluted in exchange for diversity, it's no longer what it was, and it might not be able to keep doing what it did. That said, it also might have gained something else. One thing diversity tends to offer huge complex systems is stability, especially when it's incorporated alongside other values and not held up singularly.
In other words, my take on diversity (and by extension, inclusion) is that we need a diversity of diversity. Sometimes a lot of diversity is best, and sometimes very little diversity is best.
I do not for the life of me understand your point. Gatekeeping, as it's most commonly used, means controlling access to something (be it a resource, information, etc.) to deliberately and negatively affect others who are not part of a "blessed" group. It's not objective, and it certainly is not a practice reliant on merit. It's an artificial constraint applied selectively at the whim of the gatekeeper(s).
>There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
The first assertion and the second one are not related. Being welcoming to everyone is not the same thing as holding people to different standards. Company culture sets company inertia, how employees are incentivized to behave, and what they care about. You can have the most brilliant engineers in the world (Google most certainly has its fair share) and, as we have seen, with the wrong incentives it doesn't matter. Look at Google's chat offerings, the Google Graveyard, many of their policies becoming hostile to users as time goes on, etc.
Yet you can have a company with what you may deem "average quality" that excels at its business goals because it has oriented its culture to do so. I don't think Mailchimp was ever lauded for its engineering talent like Google has been, for example, but they dominated their marketplace and built a really successful company culture, at least before the Intuit acquisition.
Another example is that a hobby I loved is now dead to me for lack of gatekeeping; Magic the Gathering. Wizards of the Coast started putting out products that were not for their core playerbase, and when players complained, were told "these products are not for you; but you should accept that because there's no harm in making products for different groups of people". That seems fair enough on its face. Fast forward a couple of years, and Magic's core playerbase has been completely discarded. Now Magic simply whores itself out to third party IPs; this year we'll get or have gotten Final Fantasy, Spiderman, Spongebob Squarepants, and Teenage Mutant Ninja Turtles card sets. They've found it more lucrative in the short-term to tap into the millions of fans of other media franchises while ditching the fanbase that had played Magic for 30 years. "This product is not for you" very rapidly became "this game is not for you", which is pretty unpleasant for people who've been playing it for most or all of their lives.
Also, it became the best selling set of all time even before it was out. Which isn’t an indicator of quality, for sure, but it does show Wizards understands something about their market.
It used to be hard and a liability to be a nerd
I'm pretty sure this would also render the dot-com bubble the nerds' fault?
Let's not go back to how nerd culture used to be regarding diversity... or lack thereof.
I remember when Bill Gates was on magazine covers, viewed as a genius, a wonderful philanthropist, even spoofed in Animaniacs as "Bill Greats."
I guess my point is, "It used to be hard and a liability to be a nerd" was never true, and is nothing but industry cope. The good old days were just smaller, more homogenous, had more mutually-shared good old toxicity and misogyny (to levels that would probably get permabans here within minutes; there's been a lot of collective memory-holing on that), combined with greater idolization of tech billionaires.
What changed in 2010?
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
—Steve Jobs
Successful publicly traded companies have a responsibility to generate more revenue and increase the stock price every year. Year after year. Once their product is mature after so many years, there aren't new variations to release or new markets to enter into.
Sales stagnate and costs stagnate; investors get upset. Only way to get that continual growth is to increase prices and slash costs.
When done responsibly, it's just good business.
The problem comes in next year when you have to do it again. And again. Then the year after you have to do it again. And again.
As with all things in life, all companies eventually die.
I don't use a Mac anymore, but I do use an iPhone. This is the worst version of iOS I can recall. Everything is low contrast and more difficult to see. The colors look washed out. The icons look blurry. In my opinion, Liquid Glass is a total bust. I don't know what these people are thinking. Times have certainly changed.
OP was talking about design languages
I think we're stuck with the notch forever on iPhones. Even if Apple uses an on-screen fingerprint reader in the future like a billion phones already do, they're not going to go back from the face scanner. The only thing that will work is if the face scanner can read from behind the display.
Maybe it's because I use dark mode? I can only tell it's there if I move my mouse under it.
Here's a "workaround" that might help [1]. It entirely excludes the notch area from use.
Got the idea from the Top Notch app, which no longer seems to work: https://topnotch.app/
Go to Settings > Displays. In the list of resolutions, enable “Show all resolutions”; then you can select one that will hide the notch.
Ironically, I don't really mind the new design language; whatever, if only the damned thing worked.
I believe 2026 will finally be the year of Linux desktop.
Everything seems to be lazily done now - by that I mean, a modal pops up and then resizes to fit the content. Never seen this before.
Or, you open settings (settings!) and it's not ready to use until a full second later because things need to pop in and shift.
And it's animated, with animation time, so you just have to wait for the transitions to finish.
And "reduce motion" removes visual feedback of moving things (e.g. closing apps) so I find it entirely unusable.
And as others have noted, the performance is completely unacceptable. I have a 16 Pro and things are slow... And forget "low battery mode" - it's now awful.
I'm not doing anything weird and keep like all apps closed and things off when I don't use them and battery life is significantly worse. (Noticed the same on M4 + Tahoe, upgraded at the same time)
Very disappointed and I very much regret upgrading.
In this case the inefficiency was attention to detail but in other companies it might be something else.
For several years, there's been an issue with audio message recording in iMessage. Prior to iOS 26, it would silently fail; the recording would "begin" but no audio would be captured. This would happen 3, 4, even 5 times in a row before it would actually record audio.
Apple is clearly aware of the issue, because in iOS 26 the failure is no longer silent. Now, you'll get feedback that "Recording isn't available right now". Yet the incidence is...exactly the same. You might have to try 5 times before you're actually able to record a message.
It's simply infuriating, and it makes no sense to a user why this would be happening. Apple owns the entire stack, down to the hardware. Just fix the fucking bug!
There's little problems that keep accumulating, like the camera app opening up and only showing black until restarting it, at which point I've missed the candid opportunity.
I'm not going anywhere, it's still the right mix of just-works across their ecosystem for me, but dang, the focus does feel different, and it's not about our experience using Apple.
[1] https://discussions.apple.com/thread/256140468?sortBy=rank
Also, I have the iPhone 15 Pro (iOS 26.0.1), never had the black screen on camera open yet. That's the kinda thing I'd get on Android.
DHH was someone I kinda read online, but he's been going all in on these racist talking points, e.g., https://paulbjensen.co.uk/2025/09/17/on-dhhs-as-i-remember-l... :(
Judging by the Omarchy presentation video, it feels too keyboard-oriented. Hotkeys for everything. And hotkeys for AI agents? It is opinionated indeed. Not my cup of tea.
I feel like that loses a majority of people right there. I like the option to do common things with the keyboard, or to configure things with a file. But for things I don't do often, holding a bunch of keyboard shortcuts in my head feels like a waste.
I'm not sure about anyone else, but I can't run whatever Linux distro I want at work. When an OS relies on muscle memory to get smooth and fluid operation, it seems like that would make it extra hard to jump between for work vs home. I spent years jumping between OS X and Windows, and I found it's much nicer now that I'm on the same OS for work and home. Even the little things, like using a different notes app at home vs work do trip me up a little, where I'll get shortcuts wrong, or simply not invest in them, because of the switching issue. Omarchy feels like it would be that situation on steroids.
You outgrew this myth, congratulations!
> Look, I've got nothing but respect for the perfectly lovely humans who work at Apple. Several are classmates from university, or people I had the pleasure of working with before at different companies. But I rather suspect what's happened here is that some project manager ... convince Tim
But haven't outgrown this one yet, well, maybe in another 8 years...
Luckily Safari's Reader Mode didn't bug out
https://media.nngroup.com/media/editor/2025/10/06/1-messages...
It's fascinating to me because that's the single thing which every user goes through. It's the main branch, not some obscure edge case. How do you do testing and still miss that?
There was one that was about all the annoying security pop-ups Windows (used to?) have. (FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230 .)
Lately I've gotten so many of these popups on Mac that it both annoys and amuses the hell out of me. "Die a hero or live long enough to see yourself become the villain", I guess.
But, man, Apple hardware still rocks. Can't deny that.
Ah yes, the Jony Ive era of "no ports on MacBooks except USB-C, and hope you like touchbars!" was fantastic. Not to mention how heavy the damn things are. Oh, and the sharp edges of the case where my palms rest. And the chiclet keyboards with .0001 mm of key travel. I'll take a carbon fiber ThinkPad with an OLED display any day of the week, thank you. MacBooks feel like user-hostile devices and are the epitome of form over function.
It was also a lot worse for me when plugged into outlets in an old house in Mexico, especially when my bare feet were touching the terracotta floor tiles; it's not really an issue in a recently re-wired house in California with a wood floor, using the same laptops, power strips, etc.
If you are having this issue and you currently plug a 2-pronged plug into a grounded outlet, try using Apple's 3-pronged plug instead, and I expect it would go away. If you don't have grounded outlets, then that's a bit more complicated to solve.
I've often wondered why I can tell by touch whether a device is charging or not from the slight "vibration" sensation I get when gently touching the case.
It's often noticeable if you have a point contact of metal against your skin; sharp edge / screw / speaker grill, etc. Once you have decent coupling between your body and the laptop, you won't feel the tingle / zap.
They're called Y-caps if you want to delve deeper into them and their use in power supplies.
[1]: https://www.apple.com/it/shop/product/mw2n3ci/a/prolunga-per...
[2]: https://www.apple.com/fr/shop/product/mw2n3z/a/câble-d’exten...
What I do mind is that there's only 3 of them.
The problem with the 2 USB-C ports on modern PC laptops is that one of them pretty much has to be reserved for the charger, whereas the MBP has a MagSafe port that you can charge with instead. So it really only feels like you have one USB-C port, and the other ports are just there as a consolation. That might work out to roughly equal, but I don't think it leaves the Mac worse off. I don't hate the dongles so much though.
It wouldn't have hurt to have some USB-A and HDMI on the MBP--the Minis can pull it off, so clearly the hardware is capable--but more (Thunderbolt) USB-C would still be the best option IMO. USB-A (definitely) and HDMI (probably) will eventually be relics someday, even if they are here for a little while longer.
And even at their worst they were still much better than any Windows laptops, if only for the touchpad. I have yet to use a Windows laptop with a touchpad even close to the trackpads that Apple had 15 years ago. And the build quality and styling are still unbeaten. Do Windows laptop makers still put shitty stickers all over them?
They really dodged a bullet there. 2016-2020 Apple laptop hardware definitely didn't rock. It's good they did an about-face on some of those bad ideas.
You can’t get more brain-dead than taking away important screen real estate and then making the argument that you get more real estate because it’s now all tucked into a corner.
God forbid there be a black strip on the sides of the screen. How did we ever live?!??
FWIW, I think the Touchbar was close to being a good idea, it was just missing haptics.
It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage 6-month-old charging cable, with none of that extraneous rubber next to the connector, catches fire along with your office. What a time to be alive!
The best part is the motherboard, produced in a way that makes it fail due to moisture in a couple of years: all the uncoated copper, the 0.1mm-pitch debugging ports that short-circuit due to a single hair, and a whole Louis Rossmann YouTube channel's worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years. How would you otherwise be able to change the whole laptop without all the walls around repair manuals and parts? You just absolutely have to love the fact that even transplanting chips from other laptops won't help, due to all the overlapping hardware DRMs.
I'll go plug the cable into the bottom of my wireless Apple mouse, and remind myself of all the best times I had with Apple's hardware. It really rocks.
And of course, just had to bring up the whole mouse charger thing. Back when Apple updated their mouse once and replaced the AA compartment with a battery+port block in the same spot to reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face for whatever reason sounds the most outrageous.
“Legendary attention to detail”
Indeed, it is pretty open-and-shut.
[1] https://techpp.com/2011/04/19/mobee-magic-charger-for-magic-...
Apple have a couple of extra mechanisms in place to remind us to buy a new device:
- On iOS the updates are so large they don't fit on the device. This is because they purposely put a small hard drive in. It serves a second purpose - people will buy Apple cloud storage because nothing fits locally.
- No longer providing updates to the device after just a few years when it's still perfectly fine. Then forcing the app developer ecosystem to target the newer iOS version and not support the older versions. But it's not planned obsolescence when it's Apple, because they're the good guys, right? They did that 1984 ad. Right guys?
This is a weird one to complain about because Apple leads the industry in providing software updates. iOS 26 supports devices back to 2019. And they just released a security update for the iPhone 6S, a model released a full decade ago, last month.
The oldest Samsung flagship you can get Android 16 for is their 2023 model (Galaxy S23), and for Google the oldest is the 2021 model (Pixel 6).
Annoying popups on MacOS look like the 1999 remake of the modal dialogs from the 1984 Mac, I guess with some concessions to liquid glass.
Funny that a lot of people seem to have different Liquid Glass experiences; are we being feature-flagged? I don't see the massive disruption to icons that the author sees, but it does seem to me that certain icons have been drained of all their contrast and just look bleh now, particularly the Settings icon on my iPhone. I don't see a bold design based on transparency; I just see that the edges of things look like they've been anti-antialiased now. It's like somebody did some random vandalization of the UI without any rhyme or reason. It's not catastrophic, but it's no improvement.
All this wank to waste the power of faster and faster chips.
I teach C++ programming classes as part of my job as a professor. I have a work-issued MacBook Pro, and I make heavy use of Terminal.app. One of the things that annoy me is always having to click on a dialog box whenever I recompile my code and use lldb for debugging. Why should I need to click on a dialog to grant permission to lldb to debug my own program?
It wasn't always like this on the Mac. I had a Core Duo MacBook that ran Tiger (later Leopard and Snow Leopard) that I completed my undergraduate computer science assignments on, including a Unix systems programming course where I wrote a small multi-threaded web server in C. Mac OS X used to respect the user and get out of the way. It was Windows that bothered me with nagging.
Sadly, over the years the Mac has become more annoying. Notarization, notifications to upgrade, the annoying dialog whenever I run a program under lldb....
Get the fuck out of my way and let me use what is supposedly my computer.
They become more shitware and Microsoft like with every update.
Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?
> There was one that was about all the annoying security pop-ups Windows (used to?) have.
Those have been relentlessly mocked on the Mac for years. I remember a point where several articles were written making that exact comparison. People have been calling it “macOS Vista” since before Apple Silicon was a thing.
> FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230
A bit better quality: https://www.youtube.com/watch?v=VuqZ8AqmLPY
I've been very patient with iOS 26. I tell myself - so long as its foundation is right, they'll iron out these details. But it is properly bad often and at times extremely frustrating.
This makes me extra sad. The HW is very good and very expensive, but the SW is mediocre. I bought an iPhone 16 a few months ago and I swear that is the first and last iPhone I'll purchase. I'd happily sell it at half the price if someone local wants it.
lol
Apple is burning their remaining goodwill among longtime customers, myself included. It's sad to see. Next WWDC, they need to be incredibly transparent about how they plan to fix these issues and get their house in order. If they aren't capable of accepting feedback after this public excoriation, I don't have high hopes for their future.
I’m switching to android because why not? I mean, I have to install Google maps anyway because Apple Maps is horrible. But the UI on 26 is way worse than a pixel experience in my opinion. Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
I've been working on Apple computers since 1996 and started off as a computer support person. Now it pains me to help people with their computers because everything is siloed and proprietary and just makes no sense.
And I mean, I’m also annoyed that their voice to text is just horrible. I can’t even tell you how many mistakes I’ve had to correct in this comment alone.
On the iPhone swipe keyboard, something that feels like a random generator replaces not only the word you swipe but also the word before it, and in two-thirds of cases with random nonsense pairs.
And you can't turn it off without turning off the similar word proposals you definitely want.
It's a strange design decision and I think the implementation is not up to the task.
I'm not staying because I like it, but because I dislike the other options more.
The one reason to use Android is so that you can actually switch out the awful stuff that ships with your device. Leaving Apple to join the "Google Ecosystem" seems absolutely insane. Google is so terrible at software, so terrible at UI and so terrible at having products.
I get that visual design is a complete preference, but the great thing about Android, to me at least, is that you can get away from Google's totally goofy design and make your own choices.
>Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
Google is starting to make that less and less feasible though, with its move toward restricting app installations.
On the bright side, Apple Silicon is amazing, and it seems like Apple decided in 2021 to make the MBP good again like it was in 2015.
<edit> spelling, since iOS 18 isn't as forgiving as iOS 6
When they release a new feature it needs to be everywhere. That happens every September. The cadence has not changed, but the scope has multiplied since the days when Apple was just making MacOS.
You can 10X your staff, but the coordination under 10X velocity will suffer.
I'm not trying to excuse Apple, but this article attempts to paint the impression that every issue is connected in some kind of serial incompetence, but that simply isn't the case.
I thought Apple was all about privacy. But their software needs location access to function properly?
It remains private because this runs locally. It's not sent up to the cloud.
iOS and Mac used to do a good job with things like animations, now they are horrible. Pre-beta quality.
And dark mode and accessibility settings need to just work. That is a core part of the job of every front end iOS developer, including the ones at Apple.
It absolutely is serial incompetence and the Apple engineering leadership that signed off on it should be ashamed.
The one thing that really changed is that every single company looked at Apple and saw something worth copying. Now there are dozens of phone makers, all seeking to emulate Apple's success, putting effort into UI, polish, and design. This wasn't the case a decade ago. Just compare the bizarre circus of design choices in Android Lollipop (either stock or with manufacturer/user-added layers on top) to iOS 7.
Now Apple is no longer particularly unique, in many regards. And I believe that they were absolutely aware of that and desired to continue being a defining force, instead of being "one of many". It's not that Apple has changed, it is that it hasn't and now desires to force through change.
IMHO, people are thinking about how well thought-out and usable the products and software tend to be. Yeah, Apple makes it so anyone can use it, but their software has always been buggy.
In my mind it is synonymous with style over substance. Bad software packaged in a user hostile interface, sitting atop shitty hardware that looks sleek and fashionable.
It doesn't matter anyway. It's fashionable enough that it will keep selling.
But nonetheless, there are so many more bugs and visual glitches. Battery life is still unstable and feels markedly worse than before. Safari looks cool, but UI buttons being on top of content is foolish for the reasons highlighted in this article. Overall, it’s just much more visually inconsistent than before. And the glass effect on app icons looks blurry until you get 5cm away from the screen and really pay attention to the icons. I definitely won’t be upgrading my Mac any time soon.
I just wish we would get away from this annual upgrade cycle and just polish the OS for a while. We don’t need 1 trillion “features”, especially when they increase the complexity of the user experience. macOS in general did this very well; ever since I switched I’ve been very impressed at how much you can accomplish with the default apps in macOS, all while looking cleaner and leaner than Windows software. No new feature is even close to that balance of power and UI simplicity anymore.
Fucking inexcusable that macOS Metal support for external monitors has been finicky and unstable since the very beginning, and they never resolved it (but at least external monitors were DETECTED; then somewhere in Sequoia things went completely south) - and now it just seems to be completely broken. There are countless Reddit threads. Why can't the Apple engineering braintrust figure this out??
2) There is still no solution for this annoying-as-hell UI problem that I documented years ago on Medium: https://medium.com/@pmarreck/the-most-annoying-ui-problem-r3...
3) I had to buy Superwhisper (which is a nice product, but works a little janky due to how iOS handles keyboard extensions) because Siri's voice dictation is so abysmally worse than literally every other option right now, and has been for years. WTF, Apple?
Hey Tim, I love the Vision Pro too (I own one) but maybe get your head out of that for a bit and polish up the engineering on the rest of your lines!
It's literally a paid wrapper around a completely free program you would also be using for free if Apple wasn't actively hostile to Open Source software distribution.
Something I find worse: being unable to click a target while the animation is running! Because the target only gets focus after the animation is done, you end up spending your time waiting for the animations.
There’s no way I’m (ever) upgrading to Tahoe; I’m just going to hold out as long as possible and hope Omarchy gets as stable and feature-rich as possible in the meantime.
No idea what to do about the mobile situation - I can’t see myself realistically ever using android. Switching off of iCloud and Apple Music would also be pretty tough, although I’ve seen some private clouds lately that were compelling.
I just wish there was a more Linux-minded less-Google oriented mobile operating system
Since there are a lot of die hard Apple fans and engineers on hacker news this is going to get downvoted to hell, but I’m going to say it again.
It looks like Apple doesn’t care about user experience anymore, and the 26 series updates all look like they’ve been developed by amateurs online, not tested at all, and Apple engineers just took long vacations while they’re on the clock. It’s a complete and utter disaster of an operating system.
Isn't Omarchy just config files for a bunch of existing, stable programs? Why wait?
It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe. These things coexisted in time with the Western stuff, but little to nothing in the supply chain was shared; these artifacts were essentially from a separate world.
That's how it felt as a Mac user in the 80s and 90s. In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer. And most software was written pretty much from the ground up for a single platform as well.
And I remember often thinking how much that sucked. My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
Now so much has been standardized - everything is USB or Wifi or Bluetooth or HTML or REST. Chrom(ium|e) or Firefox render pages the same on Mac or Windows or Linux. Connect any keyboard or webcam or whatever via USB. Share files between platforms with no issues. Electron apps run anywhere.
These days it feels like Mac developers (even inside of Apple) are no longer a continent away from other developers. Coding skills are probably more transferable these days, so there's probably more turnover in the Apple development ranks. There's certainly more influence from web design and mobile design rather than a small number of very opinionated people saying "this is how a Macintosh application should work".
And I guess that's ok. As a positive I don't have the cross-platform woes anymore. And perhaps the price to be paid is that the Mac platform is less cohesive and more cosmopolitan (in the sense that it draws influence, sometimes messily, from all over).
Yes, as a long-time Mac user who now uses PCs at home but still uses a work-issued MacBook Pro, I greatly appreciate how Macs since the late 1990s-early 2000s are compatible with the PC ecosystem when it comes to peripherals, networking, and file systems.
However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem. I wish Apple stayed the course with defending "The Macintosh Way"; I am not a fan of the Web and mobile influences that have crept into macOS, and I am also not a fan of the nagging that later versions of macOS have in the name of "security" and promoting Apple products.
What the Mac has going for it today is mind-blowing ARM chips that are very fast and energy efficient. My work-issued MacBook Pro has absolutely amazing battery life, whereas my personal Framework 13's battery life is abysmal by comparison.
What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
As someone who has never really enjoyed using macs, I do agree with this. It's probably why I don't mind them as much these days - Using MacOS in 2025 just kind of feels like a more annoying version of a Linux DE with less intent behind it. The way macs used to work did not jive with me well, but everything felt like it was built carefully to make sense to someone.
Their advantage against Microsoft is that the Mac UX may be degrading, but the Windows UX is degrading much more quickly. Sure modern Mac OS is worse to use than either Snow Leopard or Windows 7, but at least you don't get the "sorry, all your programs are closed and your battery's at 10% because we rebooted your computer in the middle of the night to install ads for Draft Kings in the start menu" experience of modern Windows.
Their advantage against Linux is that while there are Linux-friendly OEMs, you can't just walk into a store and buy a Linux computer. The vast majority of PCs ship with Windows, and most users will stick with what comes with the computer. It definitely is possible to buy a computer preloaded with Linux, but you have to already know you want Linux and be willing to special order it online instead of buying from a store.
They also had an extensive industrial espionage program. In particular, most of the integrated circuits made in the Soviet Union were not original designs. They were verbatim copies of Western op-amps, logic gates, and CPUs. They had pin- and instruction-compatible knock-offs of 8086, Z80, etc. Rest assured, that wasn't because they loved the instruction set and recreated it from scratch.
Soviet scientists were on the forefront of certain disciplines, but tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
DEC etched a great Easter egg on to the die of the MicroVAX CPU because of this: "VAX - when you care enough to steal the very best".
This is a biased take. One can make a similar and likely more factual claim about the US, where nearly every innovation in many different disciplines is dictated and targeted for use by the war industry.
And while there were many low quality knockoff electronics, pre-collapse USSR achieved remarkable feats in many different disciplines the US was falling behind at.
https://en.wikipedia.org/wiki/Timeline_of_Russian_innovation...
I think they were in their own little world, and when they got past that with Unix-based OS X and moved from PowerPC to Intel, they entered their best era.
The PC-based Macs were very interoperable and could dual-boot Windows. They had PCIe and could work with PC graphics cards; they used USB, Bluetooth, and more. Macs interoperated and cooperated with the rest of the computing world. The OS worked well enough that other Unix programs, with a little tweaking, could be compiled and run on Macs. Engineers, tech types, and scientists would buy and use Mac laptops.
But around the time Steve Jobs passed away, they lost a lot of that. They grabbed control of the ecosystem and didn't interoperate anymore. The ARM chips are impressive, but Apple is not interoperating any more. They have PCIe slots in the Mac Pro, but they aren't good for much except maybe NVMe storage. Without strong leadership at the top, they are more of a faceless turn-the-crank iterator.
(not that I like what microsoft has morphed into either)
Right now, the quality and attention to detail have plummeted. There is also a lot of iOS-ification going on. I wish they focused less on adding random features, and more on correctness, efficiency, and user experience. The attention to detail of UI elements in e.g. Snow Leopard, with a touch of skeuomorphism and reminiscent of classic Mac OS, is long gone.
And then OS X came along, with bash and Unix and all, and there was a lot of shared developer knowledge.
But they still managed to keep a very distinctive and excellent OS, for 20 years after that.
The quality has dropped only recently.
This standard function doesn't exist on iOS but has been replaced with AirDrop. It's a big fuck you from Apple to everyone who prefers open standards.
This isn't true - my shining moment as a 10-year-old kid (~1998) was when the HD on our Macintosh went out, and we went down to CompUSA and I picked a random IDE drive instead of the Mac-branded drives (because it was much cheaper), and it just worked after reinstalling Mac OS.
It's certainly better than it was; that said, Apple really try to isolate themselves by intentionally nerfing/restricting MacOS software to Apple APIs and not playing ball with standards.
> My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
My MacBook Pro has an integrated GPU that supposedly rivals that of desktop GPUs. However, I have to use a second computer to play games on... which really sucks when travelling.
Apple doesn't even have passthrough e-GPU support in virtual machines (or otherwise), so I can't even run a Linux/Windows VM and attach a portable e-gpu to game with.
The M5 was released and has a 25% faster GPU than M4. Great, that has no effect on reading HN or watching YouTube videos and VSCode doesn't use the GPU so... good for you Apple, I'll stick to my M1 + second PC set up
Counter example: Blender
It used to have an extremely idiosyncratic UI. I will only say: right-click select.
A big part of the UI redesign was making it behave more like other 3d applications. And it succeeded in doing so in a way that older users actually liked and that made it more productive and coherent to use.
What I am saying is, those are different dimensions. You can have a more cohesive UI while adhering more to standards.
There are still a lot of weird sacred cows that Macs would do very well to slaughter, like the inverted mouse wheel thing or the refusal to implement proper Alt-Tab behavior.
You can have both, follow established standards and norms and be more cohesive.
The problem is simply that the quality isn't what it used to be on the software side. Which is following industry trends but still.
I mounted a 20MB external Apple hard drive:
https://retrorepairsandrefurbs.com/2023/01/25/1988-apple-20s...
... on my MSDOS system, in 1994, by attaching it to my sound card.
The Pro Audio Spectrum 16, weirdly, had a SCSI connector on it.
I don’t think there is any going back for Apple; the company is already too enshittified to get back to being a company with a vision. They got drowned by AI, and their releases and features are subpar compared to the competition. I do care about detail when I’m buying premium products, and Apple just doesn’t cut it any more.
Apple built a phone that would bend in pockets because they used flimsy aluminum without enough internal structure, something they should have had ample experience to avoid from the exact same thing happening to tons of iPods.
Apple insisted on developing a moronic keyboard implementation to save less than a mm of "thickness" that was prone to stupid failure modes and the only possible repair was to replace the entire top half of the laptop. They also refused to acknowledge this design failure for years.
Apple built a cell phone that would disrupt normal antenna function when you hold it like a cell phone.
Apple has multiple generations of laptops that couldn't manage their heat to the point that buying the more expensive CPU option would decrease your performance.
Adding to the above, Apple has a long long history of this, from various generations of macbook that would cook themselves from GPU heat that they again, refused to acknowledge, all the way to the Apple 3 computer which had no heat management at all.
Apple outright lies in marketing graphics about M series chip performance which is just childish when those chips are genuinely performant, and unmatchable (especially at release) in terms of performance per watt, they just aren't the fastest possible chips on the market for general computing.
Apple makes repair impossible. Even their own stores can only "repair" by replacing most of the machine.
Apple spent a significant amount of time grounding their laptops through the user, despite a grounding lug existing on the charging brick. This is just weird.
Apple WiFi for a while was weirdly incompatible, and my previous 2015 MacBook would inexplicably not connect to the same wireless router that any other product could connect to, or would fail to maintain its connection. I had to build a stupid little script to run occasionally to refresh DHCP.
Apple had a constant issue with their sound software that inexplicably adds pops to your sound output at high CPU load or other stupid reasons, that they basically don't acknowledge and therefore do not provide troubleshooting or remedies.
Apple was so obsessed with "thinness" that they built smartphones with so poorly specced batteries that after a couple years of normal use, those batteries, despite reporting acceptable capacity, could not keep up with current demands and the phones would be unusable. Apple's response to this was not to let people know what was going on and direct them to a cheap battery replacement, but to silently update software to bottleneck the CPU so hard that it could not draw too much current to hurt the battery. The underpowered batteries were a design flaw.
Apple software quality is abysmal. From things like "just hit enter a bunch to log in as root" to "we put a web request to our servers in the hot path of launching an app so bad internet slows your entire machine down"
Apple prevents you from using your "Pro" iPad that costs like a thousand bucks and includes their premier chip for anything other than app store garbage and some specialty versions of productivity apps.
Apple has plenty of failures, bungles, poor choices, missteps, etc. Apple has plenty of history building trash and bad products.
The only "detail" apple paid "attention" to was that if you set yourself up as a lifestyle brand, there's an entire segment of the market that will just pretend you are magically superior and never fail and downplay objective history and defend a 50% profit premium on commodity hardware and just keep buying no matter what.
Culture flows top-down. Cook is about growth, progressively flowing toward growth at any cost. It’s not a mystery why things are as they are at Apple.
https://www.joelonsoftware.com/2006/06/16/my-first-billg-rev...
Which, incidentally, is a great primer for younger developers on both what obsessive software quality looks like and why datetimes are a hard problem.
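As a quick, generic illustration of the "datetimes are hard" point (my own minimal sketch in Python, not an example from the linked article; it assumes Python 3.9+ for the standard-library zoneinfo module and uses the America/New_York rules around the 2025-11-02 fall-back):

    from datetime import datetime, timedelta, timezone
    from zoneinfo import ZoneInfo  # standard library since Python 3.9

    NY = ZoneInfo("America/New_York")

    # On 2025-11-02 US clocks fall back, so 1:30 AM happens twice;
    # the fold attribute selects which of the two instants is meant.
    first = datetime(2025, 11, 2, 1, 30, tzinfo=NY, fold=0)
    second = datetime(2025, 11, 2, 1, 30, tzinfo=NY, fold=1)
    print(first.utcoffset(), second.utcoffset())  # different offsets: UTC-4 vs UTC-5

    # "Add one day" keeps the wall-clock time, yet 25 real hours elapse
    # across the transition, visible only after converting both to UTC.
    start = datetime(2025, 11, 1, 12, 0, tzinfo=NY)
    end = start + timedelta(days=1)
    print(end - start)  # 1 day, 0:00:00 (same-zone subtraction is wall-clock)
    print(end.astimezone(timezone.utc) - start.astimezone(timezone.utc))  # 1 day, 1:00:00

Same inputs, two defensible answers depending on which question you're asking; sweating that kind of detail is exactly what obsessive review is supposed to catch.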
That was when the design team began what I call the "one-off UI design" rather than use the same language across all apps.
Never mind the round mouse before that and the blind USB ports on the back of the newer iMacs (hate that scritchity sound of stainless steel on anodized aluminum as I try to fumble around with the USB connector trying to find the opening).
I mean fuck, even their failures don't seem to matter much (Vision Pro, Siri) to their stock price.
We'll get a foldable phone, and some new emoticons. Some font tweaks...
They think we're going to love it.
In the 90s Apple was in worse shape. They couldn’t even compete with Windows 9x for stability. There were memes about how Mac OS needed just as many reformats as Windows 98.
The problem isn’t Apple’s attention to detail, it’s that people hold Apple to a higher standard. But in reality they’re just as fallible as every other software company.
This means the author never considered checking how it looks on any non-Apple OS. Meanwhile Apple has a setting, enabled by default, that artificially makes a pseudo-bold font out of a normal font: https://news.ycombinator.com/item?id=23553486
And no, you don't know better than me about this cool feature.
They know. When a designer makes one of those prompts with only a "not now", they tend to mean a very specific thing, that is at the same time a subtle message to the user and a passive-aggressive stab at the company they work for.
What they mean: "the code path that handles what happens when you say 'no' here has been deprecated, because support for this feature was not part of the planning of a recent major-version rewrite of this app's core logic. When that rewrite is complete/passes through this part of the code, the option to say no will go away, because the code that would handle that decision will be gone. So, in a literal sense, we think it's helpful to keep bugging you to switch, so that you can get used to the change in your own time before we're forced to spring it on you. But in a connotational sense, we also think it's helpful to keep bugging you to switch, as a way of protesting the dropping of this feature, because every time users see this kind of prompt, they make noise about it — and maybe this time that'll be enough to get management's attention and get the feature included in the rewrite. Make your angry comments now, before it's too late!"
I tend to ignore these kinds of things, but sometimes applications are unresponsive, lose focus, iOS apps don't show the keyboard, and so on, to the point where I cannot take it anymore.
I wanted to open a PDF from the Files app on my iPad. It opened in the Preview app, but I couldn't scroll through the file. I tried to close it, but the back button went to the Preview app, not to Files. Then I closed the app and tried again from Files, but it kept opening that separate app instead of the in-app PDF viewer. I don't think I had ever seen malfunctioning states or broken application flows like this in the default iOS apps before.
The new Reminders app is a joke. It has weird behavior that randomly jumps from date selection to time selection, and sometimes selects random dates.
It's like they did `claude new-reminder-app.md --dangerously-skip-permissions` and went "is it working? Working! Release it!" I know (hope) that's not the case, but for the last few weeks it has felt like that.
And to be honest, it never really existed. It was more that everything else was cheaply manufactured garbage.
These days it feels like various teams are responsible for their part and they are managing toward a delivery date. As long as they check the box that the feature is there... ship it. There is likely not anyone around to throw the product in a fish tank if it isn't up to par.
- When an iPad is presented to you to enter your parent code to unlock an app, the name of the app isn't shown, because the PIN prompt covers the app/time-to-unlock details.
- It's not possible to set screen time restrictions for Safari.
- If apps are not allowed to be installed, app updates stop. I have to allow app installations, install updates, then block app installations again.
- Setting downtime hours just doesn't seem to work. Block apps from 6pm - 11.59pm? Kid gets locked out of their iPad at school for the whole day.
- Most of the syncing of settings from a computer to the target iPads appears to be completely broken. If an iPad is in downtime and the scheduled downtime changes, it does not take the iPad out of downtime.
- Downtime doesn't allow multi-day hour settings. For instance, try setting downtime from 8pm - 8am, a window that wraps past midnight (see the sketch after this list).
- Popups in the screen time settings of MacOS have no visual indication that there is more beneath what can be seen. There is no scrollbar. You have to swipe/scroll on every popup to see if there are more settings hidden out of view.
- No granular downtime controls for websites. You can block Safari, or you can not block Safari.
Edit: Oh I almost forgot this nifty little bug reported back in 2023: https://discussions.apple.com/thread/255049918?sortBy=rank
Screentime randomly shows you a warning about being an administrator... no probs you just need to select another account and then re-select the one you want and it'll go away.
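To make the 8pm - 8am complaint concrete: a downtime window that crosses midnight can't be checked with a naive start <= now < end comparison. Purely as a hypothetical sketch in Python (this has nothing to do with Apple's actual Screen Time code), the scheduler needs something like:

```python
from datetime import time

def in_downtime(now: time, start: time, end: time) -> bool:
    """Return True if `now` falls inside a downtime window that may wrap
    past midnight (e.g. 20:00-08:00). Hypothetical sketch, not Apple's code."""
    if start <= end:
        # Simple same-day window, e.g. 18:00-23:59.
        return start <= now < end
    # Wrapping window: downtime late in the evening OR early in the morning.
    return now >= start or now < end

# in_downtime(time(22, 0), time(20, 0), time(8, 0)) -> True
# in_downtime(time(12, 0), time(20, 0), time(8, 0)) -> False
```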
Presumably this is because apps could add individual features parents don't approve of between updates.
If you're locking down what apps you want your kids to use (to an individual whitelist of apps, not just by maturity rating), you're essentially stepping into the role of an enterprise MDM IT department, auditing software updates for stability before letting them go out.
What would you propose instead here?
I presume you'd personally just be willing to trust certain apps/developers to update their apps without changing anything fundamental about them. But I think that most people who are app-whitelisting don't feel that level of trust toward apps/developers, and would want updates to be stopped if-and-only-if the update would introduce a new feature.
So now, from the dev's perspective, you're, what, tying automatic update rollout to whether they bump the SemVer minor version or not? Forcing the dev to outline feature changes in a way that can be summarized in a "trust this update" prompt notification that gets pushed to a parent's device?
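If anyone wants that made concrete, here's a hypothetical Python sketch of the gating policy being described (the function name and the patch-only rule are mine, not any real MDM or App Store API): only patch-level bumps roll out automatically, anything else waits for a parental decision.

```python
def auto_approve_update(installed: str, candidate: str) -> bool:
    """Hypothetical policy: auto-approve an update only if it's a patch-level
    bump (same major.minor); a minor or major bump implies possible new
    features and waits for an explicit 'trust this update' decision."""
    old = tuple(int(part) for part in installed.split("."))   # assumes plain x.y.z versions
    new = tuple(int(part) for part in candidate.split("."))
    return new[:2] == old[:2] and new >= old

# auto_approve_update("2.3.1", "2.3.2") -> True   (patch fix, roll it out)
# auto_approve_update("2.3.1", "2.4.0") -> False  (new minor version, ask the parent)
```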
I really haven't had many problems, and I actually like some of the features. Sure, the UI/UX is not perfect from the start, but there hasn't been anything I have been unable to accomplish because of the new OS. The liquid glass can even be nice with certain backgrounds too.
This is just my hypothesis, but I have noticed that a lot of the people who have been complaining about macOS rely on 3rd-party applications for much of their workflow. If I am not mistaken, there were issues with many Electron apps in the beginning. On macOS, I mainly use Apple's apps or I'm deep in the command line. So perhaps I have been fortunate to avoid many of the UI/UX issues that others have faced?
macOS is essentially an iCloud client and sales funnel these days, it's clear that's all that Apple sees it as.
It was pancreatic cancer IIRC.
Just one example: I was excited by the idea of having two apps on screen at the same time: there are two I like to look at side-by-side all the time. But one of them (an iPhone app) randomly decides to switch to landscape mode, making the layout unusable. More generally, the window controls keep getting activated unexpectedly by taps when I use full-screen apps like games, resulting in the window reverting to not-full-screen. So I guess I'll just have to turn that feature off until it's actually usable.
Maybe the Windows Vista of Tablet OSs though.
It is terrible; it does not add anything visually or functionally to the Apple experience.
Oddly, KDE Plasma is more pleasing and consistent.
Kind of bizarre that they have destroyed their reputation for software perfection.
These are all things which have been broken for years.
He already covered this: https://youtu.be/K1WrHH-WtaA?si=tHrGBNmLlIfp4NSv
Steve truly is dead.
[1] https://cdn.social.linux.pizza/system/media_attachments/file...
1) battery warning above tabs in browser with no x to close it
2) WebKit bugs that make inputs and visual diverge so you have to click under the input to hit it
3) flickering email app when it’s opened
My Apple monitor has USB ports on the back side. Sigh.
My mouse had a charger cable on the bottom. Sigh.
My keyboard has no dedicated copy and paste keys. Sigh.
My keyboard has no dedicated undo and redo keys. Sigh.
At one point I had to start iTunes to update my OS. Sigh.
Really, the next time someone says Apple nails UX I am just going to cry.