(Perhaps charismatic leaders neglect it a bit on purpose because they enjoy being portrayed as irreplaceable.)
The people in power don't have a burning yes.
First, I quickly tap on the first button that has the picture of the credit card and its name. As a result I find myself in a menu that shows me the billing address (go figure)! So, I have to click back, and use the button below that simply states “Change the credit card” or something to that effect.
Why, for the love of god, does Apple use a picture of the credit card for the billing-address info? And why is the billing address even the first option!?
So, multiple clicks when it can be avoided by a proper design (I think in the past the picture button was the one that changed credit cards, but I don’t know if I am misremembering).
Every time it happens I think about Steve Jobs.
However, it seems that under Tim Cook, Apple has gradually lost many of its traditional values when it comes to usability and UI/UX perfectionism. I suspect that the company has not passed on "The Apple Way" to people who joined the company after Steve Jobs' passing. Not only that, there doesn't seem to be an "Apple Way" anymore.
Come to think of it, the old Apple had figures like Bruce Tognazzini who wrote about "The Apple Way"; I have a copy of Tog on Interface that distills many of the UI/UX principles of the classic Mac. I can't think of any figures like Tog in the modern era.
Gradually the Apple software ecosystem is losing its distinctiveness in a world filled with janky software. It's still better than Windows to me, but I'd be happier with Snow Leopard with a modern Web browser and security updates.
It's sad; the classic Mac and Jobs-era Mac OS X were wonderful platforms with rich ecosystems of software that conformed to the Apple Human Interface Guidelines of those eras. I wish a new company or a community open-source project would pick up from where Apple left off when Jobs passed away.
There's the Hello System[0]... not sure if it counts.
Instead, they should have stayed on the Straight and Narrow of Quality - where they were for many years - where you move up to computing paradise by having fewer features but more time spent perfecting them.
We see what you did there!
Yep. The attention to detail is still there; it has just shifted from polishing and curating details to creating a lot of small, unpolished, uncalled-for, and thus very annoying details. From an MBA's POV there isn't much difference, and the latter even produces better KPIs.
Not entirely, though; there is joy and playfulness at the core of Liquid Glass. But delight is not that common, and refinement and focus are definitely lacking. They could have used more nos and fewer yeses.
Either everyone is worried about the consequences of failing to produce high quality work (including at the VP level, given they can allocate additional time/resources for feature baking) or optimizing whatever OKR/KPI the CEO is on about this quarter becomes a more reliable career move.
And once that happens (spiced with scale), the company is lost in the Forest of Trying to Design Effective OKRs.
I generally see complaints about advancement aimed at the hardware. Some are unreasonable standards, some are backlash to the idea of continuing to buy a new iphone every year or two as the differences shrink, but either way software feature spam is a bad response.
That's the bed they made for themselves, and they lie in it willingly.
No one is forcing them to do huge yearly releases. No one is forcing them to do yearly releases. No one is forcing them to tie new features which are all software anyway to yearly releases (and in recent years actual features are shipping later and later after the announcement, so they are not really tied to releases either anymore).
The stock market can easily be taught anything. And Jobs didn't even care about the stock market, or stockholders (Apple famously didn't even pay dividends for a very long time), or investors (regularly ignoring any and all calls and advice from the largest investors).
You need political will and taste to say a thousand nos to every yes. None of the senior citizens in charge of Apple have that.
Their software side however is riddled with issues, delayed or cancelled projects, meandering focus, unclear priorities. They are increasingly overpromising and underdelivering.
Their product side is neither here nor there. Vision Pro is a technically marvelous bust. iPhones rely on gimmicks because there's really nothing to differentiate them anymore. Peripherals (HomePod, Apple TV) are stagnant. The iPad suddenly saw signs of life this year with good functionality updates after a decade of "we have no idea what to do with it, here are meaningless and confusing hardware updates". MacBooks were completed as a product years ago (not that it's a bad thing), so, again, they are just randomly slapping nonsensical names on upgrades and chasing thinness.
Oh. Thinness. That is literally the only feature that Apple is obsessed with. You can't build a product strategy on thinness alone.
They could easily wait longer between releasing devices. An M1 Macbook is still in 2025 a massive upgrade for anybody switching from PC - five years after release.
If Apple included fully fledged apps for photo editing and video editing, and maybe small business tools like invoicing, there would be no reason for any consumer in any segment to purchase anything other than a Mac.
They could, but then they wouldn't be a trillion dollar company. They'd be a mere $800bn company, at best. ;)
Not many consumers go out to buy an Apple device because the new one has been released. They go out to buy a new phone or new computer because their old one gave out and will just take the Apple device that is for sale.
That's also why Apple bothers to do the silent little spec-bump releases: it gives Business Leasing corporate buyers a new SKU to use to justify staying on the upgrade treadmill for their 10k devices for another cycle (rather than holding off for even a single cycle because "it's the same SKU.")
And then what? Mac users would buy some janky Acer with Windows 11 and a bunch of preinstalled malware instead?
1. They've stopped starting small and instead started unrealistically large. Apple Intelligence is a great recent example.
2. They've stopped iterating with small improvements and features, and instead decided that "iterating" just means "pile on more features and change things".
And that's not excusable - every feature should have a maintainer who knows that a large framework update like Liquid Glass can break basically anything, re-tests the app under every scenario they can think of (and as "the maintainer" they should know all the scenarios), and pushes to fix any bugs found...
Also, a company as big as Apple should eat its own dogfood and make its employees use the beta versions to find as many bugs as they can... If every Apple employee used the beta version on their own personal computer before release, I can't realistically imagine how the "Electron app slowing down Tahoe" issue wouldn't have been discovered before the global release...
I mean, some people are just impossible to please!
https://media.ycharts.com/charts/441687ba735392d10a1a8058120...
“The companies forget how to make great products. The product sensibility and product genius that brought them to this monopolistic position gets rotted out by people running these companies who have no conception of a good product vs. a bad product. They have no conception of the craftsmanship that’s required to take a good idea and turn it into a good product. And they really have no feeling in their hearts about wanting to help the customers.”
- Steve Jobs - https://en.wikipedia.org/wiki/Steve_Jobs:_The_Lost_Interview
That said, I wonder, Jobs lived through Apple's transformation, but not its peak phase where Apple was simply printing money year after year after year. I do wonder if Jobs in 2016 would have been able to keep the organization performing at such a high caliber.
Even he seemed like he was making unforced errors at times too, like the "you're holding it wrong" fiasco, but it's hard to say since he didn't live through Apple 2013-2019, when it became an ever-increasing money-printing machine.
In the age of AI, COVID-19, etc., I wonder how Jobs would have handled things post-2020.
When I interviewed at a smaller company, someone high up interviewed me last. I passed everything on paper afaik, but he didn't think I was the right person for some reason. Which is fine for a small company.
I was in a (tech) meetup last week. We meet regularly, we are somewhere between acquaintances and friends. One thing that came up was a very candid comment about how "we should be able to tell someone 'that is just stupid' whenever the situation warrants it".
I believe that does more good than harm, even to the person it is being directed to. It is a nice covenant to have, "we'll call you on your bs whenever you bring it in", that's what a good friend would do. Embracing high standards in a community makes everyone in it better.
The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
I wish he'd bless a certain Linux distro for PCs so we can have some default. Current default is kinda Ubuntu, but they've made some weird decisions in the past. Seems like he'd make reasonable choices and not freak out over pointless differences like systemd.
You can tell someone their idea is substandard without implying they're stupid, which is generally taken to be an insult. Tact in communication does matter. I don't think anyone needs to say "that is just stupid" to get a point across.
I've had plenty of tough conversations with colleagues where it was paramount to filter through ideas and determine which were viable. Not once did anyone have to punch at someone's intelligence to make the point. Even a simple "That's a bad idea" is better than that.
>whenever the situation warrants it
Which will of course be up to interpretation by just about everyone. That's the problem with so-called "honest"[0] conversation. By using better language you can avoid this problem entirely without demeaning someone. Communication is a skill that can be learned.
>The Linux kernel would be absolutely trash if Linus were not allowed to be Linus. Some contexts do and must require a high level of expertise before you can collaborate properly in them.
Linus took a sabbatical in 2018 to work on his communication and lack of emotional empathy. He's had to make changes or he absolutely risked losing the respect of his peers and others he respected. He has worked on improving his communication.
To follow Linus as an example would be to work on communication and emotional empathy, not to disregard your peers.
[0]: Most often, I find people who are adamant about this line of thinking tend to want an excuse to be rude without accountability.
In all of those projects and organisations which value respectful language and inclusivity and all sorts of non-results-oriented crap, not much usually gets done. This is how you get design-by-committee, lowest-common-denominator slop.
And even if you don't agree with what I'm saying here, "avoid criticising people" quickly turns into "avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket and the product would be greatly improved.
Those two things are not coupled. You can maintain a sense of politeness in face of conflict. This is the entire basis of Nonviolent Communication, a great book about handling and resolving conflict in such a manner.
It’s extremely effective in my experience and results in overall better clarity and less conversational churn.
>Why would anyone do that if they can't even be called out for messing something up, yet alone being held accountable
You can be; that is in part the definition of accountability. You're conflating a lack of accountability with the idea that accountability requires behaving in a manner that may be construed as rude, and that's simply not true.
So like anything, you hold them accountable. You can do that without being rude.
>In all of those projects and organsiations which value respectful language and inclusivity and all sorts of non-results-oriented crap
I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
Abusive and abrasive language does not do that.
>This is how you get design-by-committee lowest common denominator slop
No, in my experience (and many reports from others) you get this for a myriad of reasons, but a consistent theme is lack of ownership or organizational politics, not people leveling up their communication skills.
>avoid criticising their ideas because they might feel offended". There was a recent HN thread about AI-written pull requests and how people have problems with them, because tactfully rejecting 10k lines of pure bullshit is very hard without making the submitter upset. Guess what, if they were allowed to say "no, you aren't going to be merging slop you can't even explain" the average code quality would skyrocket
I don’t disagree with you because I don’t believe in proper criticism; I do. I disagree with you because the implicit messaging I’m getting here is the following:
- you sometimes have to be a jerk
- therefore it’s okay to be a jerk sometimes
- having an expectation of treating others with respect somehow equates to poor accountability
I’ve spent a good chunk of my years learning a lot about effective communication and none of it is about avoiding accountability, of yourself or others. It’s about respecting each other and creating an environment where you can talk about tough things and people are willing to do it again because they were treated respectfully
Yes, I'm conflating the two. Maybe they aren't intrinsically coupled, but from what I have seen, that seems to happen. When you forbid surface-level aggression, people don't stop being aggressive or frustrated. They just turn to more underhanded forms of aggression, like bullying or gaslighting.
>I’m getting a sense you have a predisposition to disliking these things. They’re really important because they are, when correctly understood, results oriented. It frees up people to feel comfortable saying things they may not have been otherwise. That is very productive.
I see what you're saying, but I do not think this plays out like that in practice. It's not results-oriented thinking when you prioritise how the other person feels above all. Not that you should never prioritise it (never doing so is called sociopathy), but if you prioritise it too much, you can say fewer things, not more, because disagreeing with someone always carries the risk of offence, especially if you are going to say that their idea isn't the best. If you nurture a culture of honesty - and that does not include being abusive - then people will feel free to push back on bad ideas, and that is results-oriented thinking.
>I disagree with you because the implicit messaging I’m getting here is the following
The first two points, absolutely, but I would like to push back on the third. Not every idea deserves respect, and hell, not everyone deserves respect either! That is the hollowing out of what the word originally meant. Ideally, you only respect people who deserve respect, who return it to you in turn. To be respected is an honour, not a right, because it carries implicit trust in your words. If you consistently have a negative impact on the environment, I do not think it is a reasonable expectation that people continue to treat you with respect, because that wastes everyone's time.
And if someone "consistently has a negative impact on the environment" you can still confront them without being abrasive. They can still be fired without calling them stupid. Adding that kind of tone adds no information except that you lost your cool. You're making it sound like every instance that warrants confrontation is about an intentional and repeated offence.
I prefer the former by a lot, but of course you're free to spend your time in the latter.
What's wrong with calling an idea stupid? A smart person can have stupid ideas. (Or, more trivially, the person delivering a stupid idea might just be a messenger, rather than the person who originally thought of the idea.)
Though, to be clear, saying that an idea is stupid does carry the implication that someone who often thinks of such ideas is, themselves, likely to be stupid. An idea is not itself a mind that can have (a lack of) intelligence; so "that's stupid" does stand for a longer thought — something like "that is the sort of idea that only a stupid person would think of."
But saying that an idea is stupid does not carry the implication that someone is stupid just for providing that one idea. Any more than calling something you do "rude" when you fail to observe some kind of common etiquette of the society you grew up in, implies that you are yourself a "rude person". One is a one-time judgement of an action; the other is a judgement of a persistent trait. The action-judgements can add up as inductive evidence of the persistent trait; but a single action-judgement does not a trait-judgement make.
---
A philosophical tangent:
But what both of those things do — calling an idea stupid, or an action rude — is to attach a certain amount of social approbation or shame to the action/idea, beyond just the amount you'd feel when you hear all the objective reasons the action/idea is bad. Where the intended response to that "communication of shame" is for the shame to be internalized, and to backpropagate and downweight whatever thinking process produced the action/idea within the person. It's intended as a lever for social operant conditioning.
Now, that being said, some people externalize blame — i.e. they experience "shaming messaging" not by feeling shame, but by feeling enraged that someone would attempt to shame them. The social-operant-conditioning lever of shame does not work on these people. Insofar as such people exist in a group, this destabilizes the usefulness of shame as a tool in such a group.
(A personal hypothesis I have is that internalization of blame is something that largely correlates with a belief in an objective morality — and especially, an objective morality that can potentially be better-known/understood by others than oneself. And therefore, as Western society has become decreasingly religious, shame as a social tool has "burned out" in how reliably it can be employed in Western society in arbitrary social contexts. Yet Western society has not adapted fully to this shift yet; which is why so many institutions that expect shame to "work" as a tool — e.g. the democratic system, re: motivating people to vote; or e.g. the school system, re: bullying — are crashing and burning.)
The likeliest outcome from that is the other person gets defensive and everything stays the same or gets worse. It’s not difficult to learn to be tactful in communication in a way which allows you to get your point across in the same number of words and makes the other person thankful for the correction.
Plus, it saves face. It’s not that rare for someone who blatantly says something is stupid to then be proven wrong. If you’re polite and reasonable about it, when you are wrong it won’t be a big deal.
One thing I noticed about people who pride themselves in being “brutally honest” is that more often than not they get more satisfaction from being brutal than from being honest, and are incredibly thin-skinned when the “honest brutality” is directed at them.
> The Linux kernel would be absolutely trash if Linus were not allowed to be Linus.
I don’t understand why people keep using Torvalds as an example/excuse to be rude. Linus realised he had been a jerk all those years and that that was the wrong attitude. He apologised and vowed to do better, and the sky hasn’t fallen nor has Linux turned to garbage.
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
But that's a special case, not a usual one. Unfortunately, quite a lot of people say things are stupid when they don't understand them (often because of an inflated sense of their own expertise). If they can politely explain why they think an idea is bad, they are more likely to be listened to, and they can save face if the other person successfully counters their argument.
Bottom line is, if you go around calling ideas stupid you better make damn sure you're never wrong, otherwise, well... that's just stupid :)
This is a huge misunderstanding at best and a malicious re-framing of serious issues within portions of the tech industry at worst.
In what context is virtually everyone pushing to hire demonstrably unqualified people?
Any time anyone is celebrated for being an X in this role rather than being good at the role, this is being pushed for.
I'm sceptical. I've never seen what you describe outside of toxic "culture war clickbait videos"; what I have seen is nepotism, class privilege, and sprint culture pushed by investors - you know, the exact opposite of what you describe.
If you take a hardline attitude on keeping the gates up, you're just going to end up with a monoculture that stagnates.
Sure, they lack wisdom, but that doesn't mean they aren't smart, it just means they're young.
Gatekeeping doesn't have to mean "Don't hire anyone under 35" it means "Don't hire people who are bozos" and "don't hire people who don't give a shit"
I’ve worked at places that have the opposite philosophy - hire quickly and fire quickly. That works in terms of hiring people who already happen to be what you want them to be. It just leaves no room for anyone who could be, but isn’t yet, what you want them to be. It also leaves no room for anyone who is different from what you are looking for but who could still bring a lot to the table if you just take the time to figure out what that is, which I think describes a lot of people. You might have hired a mediocre programmer who would be a rockstar at documentation, for example. That kind of thing happens all the time, yet workplace culture and practices tend not to accommodate that. By all means have standards, but put in some effort to help your people reach them in their own way.
If Apple was made up of only top-end engineers led by a quality-obsessed maniac, would they put out better or worse products?
Of course, not everyone can follow this philosophy, but they don't have to, and most don't want to anyway.
The great engineers don’t graduate from college knowing everything they need to know, nor are they born with that knowledge. It takes time and help from other people to get them there. Even if they were already a top performing engineer at Netflix, that doesn’t mean they can smoothly transition into a role at your company and perform well with zero assistance. The on-ramp matters and has a huge impact on how they will perform. Some people will require more investment than others, but that’s true regardless of whether you stubbornly try to maintain your existing monoculture. And I firmly believe that everyone brings something different to the table. It’s mostly a matter of figuring out what that is for each person.
My take away is that diversity at a global level, and in some specific contexts, is a great thing. But diversity in some other specific contexts is entirely destructive and analogous to rot or decomposition.
When we rely on a core societal function (firefighting, accounting, waterworks maintenance, property rights, etc.) the people responsible for maintaining these functions need to maintain in themselves a set of core characteristics (values as patterns of action), and there is room to play outside of those cores, but those cores shouldn't be jeopardized as a tradeoff for diversity and inclusion.
For example, if the constructive core values of a railroad system are consistency and reliability, then these shouldn't be diminished in the name of diversity and inclusion; but if diversity and inclusion can be achieved secondarily without a tradeoff (or even in a way that somehow amplifies the core values), then it is constructive. One has to thoughtfully weigh the tradeoffs in each context, and ensure that the values most important to maintaining the relevant function in that context are treated as most important. The universe seems to favor pragmatism over ideology, at least in the long run.
So in a company, if the core values that make it successful are diluted in exchange for diversity, it's no longer what it was, and it might not be able to keep doing what it did. That said, it also might have gained something else. One thing diversity tends to offer huge complex systems is stability, especially when it's incorporated alongside other values and not held up singularly.
In other words, my take on diversity (and by extension, inclusion) is that we need a diversity of diversity. Sometimes a lot of diversity is best, and sometimes very little diversity is best.
I do not for the life of me understand your point. Gatekeeping, as it's most commonly used, means controlling access to something (be it a resource, information, etc.) to deliberately and negatively affect others who are not part of a "blessed" group. It's not objective, and it certainly is not a practice reliant on merit. It's an artificial constraint applied selectively at the whim of the gatekeeper(s).
>There's been a shift where everyone wants to welcome everyone, but the problem is it erodes your company culture and lowers the average quality.
The first assertion and the second one are not related. Being welcoming to everyone is not the same thing as holding people to different standards. Company culture sets company inertia and how employees are incentivized to behave and what they care about. You can have the most brilliant engineers in the world, like Google most certainly does have its fair share, and as we have seen, with the wrong incentives it doesn't matter. Look at Google's chat offerings, the Google Graveyard, many of their policies becoming hostile to users as time goes on etc.
Yet you can have a company with what you may deem "average quality" that excels at its business goals because it has oriented its culture to do so. I don't think Mailchimp was ever lauded for its engineering talent like Google has been, for example, but they dominated their marketplace and built a really successful company culture, at least before the Intuit acquisition.
Another example is that a hobby I loved is now dead to me for lack of gatekeeping; Magic the Gathering. Wizards of the Coast started putting out products that were not for their core playerbase, and when players complained, were told "these products are not for you; but you should accept that because there's no harm in making products for different groups of people". That seems fair enough on its face. Fast forward a couple of years, and Magic's core playerbase has been completely discarded. Now Magic simply whores itself out to third party IPs; this year we'll get or have gotten Final Fantasy, Spiderman, Spongebob Squarepants, and Teenage Mutant Ninja Turtles card sets. They've found it more lucrative in the short-term to tap into the millions of fans of other media franchises while ditching the fanbase that had played Magic for 30 years. "This product is not for you" very rapidly became "this game is not for you", which is pretty unpleasant for people who've been playing it for most or all of their lives.
Also, it became the best selling set of all time even before it was out. Which isn’t an indicator of quality, for sure, but it does show Wizards understands something about their market.
I'm not sure Wizards does understand their market. As you noted, a set doing numbers pre-release has absolutely nothing to do with its quality; it just means there are a lot of Final Fantasy fans interested in collecting cards. But this is not necessarily sustainable for another 30 years, because those Final Fantasy fans are not necessarily going to stick around for Spiderman, and Spiderman fans are not necessarily going to stick around for Spongebob. The Spiderman set was already such a massive flop that they were trying to identify and blame which content creators/streamers were responsible for negatively influencing public opinion, as though that couldn't have happened organically.
At this point, I'm done with WotC. The Pinkerton thing was by far the worst and what made me turn my back forever. Bad rulesets or design decisions with which I disagree are one thing, but I refuse to do business with a company that thinks it's acceptable to use force to try to bully people into sticking to their release schedules. They can pound sand forever.
It's good to not exclude people for arbitrary reasons (though even this requires the caveat that one man's "arbitrary" is another man's "important part of our identity"). But we also need to recognize that it's ok for something to not be everyone's cup of tea. There isn't some kind of moral mandate that everything must be maximally welcoming to all. Unfortunately, we don't recognize that in our current culture, and in fact we stigmatize it as "gatekeeping" which is deemed to be toxic. But the culture is wrong about this.
It used to be hard and a liability to be a nerd
I'm pretty sure this would also render the dot-com bubble the nerds' fault?
Let's not go back to how nerd culture used to be regarding diversity... or lack thereof.
I remember when Bill Gates was on magazine covers, viewed as a genius, a wonderful philanthropist, even spoofed in Animaniacs as "Bill Greats."
I guess my point is, "It used to be hard and a liability to be a nerd" was never true, and is nothing but industry cope. The good old days were just smaller, more homogenous, had more mutually-shared good old toxicity and misogyny (to levels that would probably get permabans here within minutes; there's been a lot of collective memory-holing on that), combined with greater idolization of tech billionaires.
What changed in 2010?
https://arstechnica.com/gadgets/2018/09/linus-torvalds-apolo...
—Steve Jobs
Successful publicly traded companies have a responsibility to generate more revenue and increase the stock price every year. Year after year. Once their product is mature after so many years, there aren't new variations to release or new markets to enter into.
Sales stagnate and costs stagnate; investors get upset. Only way to get that continual growth is to increase prices and slash costs.
When done responsibly, it's just good business.
The problem comes in next year when you have to do it again. And again. Then the year after you have to do it again. And again.
Such as all things in life, all companies eventually die.
I don't use a Mac anymore, but I do use an iPhone. This is the worst version of iOS I can recall. Everything is low contrast and more difficult to see. The colors look washed out. The icons look blurry. In my opinion, Liquid Glass is a total bust. I don't know what these people are thinking. Times have certainly changed.
OP was talking about design languages
I think we're stuck with the notch forever on iPhones. Even if Apple uses an on-screen fingerprint reader in the future like a billion phones already do, they're not going to go back from the face scanner. The only thing that will work is if the face scanner can read from behind the display.
Maybe it's because I use dark mode? I can only tell it's there if I move my mouse under it.
Here's a "workaround" that might help [1]. It entirely excludes the notch area from use.
Got the idea from the Top Notch app, which no longer seems to work: https://topnotch.app/
Go to Settings > Displays. In the list of resolutions you need to enable “Show all resolutions” then you can select one that will hide the notch
ironically I don't really mind the new design language, whatever, if the damned thing worked.
I believe 2026 will finally be the year of Linux desktop.
I’ve been hearing this substituting in YYYY+1 every YYYY for the last quarter century.
The year of Linux desktop will never come. Why?
- Money. Hardware manufacturers make more money selling computers that are optimized for Windows, and there is nothing on the horizon that will change that, meaning the Linux desktop experience is always the worst of the three main options for hardware compatibility.
- Microsoft. Call me when Office runs natively in Linux. You might be happy with LibreOffice or Google Docs, but MS Office still dominates the space (and as someone who does a lot of writing and has a number of options, I find Word to be better than any of the alternatives, something that 30 years ago I would have scoffed at).
- Fidgetiness. All the tweaking and customizing that Linux fans like is an annoyance for most people. Every customization I have on my computer is one more thing I need to keep track of if I get a new computer and frankly it’s more than a little bit of a pain.
Everything seems to be lazily done now - by that I mean, a modal pops up and then resizes to fit the content. Never seen this before.
Or, you open settings (settings!) and it's not ready to use until a full second later because things need to pop in and shift.
And it's animated- with animation time, so you just have to wait for the transitions to finish.
And "reduce motion" removes visual feedback of moving things (e.g. closing apps) so I find it entirely unusable.
And as others have noted the performance is completely unacceptable. I have a 16 pro and things are slow... And forget "low battery mode" - it's now awful.
I'm not doing anything weird and keep like all apps closed and things off when I don't use them and battery life is significantly worse. (Noticed the same on M4 + Tahoe, upgraded at the same time)
Very disappointed and I very much regret upgrading.
In this case the inefficiency was attention to detail but in other companies it might be something else.
For several years, there's been an issue with audio message recording in iMessage. Prior to iOS 26, it would silently fail; the recording would "begin" but no audio would be captured. This would happen 3, 4, even 5 times in a row before it would actually record audio.
Apple is clearly aware of the issue, because in iOS 26 the failure is no longer silent. Now, you'll get feedback that "Recording isn't available right now". Yet the incidence is...exactly the same. You might have to try 5 times before you're actually able to record a message.
It's simply infuriating, and it makes no sense to a user why this would be happening. Apple owns the entire stack, down to the hardware. Just fix the fucking bug!
There's little problems that keep accumulating, like the camera app opening up and only showing black until restarting it, at which point I've missed the candid opportunity.
I'm not going anywhere, it's still the right mix of just-works across their ecosystem for me, but dang, the focus does feel different, and it's not about our experience using Apple.
[1] https://discussions.apple.com/thread/256140468?sortBy=rank
Also, I have the iPhone 15 Pro (iOS 26.0.1), never had the black screen on camera open yet. That's the kinda thing I'd get on Android.
DHH was someone I kinda used to read online, but he's been going all-in on these racist talking points, e.g., https://paulbjensen.co.uk/2025/09/17/on-dhhs-as-i-remember-l... :(
Judging by the Omarchy presentation video, it feels too keyboard-oriented. Hotkeys for everything. And hotkeys for AI agents? It is opinionated indeed. Not my cup of tea.
I feel like that loses a majority of people right there. I like the option to do common things with the keyboard, or to configure things with a file. But for things I don't do often, holding a bunch of keyboard shortcuts in my head feels like a waste for those rare things.
I'm not sure about anyone else, but I can't run whatever Linux distro I want at work. When an OS relies on muscle memory to get smooth and fluid operation, it seems like that would make it extra hard to jump between for work vs home. I spent years jumping between OS X and Windows, and I found it's much nicer now that I'm on the same OS for work and home. Even the little things, like using a different notes app at home vs work do trip me up a little, where I'll get shortcuts wrong, or simply not invest in them, because of the switching issue. Omarchy feels like it would be that situation on steroids.
You outgrew this myth, congratulations!
> Look, I've got nothing but respect for the perfectly lovely humans who work at Apple. Several are classmates from university, or people I had the pleasure of working with before at different companies. But I rather suspect what's happened here is that some project manager ... convince Tim
But haven't outgrown this one yet, well, maybe in another 8 years...
Luckily Safari's Reader Mode didn't bug out
https://media.nngroup.com/media/editor/2025/10/06/1-messages...
It's fascinating to me because that's the single thing that every user goes through. It's the main branch, not some obscure edge case. How do you do testing such that you miss that?
There was one that was about all the annoying security pop-ups Windows (used to?) have. (FWIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230 .)
Lately I've gotten so many of these popups on Mac that it both annoys and amuses the hell out of me. "Die a hero or live long enough to see yourself become the villain", I guess.
But, man, Apple hardware still rocks. Can't deny that.
Ah yes, the Jony Ive era of "no ports on MacBooks except USB-C, and hope you like touchbars!" was fantastic. Not to mention how heavy the damn things are. Oh, and the sharp edges of the case where my palms rest. And the chiclet keyboards with .0001 mm of key travel. I'll take a carbon fiber ThinkPad with an OLED display any day of the week, thank you. MacBooks feel like user-hostile devices and are the epitome of form over function.
It was also a lot worse for me when plugged into outlets in an old house in Mexico, especially when my bare feet were touching the terracotta floor tiles; it's not really an issue in a recently re-wired house in California with a wood floor, using the same laptops, power strips, etc.
If you are having this issue and you currently plug a 2-pronged plug into a grounded outlet, try using Apple's 3-pronged plug instead, and I expect it would go away. If you don't have grounded outlets, then that's a bit more complicated to solve.
This comment explains it well: https://news.ycombinator.com/item?id=45686427
I've often wondered why I can tell by touch whether a device is charging or not from the slight "vibration" sensation I get when gently touching the case.
It's often noticeable if you have a point contact of metal against your skin; sharp edge / screw / speaker grill, etc. Once you have decent coupling between your body and the laptop, you won't feel the tingle / zap.
They're called Y-caps if you want to delve deeper into them and their use in power supplies.
[1]: https://www.apple.com/it/shop/product/mw2n3ci/a/prolunga-per...
[2]: https://www.apple.com/fr/shop/product/mw2n3z/a/câble-d’exten...
As another poster mentioned, it's particularly annoying because Apple does ship the UK adapter in 3-pin grounded form
What I do mind is that there's only 3 of them.
The problem with the 2 USB-C ports on modern PC laptops is that one of them pretty much has to be reserved for the charger, whereas the MBP has a MagSafe port that you can charge with instead. So it really only feels like you have one USB-C port and the other ports are just there as a consolation. That might work out to roughly equal, but I don't think it leaves the Mac off worse. I don't hate the dongles so much though.
It wouldn't have hurt to have some USB-A and HDMI on the MBP--the Minis can pull it off, so clearly the hardware is capable--but more (Thunderbolt) USB-C would still be the best option IMO. USB-A (definitely) and HDMI (probably) will eventually be relics someday, even if they are here for a little while longer.
Which is like, a great way to subsidize junk USB hubs...? But for sure they love following through with policies.
https://en.wikipedia.org/wiki/Apple_Attachment_Unit_Interfac...
ADB was definitely proprietary, but arguably it wasn't a data port, nobody used ADB to output data.
Apple SCSI ports used nonstandard Apple connectors.
Apple Ethernet port was just Ethernet, except Macs preferred AppleTalk for networking, which was a purported competitor to Ethernet.
Apple USB port was just USB, except they were among the first, so it was kind of ex-proprietary.
Apple FireWire was just IEEE 1394, except (combine Ethernet and USB).
Apple Thunderbolt was (combine all above).
Apple USB-C is (combine all above).
The bizarre part of the USB-C story is not Apple's involvement or early adoption of it, but rather that the mobile hardware side of Apple refused to support it. That they clung to the Lightning connector until the EU forced them to drop it, while their computer division had long since and enthusiastically adopted USB-C, is much more damning.
Your argument was they had a rule of "one standard and two proprietary ports" as a means to "allow data to be ingested easily and freely shared inside Apple ecosystem, but not exported back out to the outside world with same ease".
For serial they used mini-DIN to save space on the back of the machine instead of the random mix of DB-25 and DE-9 on the PC side. My family and everyone I knew used a dime-a-dozen cable to use a typical PC modem; data was shared freely. There was no "one standard" port at this time to get data "ingested" - serial went both ways.
Even on PCs, to do anything serial you needed hardware and driver support anyway, that was the blocker, not the shape of the port. If Apple adopted DB-26 for serial, how would that let data share more freely?
For SCSI, the DB-25 Apple used was not proprietary. And even in the System 6 days they had Apple File Exchange to access FAT-formatted disks to write data out for PC users.
For Ethernet, Apple started building in Ethernet as standard before PC makers. They sold a laptop with Ethernet built-in in 1994, this was unheard of on PC laptops.
As for AppleTalk, they pushed LocalTalk at a time before PCs had any built-in networking whatsoever, a PC network card cost a hundred bux and were only used by corporations whereas in the home if you had a Mac you could make a network with a printer cable between two machines, Apple got it for cheap by spending an extra 10 bucks on RS-422 for their serial ports, why wouldn't you advertise that?
If you're talking about AppleTalk the network protocol rather than LocalTalk the physical protocol, Apple bundled TCP/IP with MacOS before Windows did ("Trumpet WinSock" was third party software), back when Microsoft thought they could stop people from adopting the internet since "The Microsoft Network" was so going to be so much better.
Arguing that Apple's PowerPC machines adopting the Intel-defined USB, which had already been on PCs for years, was a means to keep people from moving data out of the "internet Mac" (which was advertised as letting you share information with the world with "there is no step 3") is just... it makes no sense.
iOS on the other hand... Completely different thing.
I'm sure it's handy for mobile devices where size and versatility trumps everything, but on laptop/desktop machines where longer-term usage is expected I would prefer something more reliable.
C is absolutely better than the micro/mini variants, but not the full-size ports.
I'm just salty because I will have to replace either the ports on my Macbook or my USB-C wireless headphone receiver (both are a pricey endeavor, not to mention the downtime of having the laptop shipped for repair) just for the same issue to most likely reoccur a year down the line since it wasn't a result of any kind of misuse (both devices are exclusively used in an office environment and otherwise in brand new condition).
On all but the top tier MBPs, USB C ports on Macs have different specs for data transfer (often the ones on the right of the machine will have half the transfer speed).
The Intel MBPs had more variance, but they only had the 2+2 configurations in 2016 and 2017. The 2018 and 2019 generations had all full-speed ports.
I call bullshit on this. For as long as I've been alive, I could always use the USB ports on my motherboard and PC case for charging and data transfer.
That same laptop, and a desktop PC I have, do not support USB-PD over USB-C, so only 5V/500mA trickle charging is supported. This isn't the charging direction I was thinking of originally, but since this seems to be the direction you're thinking of, it's worth mentioning.
Also, neither of these ports are Thunderbolt. I'm pretty sure they are USB 3.0 at least, which doesn't have terrible speed to be fair, but still is somewhat limiting at least as far as the laptop is concerned since it means there's no way to get PCIe speeds.
Granted, this is ~2019 era hardware, but nevertheless the USB-C ports are not nearly as useful as they could be.
I also elaborated that, even when we consider the other direction, i.e. host-to-peripheral charging, many USB-C ports on PCs only provide baseline USB power levels (aka "slow charging"). The implication (that I now make explicit) is that such poor charging performance would not justify removing all other ports from the computer. I didn't mention this originally, because I didn't think of it then, but now that you have brought it up, I would add it to my argument.
The crux of that argument was: USB-C as the only port type is acceptable as long as those USB-C ports are full featured. That means (again, to be explicit) that they support Thunderbolt, DisplayPort, and bidirectional USB-PD (aka "fast charging"), though obviously one of those directions is not applicable to hosts that have no battery (e.g. desktops).
In other words, if the USB-C ports are no better than USB-A ports, then they are not good enough to take the place of other, dedicated port types.
And even at their worst they were still much better than any Windows laptops, if only for the touchpad. I have yet to use a Windows laptop with a touchpad even close to the trackpads Apple had 15 years ago. And the build quality and styling are still unbeaten. Do Windows laptop makers still put shitty stickers all over them?
Case in point that people will never admit that Apple messes up, even if Apple themselves will.
I still have my 2014, along with a 2021 MBP for work, and still love them as machines for my usage profile - writing software/firmware, and occasional PCB design. The battery life is good, M-series performance is great, screen is decent-to-good, trackpad is still best in class, and macos is _okay_ in my book. The keyboard isn't amazing as I prefer mechanical for sure, but I still type faster on a macbook keyboard than anything else. That being said, I designed a mechanical keyboard that sits on top of the macbook keyboard so I can enjoy that better typing experience.
Pretty dang happy with my setup.
They really dodged a bullet there. 2016-2020 Apple laptop hardware definitely didn't rock. It's good they did an about-face on some of those bad ideas.
You can’t get more brain-dead than taking away important screen real estate and then making the argument that you get more real estate because it’s now all tucked into a corner.
God forbid there be a black strip on the sides of the screen. How did we ever live?!??
Also worth pointing out that this design allows a substantial footprint reduction, which for example puts the 13.6” Macbook Air within striking distance of traditional bezel 12” laptops in terms of physical size. Some people care about that.
And people's suggestion is to install third-party software or just deal with it. It doesn’t help that fanatics feel the need to tell you which parts of the screen are and aren’t important real estate. Like fuck me and my opinions, right?
“Well actually mathematically there’s more real estate in less convenient places so it’s fine.” Is so…depressing to watch people just give in to any little idea that comes out of this company’s PR department, like their logic is The Only truth.
Or if your work is forcing it on you, I'm sorry but that's not the fault of people who happen to enjoy it. Maybe ask if you can get a different machine?
FWIW, I think the Touchbar was close to being a good idea, it was just missing haptics.
It's so rewarding when its charger dies in a month, and you feel superior to your colleague, whose vintage six-month-old charging cable with none of that extraneous rubber next to the connector catches fire along with your office. What a time to be alive!
The best part is the motherboard produced in a way to fail due to moisture in a couple of years, with all the uncoated copper, with 0.1mm pitch debugging ports that short-circuit due to a single hair, and the whole Louis Rossmann's youtube worth of other hardware features meant to remind you to buy a new Apple laptop every couple of years. How would you otherwise be able to change the whole laptop without all the walls around repair manuals and parts? You just absolutely have to love the fact even transplanting chips from other laptops won't help due to all the overlapping hardware DRMs.
I'll go plug the cable into the bottom of my wireless Apple mouse, and remind myself of all the best times I had with Apple's hardware. It really rocks.
And of course, just had to bring up the whole mouse charger thing. Back when Apple updated their mouse once and replaced the AA compartment with a battery+port block in the same spot to reuse the old housing, and a decade later people still go on about the evil Apple designers personally spitting in your face for whatever reason sounds the most outrageous.
“Legendary attention to detail”
Indeed, it is pretty open-and-shut.
I'm surprised it came out during the Jobs era because he strongly believed in "form follows function".
[1] https://cdn.shopify.com/s/files/1/0572/5788/5832/files/magic...
The Jobs era of Apple had a ton of pretty but less functional designs. Jobs is quoted as saying that, but he was full of it. He didn't actually practice that philosophy at all.
I don't think anyone does anything "to be evil".
But clearly they had a choice between what was good for the user (being able to use the mouse while charging) and what suited their aesthetic, and they chose the latter. Open-and-shut case, indeed.
The charging port location is weird and stupid, but I have never needed to charge it while I am using it. When it hits about 15%, I plug it in at the EOD and don’t have to charge it again for maybe a month. I am a neat freak and you have to look hard to see any cable on my desk rig.
The multi touch stuff works fine for me, but perhaps I am just used to it.
The only complaint I have is the shape, it’s not “comfortable” to use. Easily addressed by a stupid 3D printed humpback add on that changes the aesthetic but makes it comfortable for me to use. I shouldn’t have to buy a mouse accessory…but I did.
Here is the thing though…it’s just a mouse. I point, I click, then I move my hand back to the keyboard. It’s fine. While I’m sure there is a better designed one out there, is any mouse truly revolutionary?
On the apple mouse side, I got a white corded mouse with the tiny eraser looking mousewheel back in around 2003 or so, it's still in use today with a M4 mac mini. Works like a champ, Keyboard from that era is also still in use and used daily in our family.
[1] https://techpp.com/2011/04/19/mobee-magic-charger-for-magic-...
I have doubts that it did, as that would warrant a safety recall.
Plugging into the wide EU outlet with the Apple-manufactured plug from the "World Travel Adapter Kit" can lead to an uncomfortable "vibration" that you feel when you touch the top case, depending on the hotel/airbnb. Whenever I visit, I expect I should charge while I'm not using the device.
I have my doubts that Apple would admit enough to perform a safety recall given the issues they've had with their garbage chargers in the past. Other companies have no problems with building hardware that lasts. Apple seem to prefer their customers pay for replacements.
Apple have a couple of extra mechanisms in place to remind us to buy a new device:
- On iOS the updates are so large they don't fit on the device. This is because they purposely put in a small hard drive. It serves a second purpose - people will buy Apple cloud storage because nothing fits locally.
- No longer providing updates to the device after just a few years when it's still perfectly fine. Then forcing the app developer ecosystem to target the newer iOS version and not support the older versions. But it's not planned obsolescence when it's Apple, because they're the good guys, right? They did that 1984 ad. Right guys?
This is a weird one to complain about, because Apple leads the industry in supporting older devices with software updates. iOS 26 supports devices back to 2019. And just last month they released a security update for the iPhone 6S, a model released a full decade ago.
The oldest Samsung flagship you can get Android 16 for is their 2023 model (Galaxy S23), and for Google the oldest is the 2021 model (Pixel 6).
"This is a weird one to complain about, look at Donnie, he cheated on his girlfriend 3 times last month!"
If Apple continues to supply updates for six-year-old phones, iPhone 17 prices range from $11/month (base model iPhone 17) to $28/month (iPhone 17 Pro Max w/2TB storage), meaning it's only about 20% more expensive to store data on a RAID 10 array of iPhone 17 Pro Maxes running current iOS versions than on standard-tier S3 (not a relevant comparison, obviously, but it amuses me).
So I don't know what's reasonable, but Apple's policies certainly appear to be.
I'm still salty that Apple no longer offer battery service on my OG Apple Watch, however, so reason has its limits.
| Model | Launch date | Obsoleted by | Price
|-----------|--------------------|--------------|------
| iPhone | June 29, 2007 | iOS 4 | $399 (*price cut)
| iPhone 4 | June 24, 2010 | iOS 8 | $599
| iPhone 6 | September 19, 2014 | iOS 13 | $649
| iPhone 11 | September 13, 2019 | - | $699
Adjusted for inflation, the total for these phones is $3,287 excluding carrier contracts. Assuming the iPhone 11 will be obsoleted by iOS 27 in September 2026, this costs you about $14.29/mo (rough math below).

However, I find the iPhone keyboard so bad and the settings concept so muddled that I'm going to return to Android when this experiment is over. Probably not for another 4 years though!
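For anyone checking that number, it's just the inflation-adjusted total divided by the months of ownership; the ~230-month span (June 2007 through an assumed September 2026 cutoff) is my reconstruction of the assumption, not something stated above:

    # total of the four phones, inflation-adjusted, per the figures above
    total=3287
    # months from June 2007 to an assumed September 2026 iOS 27 release: ~230
    months=230
    echo "scale=2; $total / $months" | bc   # => 14.29 (dollars per month)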
It is abysmal that Android phone makers still need to customize the OS so much for their hardware. Apple has no incentive for longer support cycles if Android does even worse on it.
Vertical integrations like everyone sell a product, a brand, a whole ecosystem experience.
If all OEMs sold the same CP/M, UNIX, MSX, MS-DOS, or Windows software stack, on what is basically the same hardware with a different name glued on the case, they wouldn't get any brand recognition, aka product differentiation.
Thus OEMs specific customisations get added, back in the day bundled software packages are part of the deal, nowadays preinstalled on the OS image, and so on.
So I bought a 4x M.2 PCIe card and four 2TB Samsung Pro SSDs for $1,100, and as a result got 6.5GBps versus the onboard 1TB's 5GBps.
Same with memory. 160GB (32 to 192GB) from Apple was also around $3K. OWC sold the exact same memory chips, manufacturer, spec, speed, for $1,000 for 192GB.
Can't relate to what you're saying, had 4 MacBooks, and many PCs too.
???? ctrl+a and ctrl+e? That works on most Linux setups, too. Only Microsoft screws that up. I love how in Mac Office apps, Microsoft also makes ctrl+a and ctrl+e do what they do in windows lol.
I also have all of the adapters that came with the MBPs too, all perfectly functioning, the oldest still attached and powering my 2013 model with the dead battery (2008 model was sold, still working). The magsafe cable is pretty yellow now, and maybe a little wonky from the constant travelling, but no fraying/fire hazard yet.
None of the above sounds like anybody's actual experience. Which is also why they have the biggest resale value retention among laptops, and the biggest reported user satisfaction.
Now, if you were about the lack of ports (at least for a period) or the crappy "butterfly" keyboard (for a period), you'd have an actual point.
Home/End is just Control-A/E.
Never seen "molten keyboard plastic". I'm sure you can find some person who has that somewhere on the internet. I doubt it's a problem beyond some 0.0001% rare battery failures or something like that.
"yellow spots burned into a display with its own heat exhaust". Not sure what this even means. Especially AS Macs don't even get hot. I haven't heard the fan ever, and I use a M1 MBP of 5+ years with vms and heavy audio/video apps.
"when its charger dies in a month" is just bs.
I had a GPU issue that was the subject of a recall and matched my symptoms precisely (I could make the MBP core dump on demand at the Genius Bar), but: "recall declined, does not fail diagnostics".
Damaged charging circuit on an MBA. Laptop worked perfectly. Battery health check fine. Just could not charge it. "That will be a $900 repair. Maybe we can look at getting you into a new Mac?" (for one brief moment I thought they were going to exchange mine... no, they wanted me to buy one. And of course, my MBA couldn't be traded in because it was damaged...).
I've also had multiple Magsafe connectors fray to the point of becoming like a paper lantern with all the bare wire visible, despite the cable being attached to a desk with cable connectors so there was near zero cable stress (and often only plugged/unplugged once a week).
The only PC laptops that were replaced were the ones that got damaged in accidents (car accidents, dropped off a balcony, used as a shield in self defense during a robbery, etc.). Dell Latitudes of that era were sturdy, and not noticeably heavier than their fragile Apple counterparts.
Annoying popups on MacOS look like the 1999 remake of the modal dialogs from the 1984 Mac, I guess with some concessions to liquid glass.
Funny that a lot of people seem to have different Liquid Glass experiences, are we being feature flagged? I don't see the massive disruption to icons that the author sees, but it does seem to me that certain icons have been drained of all their contrast and just look bleh now, particularly the Settings icon on my iPhone. I don't see a bold design based on transparency, I just see the edges of things look like they've been anti-antialiased now. It's like somebody just did some random vandalization of the UI without any rhyme or reason. It's not catastrophic but it's no improvement.
All this wank to waste the power of faster and faster chips.
So, they have to make the older phones feel bad to use to give a reason for upgrading because otherwise most won't care to do it.
It's not the first time Apple has done that and it has been their strategy for a while now. The "free" updates are the most bullshit marketing success I have ever seen.
I teach C++ programming classes as part of my job as a professor. I have a work-issued MacBook Pro, and I make heavy use of Terminal.app. One of the things that annoys me is always having to click on a dialog box whenever I recompile my code and use lldb for debugging. Why should I need to click on a dialog to grant permission to lldb to debug my own program?
It wasn't always like this on the Mac. I had a Core Duo MacBook that ran Tiger (later Leopard and Snow Leopard) that I completed my undergraduate computer science assignments on, including a Unix systems programming course where I wrote a small multi-threaded web server in C. Mac OS X used to respect the user and get out of the way. It was Windows that bothered me with nagging.
Sadly, over the years the Mac has become more annoying. Notarization, notifications to upgrade, the annoying dialog whenever I run a program under lldb....
Because apps and web browser tabs run as your user and otherwise they would be able to run lldb without authorization. So, this is the authorization.
Get the fuck out of my way and let me use what is supposedly my computer.
They become more shitware and Microsoft like with every update.
Years and years ago, I fought the good fight, full desktop Linux fulltime.
I see and hear from a lot of people it's pretty great these days though, and you can do whatever the new cool fork of WINE is or a VM for games / software compatibility.
Definitely not moving to 11. When 10 gets painful enough I'll probably try Devuan or Arch or something for desktop use.
I’ve just switched back to Linux, and I can confirm. It’s different, it’s very different. And it’s very good. I realised how much I’ve just been fighting Apple everyday in newer versions of Mac.
I was going to ask my boss for a new MacBook Pro soon but now I’m having second thoughts.
If you're looking for laptop recs, I've been happy with my Framework 16 except for one issue: when you close the lid, it can subtly flex in your backpack, press keys, and wake itself up. I work around it, but it's annoying.
But they recently announced a second build of the Framework 16 where one of the selling points is that the lid won't flex. Can't personally verify that they got it right, but given the build quality of the rest of the machine, I suspect they did.
While I appreciate the NeXTSTEP heritage of macOS, I'd rather have something that costs 1000 euros less with more RAM, SSD, and a GPU capable of doing CUDA and Vulkan without translation layers.
Now if my employer or a customer is willing to assign a MacBook Pro with the same RAM and storage for project activities, great, good on them, I would not say no.
i.e, blanket disabling of SIP will interfere with conveniences like Apple Pay, etc. People want those conveniences. Disabling SIP is a trade-off.
If you remove the 'quarantine' attribute that gets added to downloaded files it runs great.
Homebrew has an option which bypasses Gatekeeper when installing apps:
brew install --cask app --no-quarantine
And apparently you can have this on by default: export HOMEBREW_CASK_OPTS="--no-quarantine"
And I have this alias for everything else: alias unq="xattr -dr com.apple.quarantine"
Those ads ran from 2006 to 2009. That’s between 16 and 19 years ago. How young do you imagine the typical HN commenter is?
> There was one that was about all the annoying security pop-ups Windows (used to?) have.
Those have been relentlessly mocked on the Mac for years. I remember a point where several articles were written making that exact comparison. People have been calling it “macOS Vista” since before Apple Silicon was a thing.
> WIW, it starts here: https://youtu.be/qfv6Ah_MVJU?t=230
A bit better quality: https://www.youtube.com/watch?v=VuqZ8AqmLPY
Part of getting old is accepting that 20 years was a long time ago and not everyone is going to remember the same commercials we saw two decades ago, including people who were children during the ad campaign.
This is an example of what I'm talking about https://www.reddit.com/r/Slack/comments/1geva4f/how_do_i_sto...
From the UX perspective, yes, it is triggered from the app.
It's been a long time since I used the Core Foundation API but you trigger a request, and then get back a token from the OS that grants you permission to do stuff.
I don't know if this is current or not:
https://developer.apple.com/library/archive/documentation/Se...
> Original Macintosh Beta Tester and Mac 3rd Party Developer (‘83-’85)
An early Mac was not a great gaming computer even by 1980s standards, and the last thing you want is Commodore or Atari running an ad saying "Apple's $2500 black-and-white prestige piece doesn't play games as well as a $299 C64/800XL". Not to mention the stink of the US video game market crash hovering around anything game-related.
If they pivot directly towards more professional workstation/publication/art department positions, nobody's making that point. (Now I'm thinking of the time "Boot", or maybe it was already "Maximum PC" by then, reviewed a SGI O2 and said it was impressive, but had limited game selections.)
Younger readers will find out that the modern Apple attitude is quite similar to the early years of the Macintosh being sold to universities.
The "be nice and think different" phase only lasted during the time they were about to close their doors.
I've been very patient with iOS 26. I tell myself - so long as its foundation is right, they'll iron out these details. But it is properly bad often and at times extremely frustrating.
This makes me extra sad. The HW is very good and very expensive, but the SW is mediocre. I bought an iPhone 16 a few months ago and I swear that is the first and last iPhone I'd purchase. I'd happily sell it at half the price if someone local wants it.
Edit: Since people asked for points, here is a list of things that I believe iOS does not do well:
- In Android Chrome, I can set the YouTube website to desktop mode and loop a video. I can also turn off the screen without interrupting playback. I can't do this in Safari, however hard I tried.
- In Safari, I need to long-press a button to bring up the list of closed tabs. How can anyone figure it out without asking around online?
- In the Stocks app, a few news pieces are automatically brought up and occupy the lower half of the screen when it starts up. This is very annoying as I don't need it and cannot turn it off.
- (This is really the level of ridiculous) In Clock, I fucking cannot set a one-time alarm for a future date (Repeat = Never means just today), so I had to stupidly set up weekly alerts and change them whenever I need a new one-time alarm. I hope I'm just too stupid to find the magic option.
- Again, in Clock, if I want to set up an alarm for sleep, I have to turn on... sleep. This is OK-ish as I can just set up a weekly alarm and click every weekday.
So far, I think Mail and Maps are user friendly. Maps actually shows more stuff than Google Maps, which is especially useful for pedestrians. Weather is also good and I have little to complain about.
I think you're supposed to use the calendar for that.
On Android, why would you not use the calendar when you want to be alerted days from now?
What kind of event are you creating an alarm for that's more than a day away? On Friday, do you create your alarm to wake up on Monday? Does Android have a calendar-like view of your upcoming alarms?
(Sorry for the barrage of questions, but this is interesting to me.)
Any color as long as it's black, eh?
More importantly, alarms don't get silenced by my nightly do not disturb schedule.
If I have to make it a calendar entry, I may not notice it in time.
I dislike the new Safari layout in iOS 26 too. https://support.apple.com/en-nz/guide/iphone/ipha9ffea1a3/io... -- change it from "compact" to "bottom". I assume this choice will disappear in the future, but for now, you can make it more familiar.
Unfortunately, I don't have any advice for the Clock/Alarm; I don't typically schedule one-off future alarms. That would be a useful feature.
But it IS Apple's choice. The problem is they have a mixed up conflict of interest, and it's even worse when Apple themselves is trying to sell you their own services.
IMHO the company making the hardware, the company making the software, and the company selling the cloud services shouldn't be allowed to all be the same company. There's too much conflict of interest.
Google sells PiP, background playing etc. as part of YouTube Premium (not Apple!). Google serves browser clients a player that can't do those things, because they want you to pay for them. Vinegar is a browser extension that replaces Google's player with Apple's plain HTML5 player. Apple's plain HTML5 player does all that stuff for free.
Visual customizations get upstreamed into the system accessibility settings. Extra functions are exposed exclusively in Shortcuts for you to hack together an automation feature yourself. For a fully featured app, Apple would probably say go pay for an app on the App Store (and pay them through the fee).
For example, with your future alarm... you could get another app, or you could create a 'Time of Day' Shortcut automation which checks every day to see if the date is the date you want the alarm on; if it is, create the alarm, and delete it the next day. A (not so) fun fact about automating Alarms before iOS 17: you could only delete alarms through Siri and not Shortcuts lol...
I never have any confidence it will notify me of things. Often I just miss stuff. I actually have no idea how all the different "Focus" modes work. Notifications pile up and then seem to disappear without any action, and then reappear a week later.
The keyboard is really awful. I recently pulled an ancient Galaxy phone out of a drawer to test something and was reminded of just how much better the Android keyboard is. It just always guesses the wrong things!
And the settings are equally jumbled. Sometimes they're in the Settings app, sometimes they're in the app I'm using. It's just confusing.
It was honestly eye opening because I'd spent so many years assuming iPhones were better. I moved across because my Android phones kept having hardware issues. I think I'll probably go back to Android after this experiment.
Why am I unable to use Apple Music on my device while I can use it from a web browser or from an Android phone that doesn't have encrypted iCloud backups enabled?
Unfortunately this also prevents me from jailbreaking the device because I have to sign into an Apple account in order to trust a developer certificate on the device required for the jailbreaking tool. It's my device! Let me approve of the certificate without an Apple ID!
I still use Mac for dev, but only because I don't really feel like messing around with Linux on a work computer.
setting ENABLED=0 in /etc/default/motd-news
apt remove ubuntu-advantage-tools
https://canonical.com/legal/motd
https://support.tools/remove-ubuntu-pro-advertisement-apt-up...
They have shiny cases, yay. I'll take my ugly Thinkpad and actually get shit done over a shiny case and glossy screen.
- good thermals (especially vs. Thinkpad P series), even supporting fanless operation on the MacBook Air
- excellent microphone and speaker array (makes people much more intelligible on both sides during Zoom calls)
- excellent multitouch trackpad with good palm rejection (though for a trackpoint device Thinkpad is your best bet)
- unified GPU and CPU memory with up to 135 GB/s bandwidth (downside: DRAM is not upgradable)
- host-managed flash storage (downside: SSD is not upgradable)
And of course the 10-20 hour battery life is hard to beat. Only downside is I'll forget to plug in at all.
Historically, Apple has innovated quite a bit in the laptop space, including: moving the keyboard back for the modern palm rest design (PowerBook, 1991); active-matrix color display (PowerBook 180c, 1993); integrated wi-fi and handle antenna (iBook, 1999); Unix-based OS that could still run MS Office and Photoshop (Mac OS X, 2000 onward); full-featured thin metal laptop with gigabit ethernet (PowerBook G4 Titanium, 2001); pre-ultrabook thin laptop that fits in a manila envelope (MacBook Air, 2008); high-resolution display and all-flash storage in an ever-thinner design ("Retina" MacBook Pro, 2012); going all-in on USB-C/Thunderbolt and 5K external "retina" display (MacBook Pro, 2016); unprecedented performance, and a tandem OLED display with <10ms touch-to-pixel latency, in an absurdly thin iPad, which can also be used as a "laptop" (iPad M4 + magic keyboard, 2024); etc. Some of the innovations also failed, such as the touchbar, dual-controller trackpad, and "butterfly" keyboard which plagued the 2016 models.
Now they have plenty of money, the attitude during early Apple years is back.
So it just lives there permanently on the desktop and I avoid using the thing as much as possible. I do all of my work functions through SSH and leave the Mac in a corner of my desk with the screen closed.
I swear, MacOS actively tries to be as annoying and intrusive as possible. Every time I touch the damn thing some new behavior reveals itself. Like there are two sets of hotkeys for copy/paste and which one you need to use appears to be entirely random per-window.
Thankfully work lets me use linux on my main machine and I almost never have to deal with goddamn MacOS. I would rather daily drive Windows 11 with copilot and cortana and piles of bossware than plain MacOS.
lol
Apple is burning their remaining goodwill among longtime customers, myself included. It's sad to see. Next WWDC, they need to be incredibly transparent about how they plan to fix these issues and get their house in order. If they aren't capable of accepting feedback after this public excoriation, I don't have high hopes for their future.
I’m switching to android because why not? I mean, I have to install Google maps anyway because Apple Maps is horrible. But the UI on 26 is way worse than a pixel experience in my opinion. Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
I have been working on Apple since 1996, and started off as a computer support person. Now it pains me to help people with their computers because everything is siloed and proprietary and just makes no sense.
And I mean, I’m also annoyed that their voice to text is just horrible. I can’t even tell you how many mistakes I’ve had to correct in this comment alone.
On the iPhone swipe keyboard, something that feels like a random generator replaces not only the word you swiped but also the word before it, and in two-thirds of cases with random nonsense pairs.
And you can't turn it off without turning off the similar word proposals you definitely want.
It's a strange design decision and I think the implementation is not up to the task.
I'm not staying cause I like it, but because I dislike the other options more.
The one reason to use Android is so that you can actually switch out the awful stuff that ships with your device. Leaving Apple to join the "Google Ecosystem" seems absolutely insane. Google is so terrible at software, so terrible at UI and so terrible at having products.
I get that visual design is a complete preference, but the great thing about Android, to me at least, is that you can get away from Google totally goofy design and make your own choices.
>Plus, I could just do so much more with the pixel phone but then again I’m sort of a power user.
Google is starting to make that less and less feasible though, with its move toward restricting app installations.
There's a number of other map apps around as well (Mapfactor, MapQuest, maps.me), including some for specific purposes (transit, Waze, Polaris, backcountry, sygic)
Also in my experience, and maybe this is region dependent, OSM has very detailed location information.
I've only recently (3 years ago) bought my first MacBook, and it's because everything else is also getting worse.
On the bright side, Apple Silicon is amazing, and it seems like Apple decided in 2021 to make the MBP good again like it was in 2015.
<edit> spelling, since iOS 18 isn't as forgiving as iOS 6
When they release a new feature it needs to be everywhere. That happens every September. The cadence has not changed, but the scope has multiplied since the days when Apple was just making macOS.
You can 10X your staff, but the coordination under 10X velocity will suffer.
I'm not trying to excuse Apple, but this article attempts to paint the impression that every issue is connected in some kind of serial incompetence, but that simply isn't the case.
I thought Apple was all about privacy. But their software needs location access to function properly?
It remains private because this runs locally. It's not sent up to the cloud.
iOS and Mac used to do a good job with things like animations, now they are horrible. Pre-beta quality.
And dark mode and accessibility settings need to just work. That is a core part of the job of every front end iOS developer, including the ones at Apple.
It absolutely is serial incompetence and the Apple engineering leadership that signed off on it should be ashamed.
The one thing that really changed is that every single company looked at Apple and saw something worth copying. Now there are dozens of phone makers, all seeking to emulate Apple's success, putting effort into UI, polish, and design. This wasn't the case a decade ago. Just compare the bizarre circus of design choices in Android Lollipop (either stock or with manufacturer/user-added layers on top) to iOS 7.
Now Apple is no longer particularly unique, in many regards. And I believe that they were absolutely aware of that and desired to continue being a defining force, instead of being "one of many". It's not that Apple has changed, it is that it hasn't and now desires to force through change.
IMHO, people are thinking about how well thought-out and usable the products and software tends to be - Yeah, Apple makes it so anyone can use it - But their software has always been buggy.
In my mind it is synonymous with style over substance. Bad software packaged in a user hostile interface, sitting atop shitty hardware that looks sleek and fashionable.
It doesn't matter anyway. It's fashionable enough that it will keep selling.
But nonetheless, there’s so many more bugs and visual glitches. Battery life is still unstable and feels markedly worse than before. Safari looks cool, but UI buttons being on top of content is foolish for the reasons highlighted in this article. Overall, it’s just much more visually inconsistent than before. And the glass effect on app icons looks blurry until you get 5cm away from the screen and really pay attention to the icons. I definitely won’t be upgrading my Mac any time soon.
I just wish we would get away from this annual upgrade cycle and just polish the OS for a while. We don’t need 1 trillion “features”, especially when they increase the complexity of the user experience. macOS in general did this very well; ever since I switched I’ve been very impressed at how much you can accomplish with the default apps in macOS, all while looking cleaner and leaner than Windows software. No new feature is even close to that balance of power and UI simplicity anymore.
I don't use a cellphone anymore, but back when I did (circa iOS 12) it was not possible to have the text large enough for me to read while still displaying all ten digits of a phone number.
Example: `212-555-1234` would display as `212-555-12...` in my call list.
----
I have owned and primarily used Apple products since 1992; their accessibility options seem to be going backwards. An iPhone screen would still have more than enough space to display information if the UI weren't completely crammed full of junk.
It is so frustrating when I help older clients and (unbeknownst to them) some stupid notification re-focuses attention. Often they'll keep [-s-l-o-w-l-y-] typing and then get frustrated that "the computer did something I didn't tell it to, without even warning me it wasn't listening to my keystrokes anymore..." [e.g.] — this used to be eye-rolling user error, now it's just expected operating system behavior.
----
Pro-tip: You can essentially disable Apple notifications by setting `Do Not Disturb` from 3:01am - 3:00am
Launchpad didn't have this problem. Any text you type while the view is rendering goes into the search bar.
While the OP seems very unhappy and should just switch platforms considering it's a "shitty $1000 phone" to him, I'm just mildly annoyed by these UX regressions to what was otherwise a very good platform.
Fucking inexcusable that MacOS metal support for external monitors has been finicky and unstable since the very beginning, and they never resolved that (but at least external monitors were DETECTED, then somewhere in Sequoia things went completely south)-- and now it just seems to be completely broken. There are countless Reddit threads. Why can't the Apple engineering braintrust figure this out??
2) There is still no solution for this annoying-as-hell UI problem that I documented years ago on Medium: https://medium.com/@pmarreck/the-most-annoying-ui-problem-r3...
3) I had to buy Superwhisper (which is a nice product, but works a little janky due to how iOS handles keyboard extensions) because Siri's voice dictation is so abysmally worse than literally every other option right now, and has been for years. WTF, Apple?
Hey Tim, I love the Vision Pro too (I own one) but maybe get your head out of that for a bit and polish up the engineering on the rest of your lines!
It's literally a paid wrapper around a completely free program you would also be using for free if Apple wasn't actively hostile to Open Source software distribution.
I am willing to pay for things that work well (and don't forget support afterwards) because I know from experience that simply having an "open-source core" just gets you halfway there, if that; the last 5% of polish to a finished, reliable, supportable product is at least 50% more work.
And, to put it bluntly, since we don't live in a communist society, eventually SOMEone has to put food on the table from their efforts. I'm 100% fine with paying for that.
Lastly, I have a kid, which means that I have 0% time to spend on troubleshooting non-working open-source stuff (which I actually enjoyed doing before the kid); I am ALL about reliability now, which is why I paid for Eero home mesh networking (despite cheaper or free options), a NAS from iXSystems (despite me being quite capable of installing TrueNAS on my own hardware), pay for Apple products (despite loving my NixOS Framework laptop as well), and don't upgrade my 10 year old Tesla (because it still runs like a top, and that's my highest priority right now- reliability)
Once you get a kid, trust me, your views are gonna change, because you clearly do NOT have one.
This is what happens to your tinkering-around-with-open-source time after it:
tinker time before kid: [--------------------]
tinker time after kid: [-]
Now if you're lucky, you will end up in a position where you get paid to tinker around, and where they won't get upset if you use some work time to work on "aligned" tech.
You refuse to vote with your dollar. You buy Apple's failed products like Vision Pro, and reinforce their insular self-destructive business model. You tacitly defend their right to commercialize free software and victimize yourself when alternatives are identified. There is zero feedback loop between you and Apple, just a wallet that keeps vomiting cash. The only outlet of communication that you can use to tell them to focus on the Mac or iPhone (eg. buying more Macs and iPhones) is being eschewed to support "the ecosystem" and replacement apps.
Therefore, you reserve no right to tell Tim Cook to stop focusing on the Vision Pro. You own one. You are one of the (sole) customers begging Tim to divert his attention away from the Mac, so he can focus on his visionary iPhone replacement. You wanted it bad enough to spend $3,500 that could have gone towards a college fund, three iPhones, six Macs, or an entire 16th birthday "come to the garage" surprise.
"WTF Apple" my ass, either take your lumps or buy a new product. We only hear this sort of moral floundering when monopolies form and people have to make themselves look helpless rather than an accomplice. Clearly Apple has no incentive to compete, but you also lack the balls to indict them despite rearing an entire family. Instead of commiserating with your faux helplessness, I will tell you to participate in a market economy to enforce the product of free market competition. You cannot make demands from a position of captivity.
Something I find worse: being unable to click a target while the animation is running! Because the target only gets focus after the animation is done, you end up spending your time waiting for the animations.
But only until the one day you really need Bluetooth to work and it just doesn't. We, the many Bluetooth sufferers, will happily accept you into our ranks when (not if) that day comes. :)
or do you mean Cumulative Layout Shift, Google's term for it which I linked from there? https://web.dev/articles/cls
Haha so it's not just my wife's iPhone 15 Pro Max that keeps stuttering in any CarPlay
The only thing that's changed was upgrading to the new iOS.
There’s no way I’m (ever) upgrading to Tahoe, I’m just going to hold out as long as possible and hope Omarchy gets as stable and feature rich as possible in the time being.
No idea what to do about the mobile situation - I can’t see myself realistically ever using android. Switching off of iCloud and Apple Music would also be pretty tough, although I’ve seen some private clouds lately that were compelling.
I just wish there was a more Linux-minded less-Google oriented mobile operating system
Since there are a lot of die hard Apple fans and engineers on hacker news this is going to get downvoted to hell, but I’m going to say it again.
It looks like Apple doesn’t care about user experience anymore, and the 26 series updates all look like they’ve been developed by amateurs online, not tested at all, and Apple engineers just took long vacations while they’re on the clock. It’s a complete and utter disaster of an operating system.
Isn't Omarchy just config files for a bunch of existing, stable programs? Why wait?
It reminds me of stories I've heard about the Cold War and how Soviet scientists and engineers had very little exchange or trade with the West, but made wristwatches and cameras and manned rockets, almost in a parallel universe. These things coexisted in time with the Western stuff, but little to nothing in the supply chain was shared; these artifacts were essentially from a separate world.
That's how it felt as a Mac user in the 80s and 90s. In the early days you couldn't swap a mouse between a Mac and an IBM PC, much less a hard drive or printer. And most software was written pretty much from the ground up for a single platform as well.
And I remember often thinking how much that sucked. My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
Now so much has been standardized - everything is USB or Wifi or Bluetooth or HTML or REST. Chrom(ium|e) or Firefox render pages the same on Mac or Windows or Linux. Connect any keyboard or webcam or whatever via USB. Share files between platforms with no issues. Electron apps run anywhere.
These days it feels like Mac developers (even inside of Apple) are no longer a continent away from other developers. Coding skills are probably more transferable these days, so there's probably more turnover in the Apple development ranks. There's certainly more influence from web design and mobile design rather than a small number of very opinionated people saying "this is how a Macintosh application should work".
And I guess that's ok. As a positive I don't have the cross-platform woes anymore. And perhaps the price to be paid is that the Mac platform is less cohesive and more cosmopolitan (in the sense that it draws influence, sometimes messily, from all over).
Yes, as a long-time Mac user who now uses PCs at home but still uses a work-issued MacBook Pro, I greatly appreciate how Macs since the late 1990s-early 2000s are compatible with the PC ecosystem when it comes to peripherals, networking, and file systems.
However, what has been lost is "The Macintosh Way"; a distinctly Macintosh approach to computing. There's something about using the classic Mac OS or Jobs-era Mac OS X: it's well-designed across the entire ecosystem. I wish Apple stayed the course with defending "The Macintosh Way"; I am not a fan of the Web and mobile influences that have crept into macOS, and I am also not a fan of the nagging that later versions of macOS have in the name of "security" and promoting Apple products.
What the Mac has going for it today is mind-blowing ARM chips that are very fast and energy efficient. My work-issued MacBook Pro has absolutely amazing battery life, whereas my personal Framework 13's battery life is abysmal by comparison.
What's going to happen, though, if it's possible to buy a PC that's just as good as an ARM Mac in terms of both performance and battery life?
As someone who has never really enjoyed using macs, I do agree with this. It's probably why I don't mind them as much these days - Using MacOS in 2025 just kind of feels like a more annoying version of a Linux DE with less intent behind it. The way macs used to work did not jive with me well, but everything felt like it was built carefully to make sense to someone.
Their advantage against Microsoft is that the Mac UX may be degrading, but the Windows UX is degrading much more quickly. Sure modern Mac OS is worse to use than either Snow Leopard or Windows 7, but at least you don't get the "sorry, all your programs are closed and your battery's at 10% because we rebooted your computer in the middle of the night to install ads for Draft Kings in the start menu" experience of modern Windows.
Their advantage against Linux is that while there are Linux-friendly OEMs, you can't just walk into a store and buy a Linux computer. The vast majority of PCs ship with Windows, and most users will stick with what comes with the computer. It definitely is possible to buy a computer preloaded with Linux, but you have to already know you want Linux and be willing to special order it online instead of buying from a store.
Apple has a deal with Walmart to sell the M1 Macbook Air for $600, so that's their current low-cost option. For the future, data-miners have found evidence that Apple will be making a new low-cost Macbook with the A18 Pro (chip from the iPhone 16 Pro), set to launch in 2026. https://www.macrumors.com/2025/06/30/new-macbook-with-a18-ch...
Don't compare prices from wealthy countries like US with the rest of the world.
How many M1 do you think Apple is selling in African countries?
As an example.
They also had an extensive industrial espionage program. In particular, most of the integrated circuits made in the Soviet Union were not original designs. They were verbatim copies of Western op-amps, logic gates, and CPUs. They had pin- and instruction-compatible knock-offs of 8086, Z80, etc. Rest assured, that wasn't because they loved the instruction set and recreated it from scratch.
Soviet scientists were on the forefront of certain disciplines, but tales of technological ingenuity are mostly just an attempt to invent some romantic lore around stolen designs.
"That's not my department!" says Wernher von Braun.
DEC etched a great Easter egg on to the die of the MicroVAX CPU because of this: "VAX - when you care enough to steal the very best".
This is a biased take. One can make a similar and likely more factual claim about the US, where largely every innovation in many different disciplines is dictated by and targeted for use by the war industry.
And while there were many low quality knockoff electronics, pre-collapse USSR achieved remarkable feats in many different disciplines the US was falling behind at.
https://en.wikipedia.org/wiki/Timeline_of_Russian_innovation...
That's a complete non-sequitur.
As opposed to the USSR, whose Wikipedia page for innovations proudly features, let's see:
Aerial Refueling
Military robot
Paratrooping
Flame tank
Self-propelled multiple rocket launcher
Thermonuclear fusion (bomb)
AK-47
ICBMs
Tsar Bomba
to name a very small selection
It's almost as if you have it completely backwards and it was the USSR who was centrally planning to innovate in the art of killing.
Anyway, just glancing at the respective page for US "innovations", one can easily tell which country had the most obsessive offensive war industry.
I think they were in their own little world, and when they got past that with unix-based OSX and moved from powerpc to intel, they entered the best time.
The PC-based Macs were very interoperable and could dual-boot Windows. They had PCIe and could work with PC graphics cards; they used USB, Bluetooth and more. Macs interoperated and cooperated with the rest of the computing world. The OS worked well enough that other Unix programs, with a little tweaking, could be compiled and run on Macs. Engineers, tech types and scientists would buy and use Mac laptops.
But around the time steve jobs passed away they've lost a lot of that. They grabbed control of the ecosystem and didn't interoperate anymore. The arm chips are impressive but apple is not interoperating any more. They have pcie slots in the mac pro, but they aren't good for much except maybe nvme storage. without strong leadership at the top, they are more of a faceless turn-the-crank iterator.
(not that I like what microsoft has morphed into either)
Right now, the quality and attention to detail have plummeted. There is also a lot of iOS-ification going on. I wish they focused less on adding random features, and more on correctness, efficiency, and user experience. The attention to detail of UI elements in e.g. Snow Leopard, with a touch of skeuomorphism and reminiscent of classic Mac OS, is long gone.
Not that it needs to, as it isn't bleeding money like in the A/UX, Copland and Taligent/OpenDoc days; however, they risk becoming only the iDevices company.
Yeah, Microsoft apparently is also back to their former self.
And then OS X came along, with bash and Unix and all, and there was a lot of shared developer knowledge.
But they still managed to keep a very distinctive and excellent OS, for 20 years after that.
The quality has dropped only recently.
This standard function doesn't exist on iOS but has been replaced with AirDrop. It's a big fuck you from Apple to everyone who prefers open standards.
Now my go-to is Dropbox/cloud/Sharik for small files and rsync for bulk backups.
https://github.com/schlagmichdoch/PairDrop/blob/master/docs/...
This isn't true - my shining moment as a 10 year old kid (~1998) was when the HD on our Macintosh went out and we went down to compusa and I picked a random IDE drive instead of the Mac branded drives (because it was much cheaper) and it just worked after reinstalling macos.
The true revelation was the B&W G3s. Those machines came from another universe.
I got a B&W G3 years later, when they became reasonably cheap second-hand, as revenge for the envy back when they were top of the line.
It's certainly better than it was, that said Apple really try to isolate themselves by intentionally nerfing/restricting MacOS software to Apple APIs and not playing ball with standards.
> My sister had that cool game that ran on her DOS machine at college, or heck, she just had a file on a floppy disk but I couldn't read it on my Mac.
My MacBook Pro has an integrated GPU that supposedly rivals that of desktop GPUs. However, I have to use a second computer to play games on... which really sucks when travelling.
Apple doesn't even have passthrough e-GPU support in virtual machines (or otherwise), so I can't even run a Linux/Windows VM and attach a portable e-gpu to game with.
The M5 was released and has a 25% faster GPU than M4. Great, that has no effect on reading HN or watching YouTube videos and VSCode doesn't use the GPU so... good for you Apple, I'll stick to my M1 + second PC set up
Counter example: Blender
It used to have an extremely idiosyncratic UI. I will only say: right-click select.
A big part of the UI redesign was making it behave more like other 3d applications. And it succeeded in doing so in a way that older users actually liked and that made it more productive and coherent to use.
What I am saying is, those are different dimensions. You can have a more cohesive UI while adhering more to standards.
There are still a lot of weird sacred cows that Macs would do very well to slaughter, like the inverted mouse wheel thing or refusing to implement proper alt-tab behavior.
You can have both, follow established standards and norms and be more cohesive.
The problem is simply that the quality isn't what it used to be on the software side. Which is following industry trends but still.
I mounted a 20MB external Apple hard drive:
https://retrorepairsandrefurbs.com/2023/01/25/1988-apple-20s...
... on my MSDOS system, in 1994, by attaching it to my sound card.
The Pro Audio Spectrum 16, weirdly, had a SCSI connector on it.
I don’t think there is any going back for Apple; the company is already too enshittified to get back to being a company with a vision. They got drowned by AI, and the releases and features are subpar compared to the competition. I do care about detail when I’m buying premium products, and Apple just doesn’t cut it any more.
Apple built a phone that would bend in pockets because they used flimsy aluminum without enough internal structure, something they should have had ample experience to avoid from the exact same thing happening to tons of iPods.
Apple insisted on developing a moronic keyboard implementation to save less than a mm of "thickness" that was prone to stupid failure modes and the only possible repair was to replace the entire top half of the laptop. They also refused to acknowledge this design failure for years.
Apple built a cell phone that would disrupt normal antenna function when you hold it like a cell phone.
Apple has multiple generations of laptops that couldn't manage their heat to the point that buying the more expensive CPU option would decrease your performance.
Adding to the above, Apple has a long long history of this, from various generations of macbook that would cook themselves from GPU heat that they again, refused to acknowledge, all the way to the Apple 3 computer which had no heat management at all.
Apple outright lies in marketing graphics about M series chip performance, which is just childish when those chips are genuinely performant and unmatchable (especially at release) in terms of performance per watt; they just aren't the fastest possible chips on the market for general computing.
Apple makes repair impossible. Even their own stores can only "repair" by replacing most of the machine.
Apple spent a significant amount of time grounding their laptops through the user, despite a grounding lug existing on the charging brick. This is just weird.
Apple WiFi for a while was weirdly incompatible, and my previous 2015 MacBook would inexplicably not connect to the same wireless router that any other product could connect to, or would fail to maintain its connection. I had to build a stupid little script to run occasionally to refresh DHCP.
Apple had a constant issue with their sound software that inexplicably adds pops to your sound output at high CPU load or other stupid reasons, that they basically don't acknowledge and therefore do not provide troubleshooting or remedies.
Apple was so obsessed with "thinness" that they built smartphones with so poorly specced batteries that after a couple years of normal use, those batteries, despite reporting acceptable capacity, could not keep up with current demands and the phones would be unusable. Apple's response to this was not to let people know what was going on and direct them to a cheap battery replacement, but to silently update software to bottleneck the CPU so hard that it could not draw too much current to hurt the battery. The underpowered batteries were a design flaw.
Apple software quality is abysmal. From things like "just hit enter a bunch to log in as root" to "we put a web request to our servers in the hot path of launching an app so bad internet slows your entire machine down"
Apple prevents you from using your "Pro" iPad that costs like a thousand bucks and includes their premier chip for anything other than app store garbage and some specialty versions of productivity apps.
Apple has plenty of failures, bungles, poor choices, missteps, etc. Apple has plenty of history building trash and bad products.
The only "detail" apple paid "attention" to was that if you set yourself up as a lifestyle brand, there's an entire segment of the market that will just pretend you are magically superior and never fail and downplay objective history and defend a 50% profit premium on commodity hardware and just keep buying no matter what.
> Jobs insisted on the idea of having no fan or air vents, in order to make the computer run quietly. ...
> Many Apple IIIs were thought to have failed due to their inability to properly dissipate heat. inCider stated in 1986 that "Heat has always been a formidable enemy of the Apple ///",[15] and some users reported that their Apple IIIs became so hot that the chips started dislodging from the board, causing the screen to display garbled data or their disk to come out of the slot "melted".
Culture flows top-down. Cook is about growth, progressively flowing toward growth at any cost. It’s not a mystery why things are as they are at Apple.
https://www.joelonsoftware.com/2006/06/16/my-first-billg-rev...
Which, incidentally, is a great primer for younger developers on both what obsessive software quality looks like and why datetimes are a hard problem.
2024-01-30 -> 2024-01-29
2025-01-30 -> 2025-01-28
Fun. There is so much ambiguity in what “one month” means. Polars has an offset_by function which is a bit more explicit about how you want to handle date math (“calendar date” vs number of days); see the sketch below.
Edit: just ran polars and I’m not in love with its answer either.
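For anyone curious, here's a minimal Polars sketch of that ambiguity (assuming the arrows above show a +1 month / -1 month round trip, and that Polars' month offsets saturate at month end):

    from datetime import date
    import polars as pl

    s = pl.Series("d", [date(2024, 1, 30), date(2025, 1, 30)])
    # "+1mo" clamps Jan 30 to the last day of February, and "-1mo" can't undo that
    round_trip = s.dt.offset_by("1mo").dt.offset_by("-1mo")
    print(round_trip.to_list())  # expected: [2024-01-29, 2025-01-28], not the originals

Whether that clamping is the "right" answer is exactly the kind of thing offset_by makes you decide explicitly.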
Managers who can't do anything but delegate are fast and efficient for themselves, just not for the company.
I'd argue that Apple is successful because of Jobs' attention to detail. Tim Cook isn't the one who built Apple's foundation; he's merely trying to expand it, but it's not the same difficulty.
That was when the design team began what I call the "one-off UI design" rather than use the same language across all apps.
Never mind the round mouse before that and the blind USB ports on the back of the newer iMacs (hate that scritchity sound of stainless steel on anodized aluminum as I try to fumble around with the USB connector trying to find the opening).
I mean fuck, even their failures don't seem to matter much (Vision Pro, Siri) to their stock price.
We'll get a foldable phone, and some new emoticons. Some font tweaks..
They think we're going to love it.
In the 90s Apple was in worse shape. They couldn’t even compete with Windows 9x for stability. There were memes about how MacOS needed just as many reformats as Windows 98.
The problem isn’t Apple’s attention to detail, it’s that people hold Apple to a higher standard. But in reality they’re just as fallible as every other software company.
This means that the author never considered checking how it looks on any non-Apple OS. Meanwhile Apple has a setting, enabled by default, that artificially makes a pseudo-bold font out of a normal font: https://news.ycombinator.com/item?id=23553486
And no, you don't know better than me about this cool feature.
They know. When a designer makes one of those prompts with only a "not now", they tend to mean a very specific thing, that is at the same time a subtle message to the user and a passive-aggressive stab at the company they work for.
What they mean: "the code path that handles what happens when you say 'no' here has been deprecated, because support for this feature was not part of the planning of a recent major-version rewrite of this app's core logic. When that rewrite is complete/passes through this part of the code, the option to say no will go away, because the code for that decision to call will be gone. So, in a literal sense, we think it's helpful to keep bugging you to switch, so that you can get used to the change in your own time before we're forced to spring it on you. But in a connotational sense, we also think it's helpful to keep bugging you to switch, as a way of protesting the dropping of this feature, because every time users see this kind of prompt, they make noise about it — and maybe this time that'll be enough to get management's attention and get the feature included in the rewrite. Make your angry comments now, before it's too late!"
Is the deprecated path that they just don't work without enabling those settings?
And I'm sure every UX person is different. The ones who know it's a dark pattern must have some pretty strong cognitive dissonance with a company that historically sold itself as 'the good guys'.
But there's a whole generation that probably grew up with software that fundamentally doesn't respect you, and this is just normal.
If I say no, that's an optional feature that I didn't ask for, and no longer need to think about it.
If I say 'not yet', it's one more thing I know I'm going to be reminded about, one piece of software that has no respect for me or consent.
While I can see the frustration by repeatedly using such software, it seems fair that a well-intentioned UX designer would use "not yet" if they found it works better for a majority of people. What then matters is that they respect the intent of everyone by not bringing the choice up again.
To your question, I'd say it's a false dichotomy.
The repeating behaviour and the 'not now' are two sides of the same coin. Per the original post, and my experience, those 'not now' buttons keep coming back.
And, for the sake of argument, how should respectful software behave? If I clicked 'not now', are you _ever_ going to ask me again? If so, how do you decide on the time-period? If not, why make 'no' more complicated than 'yes'?
So you're using "not now" as a shortcut to identify malicious design. That's fine but personally I'm not. I would go as far as claiming as most people don't feel this way about such a tiny wording detail, but I obviously haven't ran any A/B testing to confirm either interpretation.
> And, for the sake of argument, how should respectful software behave? If I clicked 'not now', are you _ever_ going to ask me again? If so, how do you decide on the time-period?
Honestly this question seems bizarrely focused on making the software give you a definite promise on what it will or will not do. It's a valid design question how to respectfully prompt the user for choices, but if that's your concern then this specific wording choice is an unimportant detail to get stuck on.
> If not, why make 'no' more complicated than 'yes'?
The reason was given in my opening argument: "no/yes" demands a conscious choice whereas "not now" doesn't. "Not now" is plausibly less complicated than "yes" for a majority of users, and cognitive overhead generally matters more to people than imaginary contracts about future choice prompts.
You did say "playing devil's advocate", and you asked which of the two I was more concerned about. So I think it's OK to explore the arguments either way!
I'm going by gut feeling here, so I'm not pretending to bring any data to the discussion. My original post was "commonplace UX pattern A gives me this feeling B".
Modern UI trends which seem to prioritise the wishes of the product designer over good UX. And I feel that _that_ is driven by metrics that don't have my best interests at heart.
> "no/yes" demands a conscious choice whereas "not now" doesn't
These messages are often displayed in modal popups (e.g. the original post), or otherwise intrusive fashion (e.g. sign into BBC News website). So if you're asking this question you're _already_ forcing the user to read a message and make a choice. If the user isn't ready for a conscious choice, maybe you shouldn't be asking them to make any choice.
Going back a few years, a piece of software with options would put them in a preference panel. If the user wanted these features, they would intentionally seek out how to set them. They would be in control. And they would be both sufficiently contextualised and intent-driven to make that yes/no choice.
Maybe these days with rolling or frequent deployments it's a challenge to communicate new options. But I don't think that applies in many of the manifestations of 'not yet'.
"Not now" can be validly interpreted to mean "not in this context" or "not here". It's not necessarily temporal.
I tend to ignore these kinds of things, but sometimes applications are unresponsive, lose focus, and iOS apps don't show the keyboard, etc. so I cannot take it anymore.
I wanted to open a file from the Files app on iPad, a PDF. It opened the Preview app, but it wouldn't let me scroll through the file. I tried to close it, but the back button goes to the Preview app, not to Files. Then I closed the app and tried again from Files, but it kept opening this separate app instead of the in-app PDF viewer. I had never seen malfunctioning states or application flows like this in default iOS apps before.
The new reminders app is a joke. It has weird things that randomly jump from date selection to time selection, and sometimes select random dates.
It's like they did `claude new-reminder-app.md --dangerously-skip-permissions` and went "is it working? working! release it!" I know (hope) it's not the case, but for the last few weeks, it feels like that.
And to be honest, it never really existed. It was more that everything else was cheaply manufactured garbage.
These days it feels like various teams are responsible for their part and they are managing toward a delivery date. As long as they check the box that the feature is there... ship it. There is likely not anyone around to throw the product in a fish tank if it isn't up to par.
- When an iPad is presented to you to enter your parent code to unlock an app, the name of the app isn't shown as the pin prompt is over the top of the app/time to unlock details.
- It's not possible to set screen time restrictions for Safari.
- If apps are not allowed to be installed, app updates stop. I have to allow app installations, install updates, then block app installations again.
- Setting downtime hours just doesn't seem to work. Block apps from 6pm - 11.59pm? Kid gets locked out of their iPad at school for the whole day.
- Most of the syncing between settings on a computer to the target iPads appear to be broken completely. If an iPad is in downtime, and the scheduled downtime time changes, it does not take the iPad out of downtime.
- Downtime doesn't allow multi-day hour settings. For instance, try setting downtime from 8pm - 8am.
- Popups in the screen time settings of MacOS have no visual indication that there is more beneath what can be seen. There is no scrollbar. You have to swipe/scroll on every popup to see if there are more settings hidden out of view.
- No granular downtime controls for websites. You can block Safari, or you can not block Safari.
Edit: Oh I almost forgot this nifty little bug reported back in 2023: https://discussions.apple.com/thread/255049918?sortBy=rank
Screentime randomly shows you a warning about being an administrator... no probs you just need to select another account and then re-select the one you want and it'll go away.
Presumably this is because apps could add individual features parents don't approve of between updates.
If you're locking down what apps you want your kids to use (to an individual whitelist of apps, not just by maturity rating), you're essentially stepping into the role of an enterprise MDM IT department, auditing software updates for stability before letting them go out.
What would you propose instead here?
I presume you'd personally just be willing to trust certain apps/developers to update their apps without changing anything fundamental about them. But I think that most people who are app-whitelisting don't feel that level of trust toward apps/developers, and would want updates to be stopped if-and-only-if the update would introduce a new feature.
So now, from the dev's perspective, you're, what, tying automatic update rollout to whether they bump the SemVer minor version or not? Forcing the dev to outline feature changes in a way that can be summarized in a "trust this update" prompt notification that gets pushed to a parent's device?
If my daughter's Spotify app breaks after an update she knows to immediately contact my on-call pager and alert our family CEO and legal department.
Just give me a checkbox that allows updates.
If an app developer changes something fundamental about the app, then the changes will be subject to the app store age guidelines. If the app is recategorised to 18+ it won't be able to install anyway. Billions of devices around the world have auto app updates turned on. The risk of a rogue update is outweighed by the benefit of getting instant access to security updates. I'm managing a kids iPad with a couple of mainstream games and school apps installed, not running a fortune 500.
I’m running iOS 18.7.1 and I can do that just fine. Maybe it wasn’t possible before, but it certainly is now.
I thought this too. I discovered it actually is possible though, just doesn't appear in the list. Go "Screen Time" -> "App Limits" -> "Add Limit". In the "Choose Apps" dialog, you won't see Safari in the list. But you can type "Safari" in the search bar and it'll appear.
But I agree with the overall sentiment on this thread. iOS Parental Controls aren't where they need to be.
I really haven't had many problems, and I actually like some of the features. Sure, the UI/UX is not perfect from the start, but there hasn't been anything I have been unable to accomplish because of the new OS. The liquid glass can even be nice with certain backgrounds too.
This is just my hypothesis, but I have noticed that a lot of the people who have been complaining about macOS have been using 3rd-party applications for a lot of their workflow. If I am not mistaken, there were issues with many Electron apps in the beginning. On macOS, I mainly use Apple's apps or I'm deep in the command line. So perhaps I have been fortunate to avoid many of the UI/UX issues that others have faced?
macOS is essentially an iCloud client and sales funnel these days, it's clear that's all that Apple sees it as.
It was pancreatic cancer IIRC.
Just one example: I was excited by the idea of having two apps on screen at the same time: there are two I like to look at side-by-side all the time. But one of them (an iPhone app) randomly decides to switch to landscape mode, making the layout unusable. More generally, the window controls keep getting activated unexpectedly by taps when I use full-screen apps like games, resulting in the window reverting to not-full-screen. So I guess I'll just have to turn that feature off until it's actually usable.
Maybe the Windows Vista of Tablet OSs though.
It is terrible; it adds nothing visually or functionally to the Apple experience.
Oddly, KDE Plasma is more pleasing and consistent.
Kind of bizarre that they have destroyed their reputation for software perfection.
These are all things which have been broken for years.
He already covered this: https://youtu.be/K1WrHH-WtaA?si=tHrGBNmLlIfp4NSv
Steve truly is dead.
1) battery warning above the tabs in the browser with no x to close it
2) WebKit bugs that make the input field and its visual position diverge, so you have to click under the input to hit it
3) the email app flickers when it's opened
My Apple monitor has USB ports on the back side. Sigh.
My mouse had a charger cable on the bottom. Sigh.
My keyboard has no dedicated copy and paste keys. Sigh.
My keyboard has no dedicated undo and redo keys. Sigh.
At one point I had to start iTunes to update my OS. Sigh.
Really, the next time someone says Apple nails UX I am just going to cry.
When they moved production to Foxconn, Quanta, and Pegatron, the quality went up...
Apple priorities: Emoji and emoji accessories, realistic glass physics, battery life, new kinds of ports, iCloud subscriptions, rearrange system preferences, iTunes delenda est
I'm just glad as a SWE the Mac still covers my workload
Sometimes you need the Jobs at the top of it all telling people it's not working well and they need to get their shit together.
I buy more stock every time one of these articles comes out, because the quiet part is 'Apple is still the best, and I can elevate my brand by criticizing it'
he lists "publicly confronted Apple at the European Commission's televised DMA hearing in Brussels on browser competition." as a highlight. lolol. Time to buy even more stock.
I just want Safari to work again. The rest I can wait for. I'm checking for software updates daily. It's gotten so bad that I looked up how to report bugs to Apple, but I can't submit screenshots!?
I'll settle for just being able to enter data into a form in Safari without needing to reload the whole page.
Just to add: I had to cut this comment, reload the page, and paste it back in to be able to submit it.
And then there's the bugs. What software is more consistently buggy than Apple software?
A lot of people will disagree that Apple had great attention to detail before, because of the things they chose not to focus on. But I think the counter-argument back then was that they were meeting their own internal vision with a high expectation of quality, and that that vision covered every part of the experience. That counter-argument doesn't feel as valid today.
Don't get me wrong, I also enjoy and benefit from returns from the sharemarket. But I think there are downsides to it, and you can clearly see them here, with Apple. As an underdog, the company didn't really need to worry too much about the share price, because anything they did at that time attracted customers; it was at a smaller scale, but it was still growth. Now, at the size that they are, they have to lose their soul to keep the growth happening.
It's sad, but it's also the reality. Netflix, for example, once tweeted "Love is sharing a password" and is now resorting to stopping exactly that.
There's just no contentment in the sharemarket.
This thing is laggy…on my brand new 17 Pro. Why not just make the entire OS an electron app at this point?
These articles sometimes confused me until someone on HN had a quote:
"MacOS has never been worse, but the distance between MacOS and Windows has never been greater."
So I see these articles and I ... I'm really happy in MacOS land.
Steve Wozniak left Apple in 1985 because he felt the company was no longer an engineering-led one and missed the fun of creating things rather than dealing with management.
That was 40 years ago. Jobs cared immensely, but the snowball has been rolling for a long while. Decisions used to be made by people who actually cared, but now they're made mostly by poseurs.
Probably just to make it slick looking (fluent)…
- BUT that’s completely ineffective, as it allows the cursor to be positioned on top of a single letter in >10 different positions.
So when you’re editing you have a much harder time placing the cursor just between the two letters you want.
I noticed it when using some app that had disabled this stupid feature, and it was just so much more effective to do my editing as the cursor jumped to the position BETWEEN the letters instead of FLOATING ALL OVER.
It’s nice on slides when presenting a new fancy feature, but completely useless for a ‘professional’ (focused) user.
PS yes, I recall those old Apple ads - saw them when they were brand new and Apple was a more detail-oriented company (I miss those days….)
2) With the new dual-language keyboards, text prediction is doubly messy. My Dutch-French keyboard suggests English words and almost never understands which language I'm actually typing in.
2b) Fine, but then at least my English keyboard will be okay? Nope. The other keyboards still get polluted with suggestions and auto-correct from the other language keyboards you have. For example, it's now impossible to write “as a” without iOS correcting it to “às à”, even though my Portuguese keyboard is separate and I'm clearly typing in English.
3) Search on iOS always searches for files, even if you remove the Files app contents from search results. (‘App’ just got corrected into ’all’.)
4) In iOS, HealthKit occasionally locks down and the apps that need it to function become unresponsive (AutoSleep, WaterNow, Cronometer). This can only be fixed with a restart.
5) In macOS's Mail, if you Command-Z to undo deleting an email, 90% of the time it goes in a Recovered Emails folder instead of where it was. 10% of the time it's not undeleted at all and is instead completely gone (also from the trash).
6) Calendar.app on macOS can become unusable if you have attachments in events. The solution is to open the event on iOS and delete the attachment.
7) Screen Time asks me twice if I want to unlock an app or website. Sometimes the animation plays and nothing happens, so I have to do it thrice.
8) The Game Mode notification can't be disabled, because no one at Apple has ever alt-tabbed away from a game.
9) If you search for a setting in iOS or the redesigned macOS System Preferences, I think 3/4 of the time it will take you to a nearby screen but not the actual setting. This is because Apple designers don't know that they can scroll you or link you to the right place.
I've reported all these bugs to Apple with screenshots or videos. None of them were ever fixed.
So, to answer your question... HELL YES!!! Unless you are talking about the detail of the bottom line.
In my view, Apple cares little about product quality these days, mostly because Cook cares more about deliverables than usability. The first piece of evidence: developers no longer need to adhere to anything like "The Human Interface Guidelines", so there is no consistent UI. Developers all think they are designers; they are not!
Steve was an ASSHOLE, but he was also a humanitarian. Among his many skills, he understood that humans crave at least the appearance of control over our environment; he looked at how we did things in the real world and worked with great developers, designers & END USERS to build on that. AND LEGACY was at least considered carefully before being replaced.
This was the Apple we loved and "evangelized": The Macintosh Way.
So at this point I'm not buying new Apple hardware or staying in their ecosystem. I'm looking for a couple of guys in a garage... Maybe Ive's 'little' project?