The platform would aggregate by major/minor version, and you could see in totality whether the current version of macOS/iOS would make Steve proud or miserable.
Ultimately I decided against it, for defamation/cease-and-desist reasons, and not wanting to find out. But it needs to exist.
What does it say?
Does anyone actually do this? Especially for heavy-duty applications like my web browser and IDE, this has always felt like a bizarre assumption to me.
I haven’t maximized a window in years. They look ridiculous like that. Especially web pages with their max width set so the content is 1/4 the screen and 3/4 whitespace.
If I ever accidentally full screen a window, and it’s not in night mode, I am instantly blinded by a wall of mostly white empty background!
I frequently use macOS on a projector; it doesn't quite fill my wall floor to ceiling, but it comes close. I don't use full screen often, but I do it occasionally as a focusing strategy, and it's fine.
You're shining a bright light on a wall, which you are looking at.
With a monitor you are shining a bright light at your face, while staring directly at the lightbulb!
If you're using a monitor in the dark the way you use a projector, you should turn the backlight down. If you're using it in a well lit room, the brighter backlight should have less of an effect.
My actual biggest pet peeve with this setup is the vast number of web sites that deliberately choose to limit their content to a tiny column centered horizontally in my browser, with 10cm of wasted whitespace on each side.
It seems like a lot of this generation's design language was based on the assumption that people would be spending a significant amount of time in visionOS. That's turned out not to be the case (at least so far).
Aside from video calls, which are always full-size, I sometimes maximize something on the laptop screen, but otherwise not at all.
I can see how a full-screen IDE makes sense, but I don't use one, so I always want a couple of terminal sessions running alongside my editor.
There are vanishingly few contexts in which I find full-screen helpful. Not criticizing anyone else, or recommending my way of working, but it's what works for me.
[0] I would like better support for desktop management: naming and shortcutting, particularly. Years ago I tried an add-on (Alfred, I think, or a predecessor) that promised that, but it was super flaky. Does anything exist that works well?
IMO, this has been their assumption for years, and it actually turned me off when I tried getting used to Mac circa 2006-2007. Coming from Windows at the time, I just couldn't get over a weird anxiety that my application window wasn't maximized, because it didn't look like it completely snapped into the screen corners.
Now, using 34-inch ultrawide monitors almost exclusively, I never maximize anything... it'd be unusable.
I maximize windows of graphics and video editors.
Browsers only ever get maximized to the left/right half screen for me too
Which is something macOS should really improve on, though; the UX is pretty bad compared to Windows and Linux there.
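For what it's worth, this is roughly what third-party tools like Rectangle do to fill the gap: drive window geometry through the Accessibility API. A minimal Swift sketch follows, assuming the calling app has been granted Accessibility permission; the function name and the coordinate handling are my own illustration, not any tool's actual code.

    import Cocoa
    import ApplicationServices

    // Sketch only: snap the frontmost app's focused window to the left half of
    // the primary display via the Accessibility API. Requires the calling app
    // to have Accessibility permission (System Settings > Privacy & Security).
    func snapFrontmostWindowToLeftHalf() {
        guard let app = NSWorkspace.shared.frontmostApplication,
              let screen = NSScreen.screens.first else { return }

        // Ask the target app for its focused window.
        let appElement = AXUIElementCreateApplication(app.processIdentifier)
        var value: CFTypeRef?
        guard AXUIElementCopyAttributeValue(appElement,
                                            kAXFocusedWindowAttribute as CFString,
                                            &value) == .success,
              let raw = value else { return }
        let window = raw as! AXUIElement

        // visibleFrame excludes the menu bar and Dock. The Accessibility API
        // uses a top-left origin, while AppKit uses bottom-left, so flip the
        // y coordinate (this assumes the primary display).
        let visible = screen.visibleFrame
        var origin = CGPoint(x: visible.minX,
                             y: screen.frame.height - visible.maxY)
        var size = CGSize(width: visible.width / 2, height: visible.height)

        if let positionValue = AXValueCreate(.cgPoint, &origin),
           let sizeValue = AXValueCreate(.cgSize, &size) {
            AXUIElementSetAttributeValue(window, kAXPositionAttribute as CFString, positionValue)
            AXUIElementSetAttributeValue(window, kAXSizeAttribute as CFString, sizeValue)
        }
    }

Apps like Rectangle or Magnet are essentially this plus keyboard shortcuts; the complaint is that none of it ships with the OS.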
This goes towards something that I've felt for a little while: at some point in time around the early 2000s, operating system vendors abdicated their responsibility to innovate on interaction metaphors.
What I mean is, things like tabbed interfaces got popularized by Web browsers, not operating systems. Google Chrome and Firefox had to go out of their way to render tabs; there was no support built into the OS.
The OS interfaces we have now are not appreciably different from what we had in the early 2000s. It seems absurd that there has been almost no progress in the last 25 years. What change there has been feels like it could have been accomplished in user space, plus it doesn't get applied consistently across applications, which makes it feel like it isn't a core part of the OS.
macOS in particular was supposed to place an emphasis on the desktop environment being the space for window- and document-level manipulation, as exemplified by the fact that applications did not have their own menu bars. All application menu bars were integrated together at the top of the screen. Why should it be any different with any other UI organizational feature? Shouldn't apps merely be single window panes, each accomplishing a single thing, with multiple apps combined to get something akin to an IDE out of them?
Well, I don't know if they should be. But they can't. Because OS vendors never provided a good means to do it. Even after signalling they wanted it.
Meanwhile, I want to use my graphical, multi-window, preemptive multitasking operating system to, you know, use multiple applications at the same time.
As you said, browser and IDE are the big exceptions, plus things like Lightroom or my 3d printer's slicer.
Even VS Code usually lives as a smaller window when I'm using it more as a text editor than as an IDE.
I have been using it for years, and I just gave up entirely on managing anything. If I zoom out to see all my windows, it looks like the freaking Milky Way from all the windows I forgot about.
If the biggest flaw of an OS is the border radius of its windows, you've got yourself a pretty decent OS!
It's not gonna make me leave my darling Linux, of course, but I think this whole debacle can only be interpreted as praise.
On second thought, it might also be considered a meditation on people's tendency to bike-shed.
Or to say it another way, if we see shit like this then we know the whole thing is a hack.
For example, there is not much you could do to Finder to make it worse.
There are loads of other flaws with the OS. It just so happens that people care a lot about the design of Apple's products, so people talk about these details.
This argument would also make Windows 11 a pretty decent OS by extension, via "If the biggest flaw of an OS is the position of the Start menu, you've got yourself a pretty decent OS."
In general, I could use any minor nuisance as proof of decency - or, as a manufacturer, inject some on purpose to set up exactly this argument.
People don't like it when their environment changes in minor, unsolicited ways. There's always gonna be fuss about these things, and that means the fuss itself can't be used to make any strong argument whatsoever.
And the updates to Music (formerly iTunes) are so bad the entire team should be dressed down, Steve Jobs style.
Not really. If you have malware with root access on your system, I think you're already pretty screwed, especially considering that you don't even need root to read all your saved passwords and personal files: https://xkcd.com/1200/
I get the UI consistency thing, but it's okay to transition to new UI elements gradually rather than making radical changes all at once. If this is still an issue two years from now, it will be more of a concern about their commitment.
Ads in a start menu can die in a fire though.