Having set one parent up on Mint, I can say categorically that it is still a bit of a config nightmare.
1. Nothing is easily googleable. People have to google how to do things like adjusting the layout of external monitors, and it's significantly harder to do that on Linux.
2. There are a lot of different ways to install applications, and different options are available depending on which distro or application you're targeting.
2. Most distros have an App Store that’s easy to find these days. Works great for non-CLI tools.
It's like 900x easier to install random software you find online on a Mac (there's a zip containing the .app directory, done), and about 10x easier on Windows (they give you a .exe you double-click, click Next a few times, done). Versus Linux, where you look at a list of different file types, weigh the differences between a .deb and an .rpm, figure out whether it should come from Flathub, deal with enabling unverified Flathub packages, possibly disable your distro's Flathub package that sucks and overrides the maintainer's package, etc. See things like https://www.reddit.com/r/pcmasterrace/comments/1htu87i/it_to...
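To make the fragmentation concrete, here is a sketch of the decision tree a Linux user faces. The package names and the Flatpak app ID (org.example.SomeApp) are made up for illustration; the commands are echoed, not executed.

```shell
#!/bin/sh
# Pick an install path for a hypothetical "someapp" based on which
# package manager the system happens to have. Each branch is a
# different artifact format from a different source.
if command -v apt >/dev/null 2>&1; then
    MSG="Debian/Ubuntu family: sudo apt install ./someapp.deb"
elif command -v dnf >/dev/null 2>&1; then
    MSG="Fedora family: sudo dnf install ./someapp.rpm"
elif command -v flatpak >/dev/null 2>&1; then
    MSG="Distro-agnostic: flatpak install flathub org.example.SomeApp"
else
    MSG="No known package manager found"
fi
echo "$MSG"
```

Compare that with the Mac case, where the same decision tree is "drag the .app to /Applications".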
I only wish the process/instructions were a little more friendly for normies.
In practice, it may not work properly even on their "supported" models. For example, sound does not work on my Dell E7270. Secondly, you must be willing to use the Chrome browser. I will not, because Chrome no longer has the option to always show the scrollbars. I am convinced that modern UX/UI designers hate their users.
I'm not holding my breath for this to happen though.
https://support.apple.com/guide/security/securely-extending-...
But with Linux being open, they certainly would produce a loadable module if there was enough install base to justify it.
True, but the main point of a kernel mode anticheat is the ability to verify that the OS and game aren't being tampered with. If the OS has that capability already built in, then the need for a kernel mode anticheat diminishes.
>they certainly would produce a loadable module if there was enough install base to justify it
It's not realistic for an install base large enough to justify that complexity to exist, compared to just having them implement a simple API in their game and server.
It's not actually the message from the kernel that provides the value, it's the work needed to fake such a message.
The issue is that Windows is designed to be able to protect the will of proprietary software publishers against the will of users that want to assert control over the software running on their computer. It's very similar to the story with DRM.
Linux desktop OSes will never put in place the measures to make a Vanguard-like system work, because it's just unethical for a bunch of reasons, the most basic of which being that it's a violation of freedoms 0 and 1.
This isn't true. And supply chain wise just look at the xz backdoor. A random person was able to compromise the supply chain of many Linux distros. Security also is not just supply chain integrity.
>Windows is designed to be able to protect the will of proprietary software publishers against the will of users
I'm not sure what you mean by this. Just because Microsoft cares about developers, it doesn't mean they don't care about users.
>that it's a violation of freedoms 0 and 1
It's not. Freedoms 0 and 1 do not give you the freedom to cheat against other players without being banned. You can be free to modify the game client, but you aren't entitled to play with others using it.
The xz backdoor was successfully caught before it landed in mainstream release branches, because it's free software.
But broadening the scope a bit, the norms of using package managers as opposed to the norm on Windows of "download this .exe" is a much stronger security posture overall.
I am aware the Windows Store exists; it's not widely used enough to make exes a marginal distribution pathway. I am aware curl | bash exists; it's more common than it should be, but even in those cases the source is visible and auditable, and it's very uncommon for non-technical users to ever do that (unlike downloading random exes).
> Freedom 0 and 1 does not give you the freedom to cheat against other players without being banned.
That's a strawman, I never claimed you should have the right to cheat against other players.
> You can be free to modify the game client, but you aren't entitled to play with others using it.
And that's the issue, Windows has functionality to impede your ability to run the software as you see fit and modify it to your needs. Perhaps you want to run your own server, with different moderation policies.
What? It literally got included with several distros. It wasn't caught before it shipped to end users. Just because it got caught before slower to update distros got it, that doesn't mean it is okay. It reveals how low the barrier is for an anonymous person to get code into the OS.
>I never claimed you should have the right to cheat against other players.
Attestation doesn't take away your ability to modify and run software, which means that you still have freedoms 0 and 1. It just means that you cannot prove to a remote server that you are running unmodified software. It sounded to me like you were implying that the server being able to kick people who modified the client to cheat was violating their freedom.
>Perhaps you want to run your own server, with different moderation policies.
Nothing would stop you from running your own server like that.
For a multiplayer game, I'd argue that playing with others (even if you're restricted to private servers, not that most games support that anymore..) is running the software. Being able to use a piece of software for its intended purpose is more relevant than a literal reading of "you are allowed to exec the binary and nothing more".
Linux's inability to run specific anti-cheat solutions is a vendor support issue on the anti-cheat maker's part, because they don't care about your security, and they've managed to convince game developers that this practice is acceptable. It's not. Vote with your wallet.
Fortnite uses EAC, which does work on Linux; they just choose to block it.
Of course the well known gaming company that releases a distro is Valve. But, rootkits don’t seem like they fit their particular ethos (they are well known for their less annoying DRM scheme, right?). TBH, it seems like a rare opportunity to break the hold they have on the “game store” concept.
In a way I kind of wish this was how more windows support was handled just because PowerShell is so uhh... powerful.
It might be that Linux is less capable for your use case, but people seem to be generally content with ChromeOS, and I think the standard Fedora desktop install is more capable than that, so I think the market exists.
At the same time, we still have a major problem at work if Microsoft goes through with this. I work in a research lab with tens of thousands of dollars' worth of Windows 10 workstations that cannot be upgraded. We use Windows Remote Desktop and plenty of other software that is Windows only. The hardware is still pretty new and capable. With NIH cuts, the last thing we need now is to have to spend money and lots of time replacing all of that for no good reason.
You can buy extended support for orgs like yours that require it - https://learn.microsoft.com/en-us/windows/whats-new/extended...
1. in higher use than its successors
2. only had one possible successor
3. the successor did not support hardware in use at the time
?
I'm sure it won't stop them, as you say, but really Microsoft, as someone who used to be a (relatively rare at the time) defender of yours, get fucked. The Raymond Chen camp is truly dead (https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...)
Microsoft (well, the Windows part) is looking more and more like the Apple and Sun in that article. It’s the #2 or #3 user-facing OS these days. The fancy new programming environment happened and most stuff moved there, but it’s JavaScript and the browser rather than C# and .NET. Running old software is becoming a niche and getting more so by the day.
It's the same situation as last time with Windows 7. You can get three years of extended support for the monthly cumulative update, which I assume is being done given it is fairly inexpensive. The US government gets favorable pricing from Microsoft.
The consumer price for Windows 10 ESU is $30/$60/$90 for the first/second/third year.
There are a huge number of examples here: https://www.reddit.com/r/unixporn/
I used to use Openbox and compile my own freetype with patches, but these days I want to spend my time on other things, so I'm just using macOS, which has the best out-of-the-box experience with the shortest TODO list when setting up a new computer.
It's hard for me to imagine anything uglier than the above, but beauty is in the eye of the beholder as they say.
I've found Ubuntu's default, and "vanilla gnome shell" to both be pretty cohesive and "modern".
And at the same time, I've never really felt like Windows or Mac actually end up with a more cohesive UI than the various Linux desktop environments. For every Qt/GTK theming mismatch, I find a Windows mismatch between apps, due to Windows being 12+ generations of design languages and toolkits built on top of each other (e.g. the 3+ distinct "current" Windows control panel looks: 11, then 10, then 7, then XP as you keep digging into more and more obscure settings). And apps typically "freeze" at the UI design of the era they were born in, e.g. XP apps still look XP, and so on.
And on Mac, you have the (relatively!) small number of apps actually artfully designed for macos. And then you have all the other ones - electron, java-based, cross-platform Qt apps (which naturally look like Qt apps... just like on KDE/gnome).
There's of course various quibbles over font render, that have existed since time immemorial. I don't think any one platform really wins hands-down here, though it's my understanding that mac typically does the best (as long as none of the non-mac-native apps manage to mess it up).
I really think people just have double standards at this point, where their "home" platform's flaws are minor, and candidates to replace it must be flawless. (I'll also admit I'm the same, though naturally I think I'm right: I figure if everything is Electron and mismatched anyway, I might as well have a free-as-in-freedom operating system under it. Nobody is putting ads in my start menu or advertising Xbox Game Pass in my notifications.)
Then again plenty of modern browsers have some type of profile syncing built in, which does all this for you.
> email inboxes
Please don't use POP3. Your inbox should live on a remote server and simply follow your account. Storing your inbox exclusively on your PC will make you very sad some day.
Most cheaper/free email providers have a storage limit.
Besides, I disagree conceptually. If I want to reduce the risk of my email being read or handed to someone I don't trust, then removing it from the server is a good idea. I can make my own backups.
On the desktop side, the GNOME online accounts feature is pretty good at getting you most of the way there.
Take Ubuntu, for example. It’s one of the most popular and recommended distros for non-techy users, but just look at the install process: https://ubuntu.com/tutorials/install-ubuntu-desktop#1-overvi...
Let’s be honest, I don’t think most people would actually go through with that.
One idea to fix this and get more people to switch would be for Ubuntu to offer a Windows app that handles everything. It could download the ISO in the background, format the flash drive, install Ubuntu in dual boot with Windows by default, and clearly explain each step so users know how to start using Ubuntu or go back to Windows.
"Running Linux in a VM", as you have put it, is miles better because it works all the time with zero friction, driver issues, random freezes, reboots, etc.
EDIT: Beyond skill, just getting the external media is a substantial friction. I haven't used a thumb drive besides for Linux install media in 15 years; I'm good at computers but just finding / buying one of those things is its own roadblock.
This sort of thing used to be more common. My first exposure to Linux was before CD-Rs were ubiquitous so there was often no possibility of using external media if you downloaded Linux. Partitioning the drive and installing there was typical.
Ubuntu and Linux Mint are now recommending balenaEtcher, which is easier to use than Rufus.
For the technically inclined, sure, but not so much for ordinary people.
Why can't Ubuntu just offer a downloadable media creation tool like Windows does? Surely it's not that hard to couple dd with a basic GUI.
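The core of such a tool is genuinely small. Here is a minimal sketch of what it would wrap, with no GUI; the ISO filename and /dev/sdX device are placeholders. Writing to the wrong device destroys its contents, so this refuses to run unless the target is a real block device.

```shell
#!/bin/sh
# write_iso ISO DEVICE -- copy an installer image onto a USB stick.
write_iso() {
    iso="$1"
    dev="$2"
    if [ ! -b "$dev" ]; then
        # Guard rail: only write to an actual block device (check lsblk first).
        echo "refusing to write: $dev is not a block device"
        return 1
    fi
    # bs=4M for throughput; conv=fsync flushes before reporting success.
    dd if="$iso" of="$dev" bs=4M status=progress conv=fsync
}

# Placeholder invocation; replace /dev/sdX with your USB device.
write_iso "ubuntu-24.04-desktop-amd64.iso" "/dev/sdX" || true
```

The hard part of a real tool isn't the dd call; it's safely enumerating removable devices so users can't nuke their system drive, which is presumably why projects like balenaEtcher and Rufus exist.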
> to fix your busted drive, just nuke the boot sector and send it
>
> dd if=/dev/zero of=/dev/xxx bs=512 count=1 conv=notrunc
On the other hand, if someone finds that part too complicated to follow perhaps they may not be able to install Linux - or Windows for that matter - by themselves and come across other issues down the line. Ultimately replacing your OS with another one does require some minimum level of technical knowledge that you either need to have or be fine with learning during the process.
The biggest sticking point is the fear of losing what they do have, but we're at the point where even their previous generation computer could be made to run Linux.
Given how frequently "reinstall Windows" is offered as a solution, I guess I'm not surprised that there is now some lighter version of that. But really I was talking about obtaining/creating installation media and reinstalling from scratch.
Can't help thinking that should be in a bigger font. It's a shame there doesn't seem to be a way to install Linux and keep your Documents directory at least. Is that due to file systems?
[Yes, yes, backup to memory stick/external drive but I'm talking about for your average person on the street]
So long as enough contiguous space is available to install the desired Linux distro.
You can't do this all on the same drive, because you need a place to copy the documents directory to. You need to delete the NTFS partition to create the place to copy the files to, but by the time you've done that, the Documents are inaccessible. You could do it in memory, feasibly, if you create a RAMdisk and are lucky enough to have enough memory for all your documents, but then you're still gambling on not running out of memory during the install.
So it is possible to copy the documents on the same device, and it's possible to even automate the process, but it's not possible to do it reliably or safely, and the reliability is so low that it's not worth even offering the possibility. If somebody has a handful of gigabytes of documents, it's already a nonstarter. To be safe you'd demand the user make a backup onto another device anyway, in which case they might as well do that and then copy the files into a fresh install themselves
It's not just shrinking and copying over to the new `/home` because of the locality of the data. If your NTFS partition is taking the entirety of the disk (minus EFI and system partitions), shrinking it will then make it take up the first X% of the disk. Then you have to make the linux installation on the last (100-X)% of the disk, copy the files over, and then when you delete the NTFS partition, your Linux filesystem is on the last half of the disk with a big blank unallocated area on the beginning. BTRFS or LVM2 could help a little bit there, but that's far from ideal in any case.
Probably the best approach would be to shrink NTFS, create a new partition at the end of at least the right size, copy the files over, then wipe the NTFS partition, install Linux as the first partition (after system/EFI and such), then copy the files into the user's home, and then remove the documents partition. That's still not super reliable, though. You are at the mercy of your documents sizes, filesystem fragmentation (remember, even if your filesystem is mostly empty, you might not be able to shrink if fragmentation is in a bad place. You could defrag, but then the install time can balloon up many hours for the defrag process alone, just to shrink a filesystem that you're going to delete anyway), how big the Linux install will end up being, and many other factors. You'd have a lot of people who simply can't copy their documents over on install who will be simply SOL. I can't think of a situation where this kind of thing wouldn't be better served by just telling the user to backup their documents to a USB drive and move them back afterward, because many people are going to have to do that anyway.
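The staging-partition approach described above can be sketched as a sequence of commands. Everything here is a placeholder (device name, partition numbers, the 200G split, mount points), and every one of these steps is destructive on real hardware, so this version only prints the plan rather than executing it.

```shell
#!/bin/sh
# Dry-run sketch: shrink NTFS, stage documents at the end of the disk,
# install Linux over the old NTFS region, restore, then drop the staging area.
DISK=/dev/sdX                                    # placeholder device
run() { echo "would run: $*"; }

run ntfsresize --size 200G "${DISK}2"            # 1. shrink the NTFS filesystem...
run parted "$DISK" resizepart 2 200GiB           #    ...and its partition
run parted "$DISK" mkpart docs ext4 200GiB 100%  # 2. staging partition at the end
run rsync -a /mnt/windows/Users/ /mnt/docs/      # 3. copy documents into staging
run wipefs -a "${DISK}2"                         # 4. wipe NTFS; install Linux here
run rsync -a /mnt/docs/ /home/user/              # 5. restore into the new /home
run parted "$DISK" rm 3                          # 6. drop the staging partition
```

Even written out, the failure modes from the comment above are visible: step 1 can fail on fragmentation, step 3 can fail on space, and a power loss mid-sequence leaves the disk in a half-converted state, which is why "copy to a USB drive first" remains the sane advice.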
It’s still a great device; it just sucks that I’m stuck with Windows 10.
Think about the demand and supply curves of calculations (or computation). For most of history, they moved in tandem, with supply moving slightly faster, so computers would always do more at slightly lower costs.
Now both curves are speeding up, but demand is moving faster, so the costs of hardware are going up. And when high end servers (with GPUs) are unavailable, people hold onto the older ones longer.