Having set one parent up on Mint, I can say categorically that it is still a bit of a config nightmare.
1. Answers aren't easily googleable. People have to google how to do things like adjust the layout of external monitors, and it's significantly harder to find working answers for Linux.
2. There are a lot of different ways to install applications, and different options are available depending on which distro or application you're targeting.
2. Most distros have an App Store that’s easy to find these days. Works great for non-cli tools
It's like 900x easier to install random software you find out about online on a Mac (there's a zip containing the .app directory, done), and about 10x easier to install random software on Windows (they give you a .exe you double click, click next a few times, done). Versus Linux, where you look at a list of different file types, consider the differences between a .deb and an .rpm, figure out if it should come from Flathub, deal with enabling unverified Flathub packages, possibly disable a Flathub package from your distro that sucks and overrides the maintainer's package, etc. See things like https://www.reddit.com/r/pcmasterrace/comments/1htu87i/it_to...
As someone who regularly has setting up new computers dumped on them, having to click through all of those dumb screens before being allowed to start using the browser has been the biggest contributor to my decision to ditch Windows
This is the first post to get substantial conversation, though. The impression I get is that on-topic reposts are fine until such time as they get traction - provided that they a. aren’t self-promotion and b. are made by different users.
I only wish the process/instructions were a little more friendly for normies.
In practice, it may not work properly even on their "supported" models. For example, sound does not work on my Dell E7270. Secondly, you must be willing to use the Chrome browser. I will not, because Chrome no longer has the option to always show the scrollbars. I am convinced that modern UX/UI designers hate their users.
I'm not holding my breath for this to happen though.
https://support.apple.com/guide/security/securely-extending-...
But with Linux being open, they certainly would produce a loadable module if there was enough install base to justify it.
True, but the main point of a kernel-mode anticheat is the ability to verify that the OS and game aren't being tampered with. If the OS has that capability already built in, then the need for a kernel-mode anticheat diminishes.
>they certainly would produce a loadable module if there was enough install base to justify it
It's not realistic for there to be such an install base to support such complexity compared to having them implement a simple API into their game and server.
It's not actually the message from the kernel that provides the value, it's the work needed to fake such a message.
The issue is that Windows is designed to be able to protect the will of proprietary software publishers against the will of users that want to assert control over the software running on their computer. It's very similar to the story with DRM.
Linux desktop OSes will never put in place the measures to make a Vanguard-like system work, because it's just unethical for a bunch of reasons, the most basic of which being that it's a violation of freedoms 0 and 1.
This isn't true. And supply chain wise just look at the xz backdoor. A random person was able to compromise the supply chain of many Linux distros. Security also is not just supply chain integrity.
>Windows is designed to be able to protect the will of proprietary software publishers against the will of users
I'm not sure what you mean by this. Just because Microsoft cares about developers, it doesn't mean they don't care about users.
>that it's a violation of freedoms 0 and 1
It's not. Freedoms 0 and 1 do not give you the freedom to cheat against other players without being banned. You are free to modify the game client, but you aren't entitled to play with others using it.
The xz backdoor was successfully caught before it landed in mainstream release branches, because it's free software.
But broadening the scope a bit, the norms of using package managers as opposed to the norm on Windows of "download this .exe" is a much stronger security posture overall.
I am aware the Windows Store exists, but it's not widely used enough to make .exes a marginal distribution pathway. I am aware curl | bash exists, and it's more common than it should be, but even in those cases the source is visible and auditable, and it's very uncommon for non-technical users to ever do that (unlike downloading random .exes).
> Freedom 0 and 1 does not give you the freedom to cheat against other players without being banned.
That's a strawman, I never claimed you should have the right to cheat against other players.
> You can be free to modify the game client, but you aren't entitled to play with others using it.
And that's the issue, Windows has functionality to impede your ability to run the software as you see fit and modify it to your needs. Perhaps you want to run your own server, with different moderation policies.
What? It literally got included with several distros. It wasn't caught before it shipped to end users. Just because it got caught before slower to update distros got it, that doesn't mean it is okay. It reveals how low the barrier is for an anonymous person to get code into the OS.
>I never claimed you should have the right to cheat against other players.
Attestation doesn't take away your ability to modify and run software, which means that you still have freedoms 0 and 1. It just means that you cannot prove to a remote server that you are running unmodified software. It seemed to me you were implying that a server being able to kick people who modified the client to cheat was violating their freedom.
>Perhaps you want to run your own server, with different moderation policies.
Nothing would stop you from running your own server like that.
For a multiplayer game, I'd argue that playing with others (even if you're restricted to private servers, not that most games support that anymore..) is running the software. Being able to use a piece of software for its intended purpose is more relevant than a literal reading of "you are allowed to exec the binary and nothing more"
Linux's inability to run specific anti-cheat solutions is a vendor support issue on the anti-cheat maker's part, because they don't care about your security, and they've managed to convince game developers that this practice is acceptable. It's not. Vote with your wallet.
If a user agrees to a kernel level anti-cheat, it's not a rootkit.
Fortnite uses EAC which does work on Linux, only they decide to block it.
Anticheats like BattleEye started as private servers add-ons like this too, not official support, but admins choose to install them. I even remember Brood War's private ICCUP servers had their anti-hack as they called it.
Of course the well known gaming company that releases a distro is Valve. But, rootkits don’t seem like they fit their particular ethos (they are well known for their less annoying DRM scheme, right?). TBH, it seems like a rare opportunity to break the hold they have on the “game store” concept.
In a way I kind of wish this was how more windows support was handled just because PowerShell is so uhh... powerful.
It might be that Linux is less capable for your use case, but people seem to be generally content with ChromeOS and I think that the standard Fedora desktop install is more capable than that so I think the market exists.
At the same time, we still have a major problem at work if Microsoft goes through with this. I work in a research lab with 10s of 1000s of dollars worth of Windows 10 workstations that cannot be upgraded. We use Windows remote desktop and plenty of other software that is Windows only. The hardware is still pretty new and capable. With NIH cuts the last thing we need now is to have to spend money and lots of time to replace all that for no good reason.
You can buy extended support for orgs like yours that require it - https://learn.microsoft.com/en-us/windows/whats-new/extended...
1. in higher use than its successors
2. only had one possible successor
3. the successor did not support hardware in use at the time
?
I'm sure it won't stop them, as you say, but really Microsoft, as someone who used to be a (relatively rare at the time) defender of yours, get fucked. The Raymond Chen camp is truly dead (https://www.joelonsoftware.com/2004/06/13/how-microsoft-lost...)
Microsoft (well, the Windows part) is looking more and more like the Apple and Sun in that article. It’s the #2 or #3 user-facing OS these days. The fancy new programming environment happened and most stuff moved there, but it’s JavaScript and the browser rather than C# and .NET. Running old software is becoming a niche and getting more so by the day.
2. ... I mean, that's every version of Windows. XP? Vista. Vista? 7, etc. The last time you had two choices of Windows was in the '90s.
3. It does support hardware in use 'at the time'. I upgraded from 10 to 11 on existing hardware.
If you mean older hardware, 98 and NT4 were the last to support the 486, yet 486s were still in use by the time of release of Me/2000 (I sadly had to interact with said 486s in a school lab). XP -> Vista made the jump from a Pentium 233Mhz minimum to 800Mhz minimum, /and/ caused many issues due to the introduction of WDDM causing a lot of graphics hardware to become incompatible.
This is nothing new. Those pulling the shocked pikachu face perhaps just haven't been around the Windows block enough to realize... this is nothing new.
It's the same situation as last time with Windows 7. You can get three years of extended support for the monthly cumulative update, which I assume is being done given it is fairly inexpensive. The US government gets favorable pricing from Microsoft.
The consumer price for Windows 10 ESU is $30/$60/$90 for the first/second/third year.
Some companies may be buying extended support for specific equipment which runs Win10.
Computers are cheap!
Windows 10 ending in October blows my mind in contrast to the free-as-in-beer, nearly GUI-less Microsoft Hyper-V Server 2019 receiving extended support (security updates) until 2029. I'll probably assemble a patched-up/slipstreamed installer for recycling older equipment!
There are a huge number of examples here: https://www.reddit.com/r/unixporn/
I used to use Openbox and compile my own freetype with patches, but these days I want to spend my time on other things, so I'm just using macOS, which has the best out-of-the-box experience with the lowest TODO list when setting up a new computer.
It's hard for me to imagine anything uglier than the above, but beauty is in the eye of the beholder as they say.
I've found Ubuntu's default, and "vanilla gnome shell" to both be pretty cohesive and "modern".
And at the same time, I've never really felt like Windows or Mac actually end up with a more cohesive UI than the various linux desktop envs. For every Qt/GTK theming mismatch, I find a Windows mismatch between apps due to Windows being 12+ generations of design languages and toolkits built on top of each other (e.g. the 3+ distinct "current" Windows control panel looks: 11, then 10, then 7, then XP as you keep digging into more and more obscure settings). And apps typically "freeze" at the UI design when they're born, e.g. XP apps still look XP, and so on.
And on Mac, you have the (relatively!) small number of apps actually artfully designed for macos. And then you have all the other ones - electron, java-based, cross-platform Qt apps (which naturally look like Qt apps... just like on KDE/gnome).
There's of course various quibbles over font render, that have existed since time immemorial. I don't think any one platform really wins hands-down here, though it's my understanding that mac typically does the best (as long as none of the non-mac-native apps manage to mess it up).
I really think people just have double standards at this point, where their "home" platform's flaws are minor, and candidates to replace it must be flawless. (I'll also admit I'm the same, though NATURALLY I think I'm right: I figure if everything is electron and mismatched anyway, I might as well have a free-as-in-freedom operating system under it. Nobody is putting ads in my start menu or advertising Xbox Game Pass to me in my notifications.)
Then again plenty of modern browsers have some type of profile syncing built in, which does all this for you.
> email inboxes
Please don't use POP3. Your inbox should live on a remote server and simply follow your account. Storing your inbox exclusively on your PC will make you very sad some day.
Most cheaper/free email providers have a storage limit.
Besides, I disagree conceptually. If I want to reduce the risk of my email being read or handed to someone I don't trust, then removing it from the server is a good idea. I can make my own backups.
On the desktop side, the GNOME online accounts feature is pretty good at getting you most of the way there.
Then everything works... until you try to adjust the display brightness.
This on pre-2020 Lenovo laptops.
Take Ubuntu, for example. It’s one of the most popular and recommended distros for non-techy users, but just look at the install process: https://ubuntu.com/tutorials/install-ubuntu-desktop#1-overvi...
Let’s be honest, I don’t think most people would actually go through with that.
One idea to fix this and get more people to switch would be for Ubuntu to offer a Windows app that handles everything. It could download the ISO in the background, format the flash drive, install Ubuntu in dual boot with Windows by default, and clearly explain each step so users know how to start using Ubuntu or go back to Windows.
"Running Linux in a VM," as you have put it, is miles better because it works all the time with zero friction: no driver issues, random freezes, reboots, etc.
Hardware support issues are certainly understandable, but blaming "opinionated nerds" for them is asinine. It cannot be overstated how difficult it is to deal with certain OEMs.
EDIT: Beyond skill, just getting the external media is a substantial friction. I haven't used a thumb drive besides for Linux install media in 15 years; I'm good at computers but just finding / buying one of those things is its own roadblock.
This sort of thing used to be more common. My first exposure to Linux was before CD-Rs were ubiquitous so there was often no possibility of using external media if you downloaded Linux. Partitioning the drive and installing there was typical.
Ubuntu and Linux Mint are now recommending balenaEtcher, which is easier to use than Rufus.
For the tech, sure but for common people not so.
Why can't Ubuntu just offer a downloadable media creation tool like Windows does? Surely it's not that hard to couple dd with a basic GUI.
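For what it's worth, the core of such a tool really is just a dd copy plus a verify pass. Here's a minimal sketch of the mechanics, demonstrated on throwaway temp files so it's safe to run; a real tool would point the target at an actual block device like /dev/sdX and run as root, and everything here is illustrative:

```shell
#!/bin/sh
# Sketch of what a media-creation tool does under the hood.
# Uses throwaway files instead of a real USB device, so it is safe to run.
set -eu

WORKDIR=$(mktemp -d)
ISO="$WORKDIR/distro.iso"   # stand-in for the downloaded installer ISO
TARGET="$WORKDIR/usb.img"   # stand-in for the USB stick

# 1. "Download" an ISO: fabricate 4 MiB of data for the demo.
dd if=/dev/urandom of="$ISO" bs=1M count=4 2>/dev/null

# 2. Write the image to the target, as Rufus or Etcher would.
dd if="$ISO" of="$TARGET" bs=4M conv=fsync 2>/dev/null

# 3. Verify the write by comparing checksums.
if [ "$(sha256sum < "$ISO")" = "$(sha256sum < "$TARGET")" ]; then
  RESULT=verified
else
  RESULT=mismatch
fi
echo "$RESULT"

rm -rf "$WORKDIR"
```

The GUI part is picking the ISO and the target device and showing progress; the hard part in practice is safely enumerating removable devices so nobody dd's over their system disk.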
> to fix your busted drive, just nuke the boot sector and send it
>
>     dd if=/dev/zero of=/dev/xxx bs=512 count=1 conv=notrunc
On the other hand, if someone finds that part too complicated to follow perhaps they may not be able to install Linux - or Windows for that matter - by themselves and come across other issues down the line. Ultimately replacing your OS with another one does require some minimum level of technical knowledge that you either need to have or be fine with learning during the process.
The biggest sticking point is the fear of losing what they do have, but we're at the point where even their previous generation computer could be made to run Linux.
I guess I'm not surprised with how frequently "reinstall Windows" is offered as a solution, that there is now some lighter version of that. But really I was talking about obtaining/creating installation media and reinstalling from scratch.
I am almost certain something like this existed 15-20 years ago from Canonical.
https://atkdinosaurus.wordpress.com/2023/03/24/another-way-t...
Installing Ubuntu bricked a Samsung laptop I had some years back. Never again.
- Avoid requiring the user to figure out how to get into BIOS/EFI and change boot order. Windows has APIs for manipulating EFI things, may be worth looking into that.
- Replace GRUB with something more modern like rEFInd or Clover with a nice looking theme.
For the latter point, while GRUB is technically functional, it looks scary and arcane to new users and has little resiliency to things like Windows updates mucking with boot entries. It makes for a bad first impression (“why is my computer showing hacker screens suddenly”) and when it breaks your average user doesn’t have a prayer of fixing it. Something that looks more modern and self-heals would be a big improvement.
Can't help thinking that should be in a bigger font. It's a shame there doesn't seem to be a way to install Linux and keep your Documents directory at least. Is that due to file systems?
[Yes, yes, backup to memory stick/external drive but I'm talking about for your average person on the street]
So long as enough contiguous space is available to install the desired Linux distro.
You can't do this all on the same drive, because you need a place to copy the documents directory to. You need to delete the NTFS partition to create the place to copy the files to, but by the time you've done that, the Documents are inaccessible. You could do it in memory, feasibly, if you create a RAMdisk and are lucky enough to have enough memory for all your documents, but then you're still gambling on not running out of memory during the install.
So it is possible to copy the documents on the same device, and it's possible to even automate the process, but it's not possible to do it reliably or safely, and the reliability is so low that it's not worth even offering the possibility. If somebody has a handful of gigabytes of documents, it's already a nonstarter. To be safe you'd demand the user make a backup onto another device anyway, in which case they might as well do that and then copy the files into a fresh install themselves
It's not just shrinking and copying over to the new `/home` because of the locality of the data. If your NTFS partition is taking the entirety of the disk (minus EFI and system partitions), shrinking it will then make it take up the first X% of the disk. Then you have to make the linux installation on the last (100-X)% of the disk, copy the files over, and then when you delete the NTFS partition, your Linux filesystem is on the last half of the disk with a big blank unallocated area on the beginning. BTRFS or LVM2 could help a little bit there, but that's far from ideal in any case.
Probably the best approach would be to shrink NTFS, create a new partition at the end of at least the right size, copy the files over, then wipe the NTFS partition, install Linux as the first partition (after system/EFI and such), then copy the files into the user's home, and then remove the documents partition. That's still not super reliable, though. You are at the mercy of your documents sizes, filesystem fragmentation (remember, even if your filesystem is mostly empty, you might not be able to shrink if fragmentation is in a bad place. You could defrag, but then the install time can balloon up many hours for the defrag process alone, just to shrink a filesystem that you're going to delete anyway), how big the Linux install will end up being, and many other factors. You'd have a lot of people who simply can't copy their documents over on install who will be simply SOL. I can't think of a situation where this kind of thing wouldn't be better served by just telling the user to backup their documents to a USB drive and move them back afterward, because many people are going to have to do that anyway.
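To make the space constraint concrete, here's a back-of-envelope feasibility check for the shrink/stage/copy approach described above. All the numbers are hypothetical stand-ins for what an installer would actually get from ntfsresize --info, disk usage, and the distro's stated install footprint:

```shell
#!/bin/sh
# Feasibility check for the shrink/copy/reinstall dance described above.
# All sizes in GiB and purely illustrative.

DISK=256          # total disk size
DOCS=40           # size of the user's Documents
LINUX_INSTALL=25  # space the fresh Linux install needs
SHRINK_FLOOR=80   # smallest size NTFS can be shrunk to; fragmentation
                  # can push this far above the actual used space

# After shrinking NTFS to its floor, this much is left for a staging
# partition plus the Linux install:
FREE=$((DISK - SHRINK_FLOOR))
NEEDED=$((DOCS + LINUX_INSTALL))

if [ "$FREE" -ge "$NEEDED" ]; then
  echo "feasible: ${FREE}G free >= ${NEEDED}G needed"
else
  echo "not feasible: back up to external media instead"
fi
```

With these numbers it works out, but bump DOCS to a few hundred gigabytes, or let fragmentation raise SHRINK_FLOOR, and the margin vanishes, which is exactly why the external-backup advice wins in practice.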
It’s still a great device, it just sucks I’m stuck with windows (10).
And that's it, they are lost and tired at that point. They will just go back to Windows.
In a more reasonable world they’d owe their customers a recall.
A well configured firewall between your computer and the internet, uBlock Origin in the browser, and not downloading untrusted files off the internet can do a long way to help. Not stopping everything but at least shielding you from the worst.
I think the bigger issue is like on iPhones and Androids. Your software and apps stop supporting your OS long before the hardware or OS fails you.
From what I understand, even Windows 11 still has support for SMBv1.
But my point was that your standard “up to date” XP install in 2016 was highly vulnerable and could effectively be nuked by such an attack. It took nearly 7 years after support ended for that to happen. So you could theoretically get another 7 years out of Windows 10 before a similar situation happens where a global cyberattack negatively impacts you with no way to protect yourself because your OS doesn’t support a configuration that would prevent you from being a victim.
Btw I do have a spare PC; it only got Win10 because the GPU didn't support 7, and it's not getting 11 even though it supports it. It's Microsoft's job to keep that secure.
Yes, it is often possible to upgrade your PC hardware to make it compatible with Windows 11, but the feasibility and cost depend heavily on which specific requirements your current PC fails to meet.
Windows 11 has stricter hardware requirements than Windows 10, primarily focusing on security and modern capabilities. The key hurdles for older PCs are usually:
CPU (Processor) Compatibility:
- Requirement: 1 GHz or faster with 2 or more cores on a compatible 64-bit processor. Microsoft maintains a list of approved CPUs; generally this means Intel 8th Gen (Coffee Lake) or newer, and AMD Ryzen 2000 series or newer.
- Upgradability: This is often the trickiest and most expensive upgrade. If your CPU isn't on the list, you would likely need to replace the motherboard AND CPU (and possibly RAM, as newer motherboards often require different RAM types). This is essentially building a new core system and might not be cost-effective for an older PC.

TPM (Trusted Platform Module) 2.0:
- Requirement: TPM version 2.0, a hardware security module that stores cryptographic keys.
- Upgradability: Many PCs manufactured in the last 5-7 years actually have TPM 2.0 (or fTPM/PTT, firmware-based TPM) but it might be disabled in the BIOS/UEFI settings; enabling it there is the easiest fix. Some older motherboards (typically from around the Intel 6th/7th gen or similar AMD era) have a TPM header where you can purchase and install a physical TPM 2.0 module, a relatively inexpensive upgrade. If your motherboard has neither an integrated fTPM/PTT nor a TPM header, you would need to replace the motherboard (which usually means a new CPU and RAM too).

UEFI Firmware with Secure Boot Capability:
- Requirement: Your system firmware must be UEFI (Unified Extensible Firmware Interface, a modern BIOS replacement) and Secure Boot capable.
- Upgradability: Many modern PCs are UEFI-capable but might be running in "Legacy BIOS" or "CSM" (Compatibility Support Module) mode; you can often switch to UEFI mode in your BIOS/UEFI settings and then enable Secure Boot from there. Very old PCs might only support Legacy BIOS and not UEFI at all, in which case a motherboard replacement would be necessary.

RAM (Memory):
- Requirement: 4 GB or greater.
- Upgradability: Usually the easiest and cheapest upgrade. Most desktops and many laptops allow you to add more RAM.

Storage:
- Requirement: 64 GB or larger storage device.
- Upgradability: Easily upgradable; you can replace a smaller HDD/SSD with a larger one.

Graphics Card:
- Requirement: Compatible with DirectX 12 or later with a WDDM 2.0 driver.
- Upgradability: Most integrated and dedicated graphics cards from the last several years meet this. If yours doesn't, you could install a new graphics card (for desktops) or be out of luck (for laptops).

How to Check Your PC's Compatibility: The best way to determine what specifically is holding your PC back is Microsoft's PC Health Check app. It will tell you exactly which requirements your system meets and which it doesn't.

Summary of Upgrade Possibilities:
- Most common and easiest: enabling TPM 2.0 in BIOS/UEFI, enabling Secure Boot (after switching to UEFI mode if needed), adding more RAM (if less than 4 GB), and upgrading the storage drive.
- More involved and potentially costly: adding a physical TPM 2.0 module (if your motherboard has the header), upgrading the CPU (often requires a new motherboard and RAM too), replacing the motherboard (almost always requires new CPU and RAM), and upgrading the graphics card (for desktops).

Is it worth it? For older PCs that require a new CPU and motherboard, it often makes more sense financially to purchase a new PC that comes with Windows 11 pre-installed or is fully compatible out of the box. The cost of individual component upgrades can quickly add up, and you'll end up with a system that's still fundamentally older than a brand-new one.
However, if you only need to enable TPM/Secure Boot in BIOS or add RAM, it's definitely a viable and cheap way to get on Windows 11.
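The checklist above boils down to a handful of threshold tests. A toy version, with hypothetical inputs where the real PC Health Check app reads the values from firmware and the OS:

```shell
#!/bin/sh
# Toy compatibility check mirroring the Windows 11 requirements above.
# All inputs are hypothetical stand-ins for values probed from hardware.
CPU_GEN=7       # Intel Core generation; 8th gen or newer required
TPM_VERSION=2   # TPM 2.0 required
UEFI=yes        # UEFI with Secure Boot capability required
RAM_GB=8        # 4 GB or more required
DISK_GB=256     # 64 GB or more required

FAILED=""
[ "$CPU_GEN" -ge 8 ]     || FAILED="$FAILED cpu"
[ "$TPM_VERSION" -ge 2 ] || FAILED="$FAILED tpm"
[ "$UEFI" = yes ]        || FAILED="$FAILED uefi"
[ "$RAM_GB" -ge 4 ]      || FAILED="$FAILED ram"
[ "$DISK_GB" -ge 64 ]    || FAILED="$FAILED disk"

if [ -z "$FAILED" ]; then
  echo "compatible"
else
  echo "blocked by:$FAILED"
fi
```

With these numbers only the CPU check fails, which matches the pattern people keep running into: the machine is otherwise fine, but the CPU allowlist is the one requirement you can't fix without effectively building a new computer.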
So what?
And if enough people move to Linux even those holdouts will eventually have to support it. The Steam deck has been the gateway drug to Linux for the masses, and I’m stoked for it. Moving to Linux for my desktop gaming machine was the single best decision I made 5 years ago, and I haven’t used Windows since. It’s more stable than Windows ever was, and I also don’t have an errant update break a game, the system, or cause a reboot at the worst possible time.
There's been a ton of progress; thankfully people keep using Linux despite the very vocal, frustrated "failed" migrations.
If you are not driven by curiosity, most of the time the driver is money, a vision of software as only an occupation, work-life balance, etc.
Those are usually the kind of people who are not excited by software, don't have a passion for it, and even take the passion away from others.
Even if it said go install Ubuntu or something... Very few people think of a kernel and OS as separate things. Hardware and software separation is already sketchy enough. Instead of people interjecting for a moment, can there just be a penguin-branded "Linux" OS already?
Nobody in their right mind would claim that they are building the official Linux OS without turning the whole community against them.
And it's not as if the average user needs to use Linux. If developers moved from Windows 10 to Linux, the impact would be huge.
Nobody is upset that there's an official Linux kernel. Of course it takes Linus Torvalds to declare it, and he's not interested in making an official OS. But this is the consequence.
One of the biggest faults of Linux is we don't have an easy, user friendly, idiot proof distro for normies, but Ubuntu is just broken corporate slop.
When I was wearing the various "save users from themselves" hats in my previous life, Ubuntu users were 100% the bane of my existence... since they were all server customers, the ones that took my advice and let me help them switch over to Debian suddenly stopped being frequent footgun fliers, no matter what their original issue was.
Ubuntu, to me, is simply Debian that has been aggressively turned into enterprise slop.
Even more confusing and frustrating is the "it depends on your use case" thing, as if 99.99% of PC users aren't all trying to do the same basic things.
Anecdata: a mate of mine plays Helldivers 2, and thought he couldn't play it or it wouldn't work well. I told him I had played it and it worked fine. Two days later, he's using Linux and getting better performance than he was on Windows.
It has been five years of gaming exclusively on Linux, and I have yet to find a game I can't play, with the only exceptions (for me) being League of Legends and iRacing. But I can live without them. If you don't play extremely competitive online games, yours will probably work too. My rule of thumb: "are there IRL pro tournaments for money?" If there aren't, it'll very likely just work.
My only tip is just use something common. Ubuntu, Mint, PopOS, Arch, ZorinOS, Kubuntu… all will probably work with zero effort. Don't go mucking about with weird distros and bizarre tweaks, and you're more than likely gonna have the most stable system you've ever used.
I cannot recommend Linux highly enough. Five years ago I was skeptical and unsure but tired of Windows bullshit and here I am— still loving it. I’ve fully upgraded the system recently, except for the GPU (because 5090 prices are ridiculous and I don’t want less VRAM than my 3090 has) and it even booted from my old install and just worked.
Try Linux, friends. It’s pretty freaking great these days.
At a fraction of the time spent following this guide you can extend Win 10 by a few more years by switching to LTSC, or go to Win 11 bypassing all the software restrictions
Think about the demand and supply curves of calculations (or computation). For most of history, they moved in tandem, with supply moving slightly faster, so computers would always do more at slightly lower costs.
Now both curves are speeding up, but demand is moving faster, so the costs of hardware are going up. And when high end servers (with GPUs) are unavailable, people hold onto the older ones longer.