That, of course, imposes some limits WRT package visibility and other policies you might want to enforce - you can't easily limit a certain set of users to a subset of your repo.
WinGet is more or less just downloading the installers and running them, and doesn't properly track the installed applications and isn't always able to update them.
How so? I’ve been using it for years and haven’t had a problem yet updating all applications at once.
Things can get tricky with applications that are installed with WinGet but come with mechanisms to update themselves. If this self-update skips adjusting the right knobs and values in the registry, WinGet will assume that the application is still on the initial version.
For example, this is the case with Obsidian.
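The failure mode can be sketched in a few lines (the registry shape and version numbers here are illustrative, not winget's actual data model):

```python
# Toy model of the mismatch: winget decides "is an upgrade available?" by
# comparing the manifest's latest version against the DisplayVersion that
# the installer wrote to the app's uninstall registry key at install time.
registry_uninstall = {"Obsidian": {"DisplayVersion": "1.4.16"}}  # never rewritten
actually_running = "1.6.3"   # the app self-updated outside winget
manifest_latest = "1.6.3"

installed = registry_uninstall["Obsidian"]["DisplayVersion"]
if installed != manifest_latest:
    # winget offers an "upgrade" even though 1.6.3 is already on disk
    print(f"Obsidian {installed} -> {manifest_latest}")
```

If the self-updater rewrote DisplayVersion after each update, the comparison would come out clean and winget would report the app as up to date.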
It's "just" a tool which will fetch installation manifests from a centralized Microsoft GitHub repository and execute them. Exactly like brew or Chocolatey. It's fine for a third-party "package manager" but it feels pretty weak for an official system tool.
Also, if I'm not mistaken, it's only available as a CLI tool, which makes it pretty useless for 95 percent of Windows users and unattractive for developers wanting to distribute software with it.
The thing is useful for sure but it’s far from a Linux package manager.
Back in my day this would be seen as an exercise left for the user, and thus a new junior dev was born building a front end.
UniGetUI looks really cool.
https://github.com/marticliment/UniGetUI - 16.2k stars
If you have the ability to financially support them or contribute code somehow, do it.
I find that behavior incredibly annoying. I mainly use Chocolatey, so every once in a while when a package is heavily outdated or missing from the repo I end up using Winget instead for convenience's sake. That means Winget keeps trying to update or manage Chocolatey packages, and as far as I know, there's no easy way to stop that.
Hence WinGet, a Microsoft owned and operated alternative that those firms may feel less jittery about.
The winget repo has a ton of useful installation manifests, sure, but winget isn't "just" a tool to fetch manifests from that one repo. (Also, it doesn't fetch the data directly from GitHub; even though that is the source of truth, there is a light REST service in between which does a lot of caching and DDoS management and what have you.) By default winget installs Windows Store apps, too. It's also configurable, so you can add your own installation manifest repos if you wish (such as on-premise private feeds).
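For reference, adding a private feed is done with `winget source add`; the name and URL below are placeholders, and `Microsoft.Rest` is the source type used for REST feeds:

```shell
# Show the configured sources (defaults are "winget" and "msstore")
winget source list

# Point winget at an on-premise REST feed
winget source add --name internal --arg https://pkgs.example.com/api --type Microsoft.Rest
```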
Even now corporate customers need to individually package software themselves to manage applications in their fleet.
My guess is that Microsoft encouraged applications to share DLLs from the start, and to provide backwards compatibility Microsoft never enforced MSI or a mature software management framework.
Finally, many of these same apps had no uninstaller. You had to hunt throughout the system to remove all the stuff they installed, including preference files and cache files just in case you wanted to reinstall without having problems down the road.
1. It's easy to inadvertently break one's system. How often have users accidentally uninstalled their desktop environment due a buggy dependency specification or dependency solver? Shouldn't there be a whitelist of core system packages and files that should never be touched during ordinary package transactions? There was also a Fedora bug maybe 1 year ago where a problem with the Google Chrome RPM's GPG signing key blocked system updates unless one manually overrode the package manager transaction to skip broken packages. Imagine if Chrome could cause Windows updates to fail or if a misconfigured Homebrew package could block MacOS updates.
2. It's easy to accumulate cruft over time because there's no out-of-box tracking of software I've added compared to the base system. I could manually keep a list in a text file, but what about any dependencies of the packages on that list? What about any config files in `/etc` left behind by packages even after they are uninstalled? I'd like an easy way to revert my system to its out-of-box condition without carefully inspecting every line of `dpkg -l` (of which there could be hundreds or thousands). With Homebrew on MacOS I can just blow away `/opt/homebrew`.
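The bookkeeping being asked for can be sketched portably; on Debian the real inputs would come from `apt-mark showmanual` snapshotted on day one (the package names below are just illustrative):

```python
# Baseline: manually installed packages captured right after OS install.
baseline = {"bash", "coreutils", "systemd", "gnome-shell"}

# Today's manually installed set (e.g. from `apt-mark showmanual`).
manual_now = {"bash", "coreutils", "systemd", "gnome-shell", "htop", "ripgrep"}

# Everything the user added since day one -- the list you'd review or
# feed back into the package manager to revert toward out-of-box state.
added = sorted(manual_now - baseline)
print("added since install:", added)
```

Dependencies pulled in automatically are excluded by construction, since `apt-mark showmanual` only lists packages marked as manually installed.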
rpm/yum/dnf actually have a system for this called protected packages which can't be uninstalled without some ceremony on the part of the caller. Distros use this feature quite sparingly and reserve it for cases where you will truly break your system. Sometimes you want to uninstall your DE.
But I agree that it could be so much better.
WinUI3 (if anyone ever bothers to use it, including Microsoft) already distributes its library dependency this way, as a store package.
I think this is a large part of the problem: within the range of applications MS offers, there's a range of ways they get distributed, installed and managed. Will Office use it? How about Visual Studio, Teams, various Windows components? It'd be more 'sit up and listen' interesting if MS committed to using it themselves, showed it works for a range of use cases and was great at doing it.
If Office is no longer the special case in Windows Update and more applications can use it, that would be interesting. A lot of third party drivers have already been using it more, and that also seemed a special case before. Opening it up as a platform for any third party seems like a long time coming.
(Visual Studio is an interesting case, too, because some of it has always had security updates in Windows Update, but yet more of it is not updated that way than is. Originally the border lines were "owned by Windows components" versus "Visual Studio owned components" but those lines have become so blurry, especially in the .NET 5+ era where Windows no longer owns anything about .NET, but Windows Update still serves critical security patches.)
In the mid 90s, a FreeBSD user could build their entire operating system and apps with code and tools managed by FreeBSD.
Eventually systems like Debian improved on this, but FreeBSD was first.
Unless I am misunderstanding what you mean by build.
Then, once you've built it out, you need to convince software vendors to use your gatekeeping installation mechanism, and hope they believe the executives won't see this as leverage to extract rents later.
I am surprised that something like sparkle hasn’t found footing on Windows.
I was shocked when I switched to macOS. I couldn’t believe how much better the typical install experience was compared to Windows. Just drag the downloaded file into a folder. No need to run some bespoke install wizard. Even when applications did need to run something to install, it was almost always just the same (presumably system-provided) install flow.
It's really interesting to compare MSI to MSIX which is ZIP/XML instead of CAB/weird JET DB file.
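To illustrate that MSIX really is plain ZIP/XML under the hood, here is a toy sketch; the manifest content is illustrative, not a valid MSIX manifest:

```python
import io
import zipfile

# Build a toy archive the way a packaging tool might: an MSIX is an
# ordinary ZIP whose metadata lives in XML files such as AppxManifest.xml.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("AppxManifest.xml",
               "<Package><Identity Name='Demo.App' Version='1.0.0.0'/></Package>")
    z.writestr("app.exe", b"...binary payload...")

# Reading it back needs nothing but a ZIP reader -- contrast with MSI,
# whose tables live in a COM structured-storage database.
with zipfile.ZipFile(buf) as z:
    manifest = z.read("AppxManifest.xml").decode()
print(manifest)
```

The same extraction works on a real .msix with any unzip tool, which is exactly the "you can extract them but not create them" asymmetry mentioned elsewhere in the thread (creation requires block hashing and signing).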
[1] https://learn.microsoft.com/en-us/sysinternals/downloads/aut...
A user could accidentally do it and end up with a 'broken' menu they don't know how to fix, and Windows being 'broken' in that way is Windows' fault from the perspective of such a user.
This sort of thing can and does cause a support burden, which is an expensive tradeoff. So rather than it being a built in capability, a user would need to manipulate the registry or use a third-party program to do it for them.
At least, that's the reasoning that would've come up at MS when adding such a feature was suggested internally (and it certainly has been)
Classic Macintosh systems did not have a user-facing command line at all.
- No concept of installers apart from an INSTALL.COM or INSTALL.EXE provided by the vendor.
- The installer, when there was one, often just copied stuff to a new root-level subdirectory, selectable at install time. Sometimes you just had to make your own subdirectory and copy everything yourself.
- Often everything regarding the application was done in that subdirectory, including running executables, reading data, writing data, and often saving documents. This was very different from the UNIX tradition of putting executables in /bin, and read/write data in /etc or /var, with appropriate permissions set.
Other interesting stuff:
- Apart from a couple of files (IO.SYS, MS-DOS.SYS) needing to be the 1st and 2nd "inodes" on the disk (so the bootloader could find them), and CONFIG.SYS and AUTOEXEC.BAT having to reside somewhere in the root directory, the kernel of MS-DOS didn't really care at all about any other file. Even COMMAND.COM could be anywhere you want - you would tell MS-DOS where it was with the COMSPEC= setting in CONFIG.SYS. So all your DOS external commands could be anywhere (and reachable if a PATH command was in your AUTOEXEC.BAT), although I believe the MS-DOS installer put them at \DOS or \MSDOS, so that was probably pretty de-facto standard.
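From memory, a typical pair of boot files that relocated COMMAND.COM and the external commands looked roughly like this (paths illustrative):

```
REM CONFIG.SYS -- SHELL= names the interpreter; DOS sets COMSPEC from it
REM so the transient part of COMMAND.COM can be reloaded from that path.
SHELL=C:\DOS\COMMAND.COM C:\DOS\ /P

REM AUTOEXEC.BAT -- make the external commands reachable
PATH C:\DOS;C:\UTILS
```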
So... DOS, the precursor to Windows - it was anything goes.
When Windows became a thing (version 3.x was when it took off), the above is typically how users worked with programs under MS-DOS at the time. It's why programs tended to do everything in their "C:\Program Files" folder.
And I don't know when Microsoft developed the arcane and overengineered .MSI system but it wasn't right when Windows NT came out in 1993 and I think it wasn't even there for Windows 95 when that came out. Even if Microsoft did have .MSI right with the first release of Windows NT/95, there were still many existing programs that didn't use it and wouldn't use it right away. So Microsoft had to support the existing mess and habits from DOS days.
I do remember the full screen setup.exe programs with the blue background...
I would always cringe whenever I noticed how other folks would have everything installed in the top-level folder, or sometimes in C:\WINDOWS or other random places.
If I were to do it over again today, I would do it differently: I’d install programs that are strictly for doing stuff TO the computer itself in C:\UTILS, and everything else in C:\APPS.
https://news.ycombinator.com/item?id=44118703
Not much uses it because very little new development happens for Windows, even by Microsoft. Everyone either uses portable frameworks and inherits the defaults, which aren't MSIX, or has legacy systems they developed from before MSIX got good.
You only get Windows laptops and desktops from Microsoft, but they are highly secure (similar to what Apple achieves).
Everything else needs a windows pro license (with tight checks).
I’m fairly sure that would improve windows’ security posture by a huge lot.
If they want to compete with Apple using Apple’s strategy, they may face a losing battle.
What package handler? Installing things on macOS is still a mixed bag of disk images with the app to move yourself, or .pkg files, or the App Store.
The thing is so broken that brew is the first thing I install on a new Mac.
They're probably referring to Homebrew, which, quite honestly, makes macOS barely bearable. The terminology sucks and the Ruby language doesn't help. macOS without Homebrew is unbearable.
Microsoft's backwards compatibility got them massive market share but also backed them into a corner. Package managers only work if there are some constraints, but I came across software that was dropping .ini files into C:\Windows\System32 in 2017.
I think they tried with Secure Boot, but pushback from Linux people and maybe fear of antitrust stopped them (for now).
And maybe if they do this, hardware vendors may fear a market split where they lose Linux people to other vendors. Not that many people, but it still is revenue loss. I know I will never ever buy a Microsoft-only device. Bad enough that smartphones are locked down; at least I can ignore the phone.
Secure boot as far as Linux is concerned is extortion from the users.
During covid and the supply/demand mismatch as everyone rushed to WFH I was wondering if they could repurpose the cheap S xboxes as cheap desktops. Essentially a reversal of the original 'xbox as a trojan horse' idea, instead of using consoles to get windows in the living room, it's to get windows in the home office.
To be honest, I kind of understand why they don't want to do that. I bought a Surface Pro 8 some years ago and it's probably the worst computing hardware I've experienced in a long time. Even basic things like thermal management are horribly broken when using Windows on it. Running Linux on it gave a slightly better experience, but it seems so backwards that they cannot even make their own hardware work well. I thought the combination of hardware and OS by the same company would lead to a better experience, but nope.
Besides, the antitrust regulators would absolutely hate this.
One of the reasons they're still much more prevalent than Apple is because they don't.
No one seems to care, I expect the AI PCs to eventually sell for 75% discounts.
They make the majority of their money from businesses with lucrative support contracts, Azure, Active Directory + Office Enterprise suite, etc.
They make the majority of their money from consumers via stuffing Windows with "promotions" (ads) and from Office 365. It is services. Windows barely earns them anything.
Hell, the fact that they're thinking about opening up the Xbox to Steam and have official tutorials on running Gamepass on iOS and Linux should tell you that they don't care what OS on which device you use, so long as you are subscribed to their services.
In an environment like that, a vertical integration play makes little sense. You want your services to be on as many platforms as possible, not attract ire and roadblocks from your partners.
On top of that, they don't have a phone platform onboarding people to the whole hardware ecosystem. Even for Apple, Mac + iPad + AirPod profits are dwarfed by iPhone profits.
> Microsoft’s Windows Package Manager has also tried to solve some of the problems with installing and updating apps on Windows
Nth time is the charm?
> but it’s not a widely used way to install and manage apps outside of power users and developers.
Nth time of ignoring what's there and instead building another system.
I think it is actually a good move for them to embrace supporting updates for all the non Windows Store stuff in a first party way. There is just way too much software that will never be a Windows Store application. Besides that, more trustworthy checks for malware etc. are included in such a first-party system than in what was already there, simply by having more available infrastructure. It also increases the chances that they can convince developers to move to using install/update libraries that play nice with this official package management tool.
At this point in time, I would definitely advise everybody to start out installing most Windows stuff via winget or if it isn't listed there via Chocolatey.
[0] https://en.wikipedia.org/wiki/Windows_Package_Manager#Histor...
Meanwhile, any chance of finally fixing OneDrive file renaming to work without issues? Is there an 'update' pending for that? Just a random thing I come across daily, among dozens of others that slow my work and distract me at the proudly presented OS level.
... or perhaps when I say 'Update and shut down', then actually shutting down in the end instead of restarting and spinning the fans the whole night (me believing the poor thing was shut down as prom... suggested)?
For home use, I can see this being good as a large segment of users don't stay up to date on security patches, but what about breaking workflows in businesses?
Some businesses deliberately do not upgrade some software packages, because it would break stuff. It makes sense to just push all updates in a subscription-based economy, but this could also quickly become exactly that: a subscription hell-hole where companies push updates and now your old licensed copy of Photoshop is converted to Creative Cloud.
The idea isn't bad, I just don't trust modern software companies to be able to manage this in a way which makes customers happy.
> The main goal of this project is to create an intuitive GUI for the most common CLI package managers for Windows 10 and 11, such as WinGet, Scoop, Chocolatey, Pip, Npm, .NET Tool, PowerShell Gallery and more (Check out the package manager compatibility table)!. With this app, you can easily download, install, update, and uninstall any software published on the supported package managers — and much more!
https://github.com/marticliment/UniGetUI - 16.2k stars
Because they can just create and write to random directories whenever they want. And any uninstaller, whether provided by the app or by Microsoft, could just miss these files, because it isn't recreating the full program control flow.
They did not even use Windows update for that one.
There's a better way, which I am shamelessly self-promoting in this thread (as it's 100% on topic) - my company makes a tool that can ship self-updating Electron apps and beyond being not abandoned, it's got a lot of really useful features, like being able to do the build and upload of signed updating packages (using the tech MS is pushing here) from Linux CI workers, without needing a Windows license.
It can also do forced updates on launch, which can be helpful for apps where the protocol between client and server changes regularly. And it plays well with corporate Windows deployments: people can install apps locally without needing administrator access, but it still goes into C:\Program Files.
Think about it another way: if they install in AppData, they can likely bypass IT depts and other business bureaucracies and get a foothold somewhere in an organization. It’s absolutely malicious, both in terms of tech and business practices, but it works.
This surprises me; I would have thought users would expect package removal to mean only package removal.
Most of the time there isn't one.
Where do you draw the line at removal? Configuration files, user-created documents, config files on networked home directories?
I don't think you are being unreasonable; maybe we need better cleanup depending on what the user needs.
It would mean tracking the creator of files and tagging them appropriately, then using this list during uninstall.
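The tagging idea amounts to a per-package file manifest; a minimal runnable sketch (using temp files so it can execute anywhere, not real install paths):

```python
import json
import os
import tempfile

# Stand-in for an install root (a real installer would use Program Files).
root = tempfile.mkdtemp()

def install(files):
    """Write the package's files and record every created path in a manifest."""
    created = []
    for name, data in files.items():
        path = os.path.join(root, name)
        with open(path, "w") as f:
            f.write(data)
        created.append(path)
    with open(os.path.join(root, "manifest.json"), "w") as f:
        json.dump(created, f)

def uninstall():
    """Remove exactly the files the manifest says this package created."""
    with open(os.path.join(root, "manifest.json")) as f:
        for path in json.load(f):
            os.remove(path)

install({"app.exe": "binary", "settings.ini": "cfg"})
uninstall()
print(sorted(os.listdir(root)))  # only the manifest itself remains
```

Files the app creates *after* install (caches, user documents) are not in the manifest, which is exactly the line-drawing problem raised above.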
And this is why having a solid working relationship between all levels of IT and your users are so important. It really is customer service first, tech second. If your users trust you as the IT admin, they’ll know to ask first before downloading AppData installers like this before they become a job ending issue for them.
I don’t understand why Nadella hates windows so much.
But for most home users, it's not a big deal. I imagine 99% of home users are behind a NAT, and being behind NAT means external attackers aren't going to be able to connect to your machine and run remote exploits (ie, EternalBlue). The only way to get compromised is to get trojaned, in which case a Windows update wasn't going to save you anyways. At best, it means a trojan might have a slightly harder time escalating to Admin/SYSTEM without getting caught, but a trojan doesn't need Administrator permissions to ransomware your Documents folder or add your machine to a botnet.
As long as your browser is up to date, you'll be fine.
That said, I also hate Windows updates, and especially the way Windows handles them. LTSC is also my way to avoid some of it, especially the ""feature"" updates. LTSC is something I also recommend, if people can manage an activation server, or I can point them to mine.
Hitting default gateways for web admin panels etc.
I found the solution for Windows Update though. Just don't use Windows. Microsoft can't be trusted.
Microsoft isn't going to test any of the updates themselves before sending them through the central update feature.
And if you can donate money to this person at all, do it.
MSIX is an interesting beast. My company sells a tool called Conveyor [1] that can create these packages from any platform including macOS and Linux given a simple config file for apps using runtimes like Electron, Flutter or the JVM (it's free for open source projects). We do a lot of work to make MSIX work better and be easier to use, because out of the box it's quite raw and in particular there are a lot of bugs in Windows 10 that Microsoft never fix because they view it as EOL. Conveyor creates a tiny 500kb installer EXE that drives the MSIX package manager API to do the install whilst working around these bugs.
Amongst other things, MSIX gives you:
• Chrome-style silent background updates on a regular schedule, even if the app isn't running.
• Incremental block-based delta updates.
• Incremental block-based downloads and installs, i.e. Windows can re-use parts of one app to install another, based on file block hashes. Makes installs very fast when they share a common runtime!
• And those installs/updates can pull blocks from other machines on the LAN too!
• Declarative installs and OS-controlled uninstalls. Writes to the user's AppData directory are virtualized, so uninstalls can be clean.
• Packages can be installed without admin rights, without dumping stuff in the user's home directory. Windows runs an elevated service that does the install for you.
• You can sandbox apps if they're shipped with MSIX.
• EXEs can be automatically added to the user's path, without needing any terminals or shells to be restarted. MacOS can't do that!
• Windows admins can easily deploy and manage them.
• They're cryptographically signed and their integrity is protected by the OS, so malware can't fiddle with the binaries (unless it manages to elevate to root).
• Although you can't create them with a regular zip tool, you can extract them with one.
Conveyor adds some other features on top like the ability to have web-style "update immediately on launch" updates, and a simple Electron/JVM control API so you can force updates on users.
It's a pretty nice feature set overall and has some big advantages over Squirrel, which is what Electron uses. However, I would definitely NOT recommend you try and work with MSIX directly. Microsoft's tooling is quite awkward, and their policy of only supporting fully up to date Win11 machines - and only from Windows - means you can't realistically distribute apps using MSIX unless you go via an intermediary like Conveyor that's committed to making it work. You'll just hit lots of weird bugs and installs that fail for no obvious reason with mysterious error codes. We ploughed through the pain so you don't have to.