https://devblogs.microsoft.com/oldnewthing/20180122-00/?p=97...
The first release of git was in 2005, around a decade after Windows 95.
Win 95 feels like era 1; XP and git were already in era 2.
Once those two changes were done by 2010, though, there's been no game changer; if anything we've regressed through shittyfication (we seem to have fewer social networks than in the original Facebook era, for example, as most of them turned into single-player feed consumption).
Maybe the pre- and post-LLM split will feel like an era change in a decade as well?
Of course, `git send-email` has a plethora of options, e.g. you'd typically add a cover letter for a patch set.
Also, in the Linux kernel tree, there are some additional helper scripts that you might want to run first, like `checkpatch.pl` for some basic sanity checks and `get_maintainer.pl` that tells you the relevant maintainers for the code your patch set touches, so you can add them to `--cc`.
The patches are reviewed/discussed on the mailing list that you sent them to.
On the receiving side, as a maintainer, you'd use `git am` (apply mail) that can import the commits from a set of mbox files into your local git tree.
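For anyone who hasn't done it, the round trip looks roughly like this (a minimal sketch; the three-commit series and the addresses on `--to`/`--cc` are placeholders):

```
# from a kernel tree: turn the last 3 commits into mailable patches,
# plus an editable cover letter (0000-cover-letter.patch)
git format-patch -3 --cover-letter -o outgoing/

# kernel helper scripts: basic sanity checks, then who to CC
./scripts/checkpatch.pl outgoing/*.patch
./scripts/get_maintainer.pl outgoing/*.patch

git send-email --to=linux-kernel@vger.kernel.org \
    --cc=some.maintainer@example.org outgoing/*.patch

# maintainer side: import the series from an mbox into the local tree
git am series.mbox
```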
> ...and I immediately got flamed by several people because no one used patches any more.
How are these ideas connected? The intent of git is that you work with patches.
Unlike patches, pull requests aren't even a feature of git.
It just really highlights how much better BitKeeper's and then Git's designs were compared to what came before. You then pile on being free/OSS, and being "proven" by an extremely large, well-known, and successful project on top, and you have yourself explosive growth.
There are developers around these days who never had the displeasure of using the pre-Git source control offerings; it was rough.
Diff3 is from 1979 (https://en.wikipedia.org/wiki/Diff3), so three-way merges (https://en.wikipedia.org/wiki/Merge_(version_control)#Three-...) predate git by decades.
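For the curious, a three-way merge takes the common ancestor as a third input; with the original tool it's a one-liner (file names hypothetical):

```
# merge two divergent copies against their common ancestor;
# -m emits the merged result, with conflict markers where both sides changed
diff3 -m mine.c ancestor.c theirs.c > merged.c
```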
Priceless.
Same year, I deleted all our customers' websites by accidentally dragging the hosting folder into C:\programs or somewhere... A double click plus lag turned into a drag and drop! Whoops!
I was pale as a ghost as I asked for the zip drive.
We had to reboot the file server first, which we did with a swift kick to the power button.
At least today we employ very secure mechanisms, like YAML rollouts of config, to keep things interesting.
But the percentage is probably small, yes.
Google still uses a clone of Perforce internally (and various wrappers of it). Perforce was released in 1995.
For code, I prefer git as I said, but in a game's depot most files are not code, and Perforce is built around handling those other assets well.
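The key mechanism, as I understand it (depot paths here are made up), is that un-mergeable binaries get the `+l` filetype modifier via the typemap, so checking one out takes an exclusive lock:

```
# in the typemap (p4 typemap), something like:  binary+l //depot/....psd
p4 edit //depot/game/art/hero.psd      # takes the exclusive lock
p4 submit -d "Repaint hero texture"    # submit releases it
```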
Which makes sense, between the "if we change it we break it in some subtle way" and "we don't expose that in UI anymore so the new panel doesn't have it".
My understanding is that Windows wants to move to a model of "you can't configure much of anything, unless you use Group Policy, and then you set everything through that", so they don't update the settings and don't include them in the new screens for 90% of things. But then they have this huge moat of non-Active-Directory users who still need to go into the settings, and my god are they bad.
ODBC Data Source Administrator (64-bit)
Configure > untick "Use Current Directory", Select Directory
Gotta love that the disk and directory picker survived 20-30 years.
In Windows 10, WordPad and Paint can both bring up the classic Windows 3.x colour picker window, complete with the inscrutable Custom Colours bit. WordPad is gone in Windows 11, and I don't think the Windows 11 Paint has the classic picker, but it still (IIRC) arranges the colours in its new picker based on the classic picker's default colour set. Those colours were chosen because they dither nicely to 16 colours with the Windows 3.x dither algorithm.
It curled, and we got the registry.
Simple example: I wanted to customize my gestures in GNOME. I installed another app for it on the recommendation of multiple Stack Overflow and Reddit threads.
I ended up losing the default GNOME gestures, and even disabling the app didn't help.
I only use my Windows 10 LTSC installation now. (Where, fwiw, I do have an absolute *ton* of customization/"ricing" apps for everything from custom UX themes to taskbar tweaks. Amazingly, pretty much everything is stable.)
As a long-time Windows user, I wish Linux copied this feature more.
And that's not getting into the question of whether something is a kernel issue at all. It could also be the distro's responsibility to provide the tools to change the settings.
Basically, it's a lot of people with no obligation to each other trying to work in concert.
The situation on Windows is different. Windows is both the kernel and the shell and the window manager and the provider for a lot of the core tools.
Apple sidestepped the issue with OSX. They took a robust kernel, FreeBSD, and created a GUI and tools on top of that. I think they also essentially took over FreeBSD or at least forked it internally.
They used NeXT's XNU kernel, which was a merger between CMU's Mach and Berkeley's 4.3BSD. They later refreshed it with code from OSF's MK derivative of Mach (which also incorporated some code from the University of Utah) and code from FreeBSD, and have added a huge amount of new code of their own. They continue to pull new code from FreeBSD every now and again, but it isn't so much a plain fork of FreeBSD as a merger of parts of FreeBSD with a lot of other stuff of a completely different heritage.
(Heck, recently I migrated a VM to its third hypervisor. It began as a physical machine a quarter century ago.)
This is how I think about music and Spotify. Pretty much all music exists, you 'just' have to remember everything that exists and what it's called so you can find it.
Recent HN link: Red Alert 2 in your web browser. A game from 25 years ago; you can unofficially download the original C++ version from the Internet Archive and upload it to the website, which extracts the assets so you can play a JavaScript-based reimplementation inside a web hypertext document browser.
Edit: https://devblogs.microsoft.com/oldnewthing/20190830-00/?p=10... seems the justification was that UTF-8 didn't exist yet? Not totally accurate, though it wasn't fully standardized yet. Also, that other article seems to imply Windows 95 used UTF-16 (or UCS-2, but either way 16-bit chars), so I'm confused about porting code being a problem. Was it that the APIs in 95 were still kind of a halfway point?
By the way, UTF-16 also didn't exist yet: Windows started with UCS-2. Though I think the name "UCS-2" also didn't exist yet -- AFAIK that name was only introduced in Unicode 2.0, together with UCS-4/UTF-32 and UTF-16 -- in Unicode 1.0, the 16-bit encoding was just called "Unicode", as there were no other encodings of Unicode.
That's not true: UTF-8 predates Windows NT. It's just that the jump from ASCII to UCS-2 (not even real UTF-16) was much easier and more natural, and at the time a lot of people really thought it would be enough. Java made the same mistake around the same time. I was still having this very discussion with older die-hard Windows developers as late as 2015; for a lot of them, 2 bytes per symbol was still all anyone could possibly need.
Windows NT started development in 1988, and the public beta was released in July 1992, before Ken Thompson devised UTF-8 on a napkin in September 1992. Rob Pike gave a UTF-8 presentation at USENIX in January 1993.
Windows NT's general release was July 1993, so it's not realistic to replace all the UCS-2 code with UTF-8 after January 1993 and have it ready in less than 6 months. Even Linux didn't have UTF-8 support in July 1993.
Which, let's not forget, also meant an external ecosystem already developing software for it.
Technically UTF-8 was invented before the first Windows NT release, but they would have had to rework a nearly finished and already delayed OS.
Unicode 1.0 was in 1991, UTF-8 happened a year later, and Unicode 2.0 (where more than 65,536 characters became “official”, and UTF-8 was the recommended choice) was in 1996.
That means if you were green-fielding a new bit of tech in 1991, you likely decided 16 bits per character was the correct approach. But in 1992 it started to become clear that a variable-width encoding (with 8 bits as the base character size) was on the horizon. And by 1996 it was clear that fixed 16-bit characters were a mistake.
But that 5-year window was an extremely critical time in computing history: Windows NT was invented, so was Java, JavaScript, and a bunch of other things. So, too late, huge swaths of what would become today’s technical landscape had set the problem in stone.
UNIXes only use the "right" technical choice because it was already too hard to move from ASCII to 16-bit characters... but laziness in moving off of ASCII ultimately paid off, as it became clear that 16 bits per character was the wrong choice in the first place. Otherwise UNIX would have met the same fate.
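You can watch the fixed-16-bit assumption break on any character past U+FFFF (assuming a recent bash and iconv; the emoji is arbitrary):

```
# U+1F600 is 4 bytes in UTF-8 and a surrogate pair (also 4 bytes) in UTF-16,
# so neither encoding is fixed-width once you leave the BMP
printf '\U0001F600' | xxd                              # f0 9f 98 80
printf '\U0001F600' | iconv -f UTF-8 -t UTF-16BE | xxd # d8 3d de 00
```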
It won't compile.
I can't immediately see why explorer.exe wouldn't run and give you a start menu.
Like I always say, the user mode of Windows is the easiest part to change; that's why it has been redone in almost every version.