frontpage.

Jemalloc Postmortem

https://jasone.github.io/2025/06/12/jemalloc-postmortem/
264•jasone•3h ago•63 comments

Show HN: I wrote a BitTorrent Client from scratch

https://github.com/piyushgupta53/go-torrent-client
21•piyushgupta53•22m ago•4 comments

Frequent reauth doesn't make you more secure

https://tailscale.com/blog/frequent-reath-security
734•ingve•10h ago•327 comments

Rendering Crispy Text on the GPU

https://osor.io/text
85•ibobev•3h ago•16 comments

A Dark Adtech Empire Fed by Fake CAPTCHAs

https://krebsonsecurity.com/2025/06/inside-a-dark-adtech-empire-fed-by-fake-captchas/
106•todsacerdoti•7h ago•24 comments

Slow and Steady, This Poem Will Win Your Heart

https://www.nytimes.com/interactive/2025/06/12/books/kay-ryan-turtle-poem.html
3•mrholme•18m ago•2 comments

A receipt printer cured my procrastination

https://www.laurieherault.com/articles/a-thermal-receipt-printer-cured-my-procrastination
831•laurieherault•17h ago•454 comments

iPhone 11 emulation done in QEMU

https://github.com/ChefKissInc/QEMUAppleSilicon
254•71bw•14h ago•20 comments

Show HN: Tritium – The Legal IDE in Rust

https://tritium.legal/preview
173•piker•17h ago•81 comments

Urban Design and Adaptive Reuse in North Korea, Japan, and Singapore

https://www.governance.fyi/p/adaptive-reuse-across-asia-singapores
16•daveland•3h ago•3 comments

Three Algorithms for YSH Syntax Highlighting

https://github.com/oils-for-unix/oils.vim/blob/main/doc/algorithms.md
12•todsacerdoti•3h ago•2 comments

Show HN: McWig – A modal, Vim-like text editor written in Go

https://github.com/firstrow/mcwig
95•andrew_bbb•15h ago•8 comments

Maximizing Battery Storage Profits via High-Frequency Intraday Trading

https://arxiv.org/abs/2504.06932
226•doener•19h ago•215 comments

The curse of Toumaï: an ancient skull and a bitter feud over humanity's origins

https://www.theguardian.com/science/2025/may/27/the-curse-of-toumai-ancient-skull-disputed-femur-feud-humanity-origins
41•benbreen•7h ago•16 comments

Worldwide power grid with glass insulated HVDC cables

https://omattos.com/2025/06/12/glass-hvdc-cables.html
55•londons_explore•9h ago•35 comments

Show HN: Tool-Assisted Speedrunning the Boring Parts of Animal Crossing (GCN)

https://github.com/hunterirving/pico-crossing
79•hunterirving•15h ago•11 comments

Rust compiler performance

https://kobzol.github.io/rust/rustc/2025/06/09/why-doesnt-rust-care-more-about-compiler-performance.html
185•mellosouls•2d ago•133 comments

Why does my ripped CD have messed up track names? And why is one track missing?

https://www.akpain.net/blog/inside-a-cd/
108•surprisetalk•14h ago•109 comments

Solving LinkedIn Queens with SMT

https://buttondown.com/hillelwayne/archive/solving-linkedin-queens-with-smt/
98•azhenley•13h ago•33 comments

Chatterbox TTS

https://github.com/resemble-ai/chatterbox
594•pinter69•1d ago•177 comments

Microsoft Office migration from Source Depot to Git

https://danielsada.tech/blog/carreer-part-7-how-office-moved-to-git-and-i-loved-devex/
312•dshacker•1d ago•247 comments

Roundtable (YC S23) Is Hiring a President / CRO

https://www.ycombinator.com/companies/roundtable/jobs/wmPTI9F-president-cro-founding
1•timshell•8h ago

First thoughts on o3 pro

https://www.latent.space/p/o3-pro
129•aratahikaru5•2d ago•109 comments

Major sugar substitute found to impair brain blood vessel cell function

https://medicalxpress.com/news/2025-06-major-sugar-substitute-impair-brain.html
29•wglb•5h ago•6 comments

Helion: A modern fast paced Doom FPS engine in C#

https://github.com/Helion-Engine/Helion
141•klaussilveira•2d ago•54 comments

Dancing brainwaves: How sound reshapes your brain networks in real time

https://www.sciencedaily.com/releases/2025/06/250602155001.htm
143•lentoutcry•4d ago•39 comments

Quantum Computation Lecture Notes (2022)

https://math.mit.edu/~shor/435-LN/
122•ibobev•3d ago•43 comments

The Case for Software Craftsmanship in the Era of Vibes

https://zed.dev/blog/software-craftsmanship-in-the-era-of-vibes
87•Bogdanp•5h ago•28 comments

US-backed Israeli company's spyware used to target European journalists

https://apnews.com/article/spyware-italy-paragon-meloni-pegasus-f36dd32106f44398ee24001317ccf2bb
525•01-_-•13h ago•251 comments

Sorcerer (YC S24) raises $3.9M to launch more weather balloons

https://www.axios.com/pro/climate-deals/2025/06/12/sorcerer-seed-weather-balloons
42•tndl•12h ago•59 comments

Microsoft Office migration from Source Depot to Git

https://danielsada.tech/blog/carreer-part-7-how-office-moved-to-git-and-i-loved-devex/
312•dshacker•1d ago

Comments

smitty1e•1d ago
> We spent months debugging line ending handling

"Gosh, that sounds like a right mother," said Unix.

pm90•1d ago
It's oddly fascinating that Microsoft has managed to survive for so long with ancient/bad tools for software engineering. Almost like “life finds a way”, but for software dev. From the outside it seems like they are doing better now after embracing OSS/generic dev tools.
com2kid•1d ago
At one point Source Depot was incredibly advanced, and there are still features that it had that git doesn't. Directory mapping was a standout feature! Being able to pull down only certain directories from a depot, remap where they land locally, and even have the same file be in multiple places makes sharing dependencies across multiple projects really easy, and a lot of complicated tooling around "monorepos" wouldn't need to exist if git supported directory mapping.

(You can get 80% of the way there with symlinks, but in my experience they eventually break in git when too many different platforms are making commits.)
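
For illustration, a client view in Perforce-style syntax (which Source Depot inherited) might look roughly like this; the paths are hypothetical, and SD's exact syntax may have differed:

  $ p4 client -o
  View:
      //depot/word/src/...        //my-client/word/...
      //depot/shared/libs/...     //my-client/word/libs/...
      -//depot/word/src/tests/... //my-client/word/tests/...

The second mapping pulls a shared directory into the project under a different local path, and the leading "-" excludes a subtree; `p4 sync` then only fetches what the view maps.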

Also at one point I maintained an obscenely advanced test tool at MS, it pounded through millions of test cases across a slew of CPU architectures, intermingling emulators and physical machines that were connected to dev boxes hosting test code over a network controlled USB switch. (See: https://meanderingthoughts.hashnode.dev/how-microsoft-tested... for more details!)

Microsoft had some of the first code coverage tools for C/C++, spun out of a project from Microsoft Research.

Their debuggers are still some of the best in the world. NodeJS debugging in 2025 is dog shit compared to C# debugging in 2005.

bsder•1d ago
> git supported directory mapping.

Is this a "git" failure or a "Linux filesystems suck" failure?

It seems like "Linux fileystems" are starting to creak under several directions (Nix needing binary patching, atomic desktops having poor deduplication, containers being unable to do smart things with home directories or too many overlays).

Would Linux simply sucking it up and adopting ZFS solve this or am I missing something?

MobiusHorizons•1d ago
How is that related? I don’t think anyone would suggest ntfs is a better fit for these applications. It worked because it was a feature of the version control software, not because of file system features.
yjftsjthsd-h•1d ago
What would ZFS do for those issues? I guess maybe deduplication, but otherwise I'm not thinking of anything that you can't do with mount --bind and overlays (and I'm not even sure ZFS would replace overlays)
bsder•1d ago
Snapshots seem to be a cheap feature in ZFS but are expensive everywhere else, for example.

OverlayFS has had performance issues on Linux for a while (once you start composing a bunch of overlays, performance drops dramatically and you start hitting limits on the number of overlays).

adrian_b•23h ago
Nowadays the ZFS advantage for snapshots is no longer true.

Other file systems, e.g. the much faster XFS, have equally efficient snapshots.

klank•1d ago
Ok, but now tell me your real thoughts on sysgen. ;-)
o11c•1d ago
As always, git's answer to the problem is "stop being afraid of `git submodule`."

Cross-repo commits are not a problem as long as you understand "it only counts as truly committed if the child repo's commit is referenced from the parent repo".
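
For the unfamiliar, the pinning workflow is just a few commands (repo URL and tag are hypothetical); the parent repo stores only the child's commit hash, as a so-called gitlink:

  $ git submodule add https://example.com/child.git child
  $ git -C child checkout v1.2          # pick the child commit to pin
  $ git add child                       # stage the updated gitlink
  $ git commit -m "Pin child at v1.2"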

xmprt•1d ago
> it only counts as truly committed if the child repo's commit is referenced from the parent repo

This is a big problem in my experience. Relying on consumers of your dependency to upgrade their submodules isn't realistic.

mickeyp•21h ago
Git submodules are awful. Using subversion's own submodule system should be mandatory for anyone claiming Git's implementation is somehow worthwhile or good.
tikkabhuna•1d ago
I never understood the value of directory mapping when we used Perforce. It only seemed to add complexity when one team checked out code in different hierarchies and then some builds worked, some didn’t. Git was wonderful for having a simple layout.
senderista•1d ago
You might feel differently if you worked on just a few directories in a giant repo. Sparse client views were a great feature of SD.
int_19h•12h ago
I'm in exactly this situation with Perforce today, and I still hate it. The same problem OP described applies - you need to know which exact directories to check out to build, run tests etc successfully. You end up with wikis filled with obscure lists of mappings, many of them outdated, some still working but including a lot of cruft because people just copy it around. Sometimes the required directories change over time and your existing workspaces just stop working.

Git has sparse client views with VFS these days.
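
For reference, the stock-git version of that (no VFS needed) is partial clone plus sparse checkout; the repo URL and directory names here are hypothetical:

  $ git clone --filter=blob:none --sparse https://example.com/bigrepo.git
  $ cd bigrepo
  $ git sparse-checkout set word/src shared/libs   # materialize only these directories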

dangus•1d ago
Let’s not forget that Microsoft developed a lot of tools in the first place, as in, they were one of the companies that created things that didn’t really exist before Microsoft created them.

Git isn’t even very old, it came out in 2005. Microsoft Office first came out in 1990. Of course Office wasn’t using git.

dboreham•1d ago
Some examples would be useful here. Not knocking MS tools in general, but are there any that were industry firsts? Source code control, for example, existed at least since SCCS, which in turn predates Microsoft itself.
pianoben•1d ago
Of course that's only half the story - Microsoft invents amazing things, and promptly fails to capitalize on them.

AJAX, that venerable piece of kit that enabled every dynamic web-app ever, was a Microsoft invention. It didn't really take off, though, until Google made some maps with it.

noen•22h ago
Microsoft rarely did or does anything first. They are typically second or third to the post, and version control is no different.

Most people don’t know or realize that Git is where it is because of Microsoft. About 1/2 of the TFS core team spun out to a foundation where they spent several years doing things like making submodules actually work, writing git-lfs, and generally making git scale.

You can look for yourself at the libgit2 repo back in the 2012-2015 timeframe. Nearly the whole thing was rewritten by Microsoft employees in the earliest stages of moving the company off Source Depot.

It was a really cool time that I’m still amazed to have been a small part of.

lIl-IIIl•22h ago
Office is a package including things like Word and Excel. Word itself came out in 1984 for the first Macintosh. Windows OS did not yet exist.
senderista•1d ago
Google used Perforce for years and I think Piper still has basically the same interface? So no, MSFT wasn’t ridiculously behind the times by using Source Depot for so long.
azhenley•1d ago
I spent nearly a week of my Microsoft internship in 2016 adding support for Source Depot to the automated code reviewer that I was building (https://austinhenley.com/blog/featurestheywanted.html) despite having no idea what Source Depot was!

Quite a few devs were still using it even then. I wonder if everything has been migrated to git yet.

sciencesama•1d ago
Naah, still a lot of stuff works on SD!! Those SD commands and setting up SD give me chills!!
hacker_homie•23h ago
Most of the day to day is in git, now.
PretzelPirate•16h ago
I miss CodeFlow everyday. It was such a great tool to use.
3eb7988a1663•1d ago

  We communicated the same information through multiple channels: weekly emails, Teams, wiki docs, team presentations, and office hours. The rule: if something was important, people heard it at least 3 times through different mediums.
If only this were standard. Last week I received the only notification that a bunch of internal systems were being deleted in two weeks. No scream test, no archiving, just straight deletion. Sucks to be you if you missed the email for any reason.
MBCook•1d ago
No kidding. The amount of things that change in important environments without anyone telling people outside their teams in some organizations can be maddening.
dshacker•1d ago
Even with this, there were many surprised people. I'm still amazed at all of the people that can ignore everything and just open their IDE and code (and maybe never see Teams or email).
pvdebbe•23h ago
In my previous company, I was surprised to learn from a third party that our office had moved, lol.
sofixa•21h ago
Alternatively, communications fatigue. How many emails does the average employee get with nonsense that doesn't apply to them? Oh cool, we have a new VP. Oh cool, that department had a charity drive. Oh cool, system I've never heard of is getting replaced by a new one, favourite of this guy I've never heard of.

Add in the various spam (be it attacks or just random vendors trying to sell something).

At some point, people start to zone out and barely skim, if that, most of their work emails. Same with work chats, which are also more prone to people sharing random memes or photos from their picnic last week or their latest lego set.

kmoser•16h ago
Everybody gets important emails, and it's literally part of their job to separate the wheat from the chaff. One of my benchmarks for someone's competency is their ability to manage information. With a combination of email filters and mental discipline, even the busiest inbox can be manageable. But this is an acquired skill, akin to not getting lost in social media, and some people are far better at it than others.
BenjiWiebe•14h ago
If the same internal sender sends both irrelevant and important messages, it'll be pretty hard or impossible to filter.

My #1 method of keeping my inbox clean, is unsubscribing from newsletters.

Marsymars•14h ago
Our HR lady took personal offence when I asked to be unsubscribed from the emails about “deals” that employees have access to from corporate partners. :(
Vilian•9h ago
You can set custom rules in Thunderbird to deal with specific mails, like tagging them as a "sale" or just deleting them based on a regex.
Marsymars•7h ago
Yeah, I ended up doing that with Outlook.

I also set up a rule to auto-delete phishing test emails based on their headers, which annoyed the security team.

kmoser•9h ago
Yes, the last filter is always the human being who has to deal with whatever the computer couldn't automate. But even then, you should be able to skim an email and quickly determine its relevancy, and decide whether you need to take action immediately, can leave it for the future, or can just delete it. Unless you're getting thousands of emails a day, this should be manageable.
AdamN•21h ago
If you read all the notifications you'll never do your actual job. People who just open their IDE and code are to be commended in some respects - but it's a balance of course.
xwolfi•1d ago
What we do is we scream the day before, all of us, get replied that we should have read the memo, reply we have real work to do, and the thing gets cancelled last minute, a few times a year, until nobody gives a fuck anymore.
indemnity•1d ago
I feel this.

Every month or two, we get notifications along the FINAL WARNING lines, telling us about some critical system about to be deleted, or some new system that needs to be set up Right Now, because it is a Corporate Standard (that was never rolled out properly), and by golly we have had enough of teams ignoring us, the all powerful Board has got its eyes on you now.

It's a full time job to keep up with the never-ending churn. We could probably just spend all our engineering effort being compliant and never delivering features :)

Company name withheld to preserve my anonymity (100,000+ employees).

90s_dev•1d ago
I actually remember using Perforce back in like 2010 or something. And I can't remember why or for which client or employer. I just remember it was stupid.
dboreham•1d ago
And expensive.
broodbucket•1d ago
There's still a lot of Perforce around. I've thankfully managed to avoid it but I have plenty of friends in the industry who still have to use it.
HideousKojima•1d ago
Perforce is still widely used in the game industry
gmueckl•1d ago
Perforce is convoluted and confusing, but I don't think it's really fair to call it stupid. It is still virtually unmatched in a couple of areas.
90s_dev•1d ago
I wasn't being fair, I was being mean. Perforce is stupid and ugly.
bananaboy•16h ago
I would say it's no more convoluted and confusing than git. I used Perforce professionally for quite a few years in gamedev, and found that a bit confusing at first. Then I was self-employed and used git, and coming to git from Perforce I found it very confusing at first. But then I grew to love it. Now I'm back to working for a big gamedev company and we use Perforce and I feel very proficient in both.
bob1029•22h ago
Perforce is really nice if you need to source control 16k textures next to code without thinking too much about it. Git LFS absolutely works but it's more complicated and has less support in industry tooling. Perforce also makes it easier to purge (obliterate) old revisions of files without breaking history for everyone. This can be invaluable if your p4 server starts to run out of disk space.

The ability to lock files centrally might seem outdated by the branching and PR model, but for some organizations the centralized solution works way better because they have built viable business processes around it. Centralized can absolutely smoke distributed in terms of iteration latency if the loop is tight enough and the team is cooperating well.
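
For anyone who hasn't driven Perforce, both of those features are single commands; the depot paths are hypothetical, and `obliterate` only previews unless you pass `-y`:

  $ p4 edit -t binary+l //depot/assets/hero.psd   # the +l filetype modifier enforces exclusive open
  $ p4 obliterate //depot/assets/old/...          # preview what would be purged
  $ p4 obliterate -y //depot/assets/old/...       # actually purge the revisions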

dazzawazza•22h ago
I agree with everything you say except "git-lfs works". For modern game dev (where a full checkout is around 1TB of data) git-lfs is too slow, too error prone and too wasteful of disk space.

Perforce is a complete PITA to work with, too expensive and is outdated/flawed for modern dev BUT for binary files it's really the only game in town (closely followed by svn but people have forgotten how good svn was and only remember how bad it was at tracking branch merging).

daemin•18h ago
Sounds like the filesystem filter is required for the files in the repository and not just the metadata in the .git folder.
barries11•17h ago
I used Perforce a lot in the 90s, when it was simple (just p4, p4d, and p4merge!), super fast, and never crashed or corrupted itself. Way simpler, and easier to train newbies on, than any of the alternatives.

Subdirectories-as-branches (like bare repo + workspace-per-branch practices w/git) is so much easier for average computer users to grok, too. Very easy to admin too.

No idea what the current "enterprisey" offering is like, though.

For corporate teams, it was a game changer. So much better than any alternative at the time.

We're all so used to git that we've become used to its terribleness and see every other system as deficient. Training and supporting a bunch of SWE-adjacent users (hw eng, ee, quality, managers, etc) is a really, really good reality check on how horrible the git UX and data model are (e.g. obliterating secrets--security, trade, or PII/PHI--that get accidentally checked in is a stop-the-world moment).

For the record, I happily use git, jj, and Gitea all day every day now (and selected them for my current $employer). However, also FTR, I've used SCCS, CVS, SVN, VSS, TFS and MKS SI professionally, each for years at a time.

Dismissing tools that are significantly better for most use cases other than distributed OSS, just because they lost the popularity contest, is shortsighted.

Git has a loooong way to go before it's as good in other ways as many of its "competitors". Learning about their benefits is very enlightening.

And, IIRC, p4 now integrates with git, though I've never used it.

int_19h•12h ago
I've used CVS, SVN, TFS, Mercurial, and Git in the past, so I have plenty of exposure to different options. I have to deal with Perforce in my current workplace and I have to say that even from this perspective it's honestly pretty bad in terms of how convoluted things are.
barries11•2h ago
I don't disagree at all--p4 was kick-ass back in the day but the world, and our expectations, have moved on. Plus, they went all high-street enterprisey.

What makes it convoluted? Where did it lose the beat?

90s_dev•1d ago
In about 2010, I briefly had a contract with a security firm with one dev, and there was no source control, and everything written was in low quality PHP. I quit after a week.
golergka•1d ago
What kind of security services did they provide? Breaches?
layer8•18h ago
Job security for the dev, probably.
dshacker•1d ago
php_final_final_v2.zip shipped to production. A classic. I had a similar experience with https://www.ioncube.com/ php encryption. Everything encrypted and no source control.
israrkhan•1d ago
We did migrate from Perforce to Git for some fairly large repositories, and I can relate to some of the issues. Luckily we did not have to invent a VFS, although git-lfs was useful for large files.
carlual•1d ago
> Authenticity mattered more than production value.

Thanks for sharing this authentic story! As an ex-MSFT in a relatively small product line that only started switching to Git from SourceDepot in 2015, right before I left, I can truly empathize with how incredible a job you guys have done!

dshacker•1d ago
Yeah, it was a whole journey. I can't believe it happened. Thanks for your comment.
carlual•1d ago
Thank you! Btw, it reminds me of the book "Showstopper" about the journey of releasing Windows NT; highly recommended!
tux1968•1d ago
Thanks for the recommendation! I was just about to reread "Soul Of A New Machine", but will try Showstopper instead, since it sounds like the same genre.
zem•21h ago
tangentially, if you like that genre one of my favourite books in it is "where wizards stay up late", about the development of the internet.
hacker_homie•23h ago
I spent a lot of time coaching people out of Source Depot; it was touch and go there for a while. It was worth it though. Thank you for your effort.
MBCook•1d ago
Could someone explain the ideas of forward integration and reverse integration in Source Depot?

I’d never heard of Source Depot before today.

israrkhan•1d ago
source depot is (was?) essentially a fork of perforce.
MBCook•15h ago
The article mentioned something along those lines, but I’ve never used it either.

I’ve only ever really used CVS, SVN, and Git.

int_19h•12h ago
Perforce is broadly similar to SVN in semantics, and the same branching logic applies to both. Basically, if you have the notion of a long-lived main branch and feature branches (and possibly a hierarchy in between, e.g. product- or component-specific branches), you need to flow code between them in an organized way. Forward/reverse integration simply describes the direction in which this is done - FI for main -> feature, RI for feature -> main.
dshacker•1d ago
RI/FI is similar to having long-lived branches in Git. Imagine you have a "develop-word" branch in git. The admins for that branch would merge all of their changes to "main", and changes from "main" back into their long-lived branches. It was a little bit different than long-lived git branches, as they also had a file filter (my private branch only had OneNote code and it was the "onenote" branch).
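
In plain git terms, a rough sketch of the two directions (using the branch name from the example above) would be:

  # FI (forward integration): flow main into the long-lived branch
  $ git checkout develop-word
  $ git merge main
  # RI (reverse integration): flow the branch's work back to main
  $ git checkout main
  $ git merge develop-word

As other comments in this thread note, which direction got called RI vs. FI varied between teams.
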
mikepurvis•1d ago
I've long wanted a hosted Git service that would help me maintain long lived fork branches. I know there's some necessary manual work that is occasionally required to integrate patches, but the existing tooling that I'm familiar with for this kind of thing is overly focused on Debian packaging (quilt, git-buildpackage) and has horrifyingly poor ergonomics.

I'd love a system that would essentially be a source control of my patches, while also allowing a first class view of the upstream source + patches applied, giving me clear controls to see exactly when in the upstream history the breakages were introduced, so that I'm less locking in precise upstream versions that can accept the patches, and more actively engaging with ranges of upstream commits/tags.

I can't imagine how such a thing would actually be commercially useful, but darned if it wouldn't be an obvious fit for AI to automatically examine the upstream and patch history and propose migrations.

dybber•1d ago
We had a similar setup, also with a homegrown VCS developed internally in our company, where I sometimes acted as branch admin. I’m not sure it worked exactly like Source Depot, but I can try to explain it.

Basically instead of everyone creating their own short-lived branches (an expensive operation), you would have long-lived branches that a larger group of people would commit to (several product areas). The branch admin's job was then to get the work of all of these people forward integrated to a branch upwards in the hierarchy. This was attempted a few times per day, but if tests failed you would have to reach out to the responsible people to get those tests fixed. Then later, when you get the changes merged upwards, some other changes have also been made to the main integration branch, and now you need to pull these down into your long-lived branch - reverse integration - such that your branch is up to date with everyone else in the company.

neerajsi•1d ago
At least in the Windows group, we use RI and FI oppositely from how you describe. RI = sharing code with a broader group of people, toward trunk. FI = absorbing code created by the larger group of people on the dev team. Eventually we do a set of release forks that are isolated after a final set of FIs, so really outside customers get code via FI, followed by cherry-pick-style development.
BobbyTables2•1d ago
I’d like to know when Microsoft internally migrated away from Visual SourceSafe…

They should have recalled it to avoid continued public use…

dshacker•1d ago
I didn't even know Microsoft SourceSafe existed.
masklinn•1d ago
Lucky you. Definitely one of the worst tools I’ve had the displeasure of working with. Made worse by people building on top of it for some insane reason.
moron4hire•23h ago
It was at least a little better than CVS, but with SVN available at the same time, I never understood the mentality of the offices I worked at using SourceSafe instead of SVN.
masklinn•22h ago
> It was at least a little better than CVS

Highly debatable.

CVS has a horrendous UI, but didn’t have a tendency to corrupt itself at the drop of a hat and didn’t require locking files to edit them by default (and then require a repository admin to come in and unlock files when a colleague went on holidays with files checked out). Also didn’t require shared write access to an SMB share (one of the reasons it corrupted itself so regularly).

mickeyp•21h ago
Agreed. It had a funny habit of corrupting its own data store also. That's absolutely what you want in a source control system.

It sucked; but honestly, not using anything is even worse than SourceSafe.

masklinn•21h ago
> Agreed. It had a funny habit of corrupting its own data store also. That's absolutely what you want in a source control system.

I still ‘member articles calling it a source destruction system. Good times.

> It sucked; but honestly, not using anything is even worse than SourceSafe.

There have always been alternatives. And even when you didn’t use anything, at least you knew what to expect. Files didn’t magically disappear from old tarballs.

TowerTall•21h ago
I remember when we migrated from Visual SourceSafe to TFS at my place of work. I was in charge of the migration, and when we hit errors we opened a ticket with Microsoft Premier Support. The ticket ended up being assigned to one of the creators of SourceSafe, who replied "What you are seeing is not possible". He did manage to solve it in the end, after a lot of head scratching.
codeulike•21h ago
We used it. We knew no better. It was different then, you might not hear about alternatives unless you went looking for them. Source Safe was integrated with Visual Studio so was an obvious choice for small teams.

Get this: if you wanted to change a file you had to check it out. It was then locked and no one else could change it. Files were literally read-only on your machine unless you checked them out. The 'one at a time please' approach to source control (the other approach being 'let's figure out how to merge this later').

namdnay•20h ago
I remember a big commercial SCM at the time that had this as an option, for when you wanted to make sure you wouldn't need to merge. Can't remember what it was called; you could “sync to file system” a bit like Dropbox, and it required teams of full-time admins to build releases and cut branches and stuff. Think it was bought by IBM?
robin_reala•17h ago
I guess you’re talking about Rational Rose? I had the misfortune of using that at my first industry job (fintech in 2004).
meepmorp•9h ago
Rose is a UML modeling tool
robin_reala•8h ago
Oops, it was ClearCase that was Rational’s SCM: https://en.wikipedia.org/wiki/IBM_DevOps_Code_ClearCase
becurious•13h ago
ClearCase?
rswail•20h ago
Which is exactly how CVS (and its predecessors RCS and SCCS) worked.

They were file based revision control, not repository based.

SVN added folders like trunk/branches/tags that overlaid the file based versioning by basically creating copies of the files under each folder.

Which is why branch creation/merging was such a complicated process: if any of the files didn't merge, you had a half-merged branch source and a half-merged branch destination that you had to roll back.

fanf2•19h ago
CVS was called the “Concurrent Versions System” because it did not lock files on checkout. Nor did svn. Perforce does.
rswail•15h ago
True dat, my mistake. That was its major feature, from memory though it still used the same reversed diff file format?
ack_complete•14h ago
Perforce does not lock files on checkout unless you have the file specifically configured to enforce exclusive locking in the file's metadata or depot typemap.
umanwizard•17h ago
I am quite sure that you can edit files in an svn repo to your heart’s content regardless of whether anyone else is editing them on their machine at the same time.
masklinn•15h ago
Yep, svn has a lock feature but it is opt-in per file (possibly filetype?)

A pretty good tradeoff, because you can set it on complex structured files (e.g. PSDs and the like) to avoid the ballache of getting a conflict in an unmergeable file, but it does not block code editing.

And importantly anyone can steal locks by default. So a colleague forgetting to unlock and going on holidays does not require finding a repo admin.
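
For reference, that workflow in svn commands (filename hypothetical; the property only takes effect once committed):

  $ svn propset svn:needs-lock '*' design.psd   # working copies keep it read-only until locked
  $ svn lock -m "editing layers" design.psd
  $ svn lock --force design.psd                 # steal an existing lock from someone else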

pjc50•19h ago
The lock approach is still used in IC design for some of the Cadence/Synopsys data files, which are unmergeable binaries. Not precisely sure of the details but I've heard it from other parts of the org.
dagw•17h ago
A lot of engineering is the same. You cannot diff and merge CAD files, so you lock them.
malkia•14h ago
Similar in video game shops - lots of binary files, or even huge (non-editable by human) text ones.
Disposal8433•18h ago
The file lock was a fun feature when a developer forgot to unlock it and went on holidays. Don't forget the black hole feature that made files randomly disappear for no reason. It may have been the worst piece of software I have ever used.
qingcharles•20h ago
It was pretty janky. We used it in the gamedev world in the 90s once the migration to Visual C started.
pianoben•1d ago
I don't know that they ever used it internally, certainly not for anything major. If they had, they probably wouldn't have sold it as it was...

Can't explain TFS though, that was still garbage internally and externally.

RandallBrown•1d ago
I doubt most teams ever used it.

I spent a couple years at Microsoft and our team used Source Depot because a lot of people thought that our products were special and even Microsoft's own source control (TFS at the time) wasn't good enough.

I had used TFS at a previous job and didn't like it much, but I really missed it after having to use Source Depot.

RyJones•1d ago
USGEO used it in the late 90s, as well as RAID
jbergens•20h ago
I was surprised that TFS was not mentioned in the story (at least not as far as I have read).

It should have existed around the same time and other parts of MS were using it. I think it was released around 2005 but MS probably had it internally earlier.

canucker2016•20h ago
SLM (aka "slime", a shared-file-system source code control system) was used in most of MS, i.e. both systems and apps.

NT created (well, not NT itself; IIRC there was an MS-internal developer tools group in charge)/moved to Source Depot, since a shared file system doesn't scale well to thousands of users. Especially if some file gets locked and you DoS the whole division.

Source depot became the SCCS of choice (outside of Dev Division).

Then git took over, and MS had to scale git to NT size, and upstream many of the changes to git mainline.

Raymond Chen has a blog that mentions much of this - https://devblogs.microsoft.com/oldnewthing/20180122-00/?p=97...

int_19h•13h ago
TFS was used heavily by DevDiv, but as far as I know they never got perf to the point where Windows folk were satisfied with it on their monorepo.

It wasn't too bad for a centralized source control system tbh. Felt a lot like SVN reimagined through the prism of Microsoft's infamous NIH syndrome. I'm honestly not sure why anyone would use it over SVN unless you wanted their deep integration with Visual Studio.

anonymars•5h ago
After the initial TFS 1.0 hiccups, merging was way, way better than SVN. SVN didn't track anything about merges until 1.6. Even today git's handling of file names has nothing on TFS.
mattgrice•1d ago
Around 2000? The only project I ever knew that used it was .NET and that was on SD by around then.
RyJones•1d ago
I was on the team that migrated Microsoft from XNS to TCP/IP - it was way less involved, but similar lessons learned.

Migrating from MSMAIL -> Exchange, though - that was rough

aaronbrethorst•23h ago
Is that what inspired the "Exchange: The Most Feared and Loathed Team in Microsoft" license plate frames? I'm probably getting a bit of the wording wrong. It's been nearly 20 years since I saw one.
RyJones•23h ago
Probably. A lot of people really loved MSMAIL; not so much Exchange.

I have more long, boring stories about projects there, but that’s for another day

canucker2016•20h ago
And sometimes they loved MSMAIL for the weirdest reasons...

MSMAIL was designed for Win3.x. Apps didn't have multiple threads. The MSMAIL client app that everyone used would create the email to be sent and store the email file on the system.

An invisible app, the Mail Pump, would check for email to be sent and received during idle time (N.B. Other apps could create/send emails via APIs, so you couldn't have the email processing logic in only the MSMAIL client app).

So the user could hit the Send button and the email would be moved to the Outbox to be sent. The mail pump wouldn't get a chance to process the outgoing email for a few seconds, so during that small window, if the user decided that they had been too quick to reply, they could retract that outgoing email. Career-limiting move averted.

Exchange used a client-server architecture for email. Email client would save the email in the outbox and the server would notice the email almost instantly and send it on its way before the user blinked in most cases.

A few users complained that Exchange, in essence, was too fast. They couldn't retract a misguided email reply, even if they had reflexes as quick as the Flash.

RyJones•17h ago
I re-wrote MSPAGER for Exchange. Hoo boy what a hack that was! A VB3 app running as a service, essentially. I don't know if you remember romeo and juliet; those were PCs pulled from pc-recycle by a co-worker to serve install images.
mschuster91•17h ago
> A few users complained that Exchange, in essence, was too fast.

That is something that's actually pretty common, called "benevolent deception" - it has been discussed on HN in years past, too [1].

[1] https://news.ycombinator.com/item?id=16289380

canucker2016•20m ago
I wouldn't call either Mail Pump's slow email processing a "benevolent deception" or Exchange's quick email processing an attempt to be perceived as a fast email server.

In MSMail/Exchange Client/Outlook, the presence of email in the Outbox folder signifies that an email is to be sent, but that the code for sending the email hasn't processed that particular email.

MSMail being slower than Exchange to send email is a leaky abstraction due to software architecture.

Win3.x didn't support multithreaded apps, using a cooperative multitasking system. Any app doing real work would prevent the user from accessing the system, since no user interface events would be processed.

So the Mail Pump would check to see if the system was idle. There were no public or secret/private Windows APIs (despite all the MS detractors) for code to determine whether the system was idle - you, the developer, had to fall back on heuristics. These heuristics aren't fast - you didn't want to declare that the system was idle only to discover the user was in the middle of an operation. So the Mail Pump had to be patient. That meant the email could sit in the outbox for more than a second.

Exchange Server was a server process running on a separate box. When an email client notified the Exchange Server that an email was to be sent (whether via RPC or SMTP command), Exchange Server didn't have to worry about blocking the user's interaction with the computer. It could process the email almost immediately.

But there was a happy resolution to the conundrum - no, the Exchange Server didn't add a delay.

Some architect/program manager had added a "delay before sending/processing" property to the list of supported properties of an email message. The Exchange/Win95/Capone email client didn't use/set this property. But a developer could write an email extension that let the user specify a default delay for each outgoing email; the extension would get notified when an email was sent and set this "delay before sending/processing" property, such that Exchange Server would wait at least the specified delay before processing the message.

The user who desired an extended delay before their sent email was processed by the Exchange Server could install this client extension and specify a desired delay interval.

Outlook eventually added support for this property a few years later.

I notice that Gmail has added support for a delay to enable the user to undo sent emails.

anonymars•5h ago
Ha, maybe my old memory is rusty, but I feel like I recognize this name and you had an old blog with some quotable Raymond Chen -- one bit I remember was something like

"How do you write code so that it compiles differently from the IDE vs the command line?" to which the answer was "If you do this your colleagues will burn you in effigy when they try to debug against the local build and it works fine"

RyJones•3h ago
Yup, that’s me.
palmotea•1d ago
What's the connection (if any) between "Source Depot" and TFSVC?
tamlin•23h ago
Source Depot was based on Perforce. Microsoft bought a license for the Perforce source code and made changes to work at Microsoft scale (Windows, Office).

TFS was developed in the Studio team. It was designed to work at Microsoft scale and some teams moved over to it (SQL Server). It was also available as a fairly decent product (leagues better than SourceSafe).

nfg•23h ago
None that I know of, Source Depot is derived from Perforce.
hulitu•23h ago
> Microsoft Office migration from Source Depot to Git

Will they get an annoying window, in the middle of the migration, telling them that Office must be updated now, or the world will end?

ksynwa•23h ago
Not doubting it but I don't understand how a shallow clone of OneNote would be 200GB.
paulddraper•23h ago
Must have videos or binaries.
LtWorf•22h ago
They probably vendor every single .dll it uses.
skrebbel•12h ago
that's a lot of .dll files!
dshacker•23h ago
Shallow clone of all of Office, not OneNote.
ksynwa•23h ago
Oh alright. Thanks.
carlhjerpe•23h ago
This article makes it sound like there are thousands of engineers good enough to qualify at Microsoft and work on Office who haven't used git yet? That sounds a bit overplayed tbh; if you haven't used git you must live under a rock. You can't use Source Depot at home.

Overall good story though

dshacker•23h ago
You’d be surprised at the number of people at Microsoft whose entire career has been at Microsoft (starting before Git existed) and who have never used Git. Git is relatively new (2005), but source control systems are not.
shakna•23h ago
That's still two decades. Git is so popular Microsoft bought one of the major forges 7 years ago.

To have never touched it in the last decade? You've got a gap in your CV.

stockerta•22h ago
Not everyone wants to code as a hobby, so if their work doesn't use git then they won't use it either.
dkdbejwi383•22h ago
Not everyone _can_ code as a hobby. Some of us are old and have families and other commitments
shakna•21h ago
That's when you can only hope that your workplace is one that trains - so the investment isn't one sided.
qingcharles•20h ago
Agreed. In my professional career, the vast majority of devs I've worked with never wrote a single line of code outside of the office.
bdcravens•22h ago
The same could be said of .NET, Wordpress, or Docker.
shakna•22h ago
Yes? If it's in your field, like a webdev who has never touched Wordpress, it can be surprising. An automated tester who has never tried containers also has a problem.

These are young industries. So most hiring teams expect that you take the time to learn new technologies as they become established.

AdamN•21h ago
This is one of the problems at big tech - people 10-20 years in and haven't lived in the outside world. It's a hard problem to solve.
Freak_NL•21h ago
I believe it. If you are a die-hard Microsoft person, your view of computing would be radically different from even the average developer today, let alone devs who are used to using FOSS.

Turn it around: If I were to apply for a job at Microsoft, they would probably find that my not using Windows for over twenty years is a gap on my CV (not one I would care to fill, mind).

int_19h•12h ago
It would very much depend on the team. There's no shortage of those that ship products for macOS and Linux, and sometimes that can even be the dominant platform.
YPPH•23h ago
It's entirely plausible that a long-term engineer at Microsoft wouldn't have used git. I'm sure a considerable number of software engineers don't program as a hobby.
lIl-IIIl•22h ago
Sure you can use Source Depot (actually Perforce) at home: https://www.perforce.com/p/vcs/vc/free-version-control
YPPH•20h ago
I think Source Depot is a proprietary fork with a lot of Microsoft-stuff added in.
compiler-guy•15h ago
It only takes a week to learn enough git to get by, and only a month or two to become every-day use proficient. Especially if one is already familiar with perforce, or svn, or other VCS.

Yes, there is a transition, no it isn't really that hard.

Anyone who views lack of git experience as a gap in a CV is selecting for the wrong thing.

AdamN•21h ago
I feel like we're well into the longtail now. Are there other SCM systems or is it the end of history for source control and git is the one and done solution?
masklinn•21h ago
Mercurial still has some life to it (excluding Meta’s fork of it), jj is slowly gaining, fossil exists.

And afaik P4 still does good business, because DVCS in general and git in particular remain pretty poor at dealing with large binary assets so it’s really not great for e.g. large gamedev. Unity actually purchased PlasticSCM a few years back, and has it as part of their cloud offering.

Google uses its own VCS called Piper which they developed when they outgrew P4.

zem•21h ago
google also has a mercurial interface to piper
linkpuff•21h ago
There are some other solutions (like jujutsu, which, while using git as a storage medium, handles commits somewhat differently). But I do believe we have reached a critical point where git is the one-stop shop for all source control needs, despite its flaws/complexity.
dgellow•21h ago
Perforce is used in game dev, animation, etc. git is pretty poor at dealing with lots of really large assets
qiine•18h ago
why is this still the case?
rwmj•17h ago
I've been checking in large (10s to 100s MBs) tarballs into one git repo that I use for managing a website archive for a few years, and it can be made to work but it's very painful.

I think there are three main issues:

1. Since it's a distributed VCS, everyone must have a whole copy of the entire repo. But that means anyone cloning the repo or pulling significant commits is going to end up downloading vast amounts of binaries. If you can directly copy the .git dir to the other machine first instead of using git's normal cloning mechanism then it's not as bad, but you're still fundamentally copying everything:

  $ du -sh .git
  55G .git
2. git doesn't "know" that something is a binary (although it seems to in some circumstances), so some common operations try to search them or operate on them in other ways as if they were text. (I just ran git log -S on that repo and git ran out of memory and crashed, on a machine with 64GB of RAM).

3. The cure for this (git lfs) is worse than the disease. LFS is so bad/strange that I stopped using it and went back to putting the tarballs in git.
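
Two partial mitigations, for what they're worth (repo URL hypothetical, and this assumes a server new enough to support partial clone):

  $ git clone --filter=blob:none https://example.com/archive.git   # fetch blobs lazily, on checkout
  $ echo '*.tar.gz -diff -merge' >> .gitattributes                 # treat tarballs as opaque binaries

The first addresses issue 1 by not downloading every blob up front; the second tells the diff machinery not to treat tarballs as text, though whether it tames `git log -S`'s memory use I can't say.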

dh2022•13h ago
Why would someone check binaries into a repo? The only time I came across checked-in binaries in a repo was because that particular dev could not be bothered to learn nuget / MAVEN. (The dev that approved that PR did not understand it either.)
masklinn•12h ago
Because it’s way easier if you don’t require every level designer to spend 5 hours recompiling everything before they can get to work in the morning, because it’s way easier to just check in that weird DLL than provide weird instructions to retrieve it, because onboarding is much simpler if all the tools are in the project, …

And it’s no sweat off p4’s back.

dh2022•12h ago
Hmm, I do not get it.... "The binaries are checked in the repo so that the designer would not spend 5 hours recompiling" vs "the binaries come from a nuget site so that the designer would not spend 5 hours recompiling".

In both cases the designer does not recompile, but in the second case there are no checked in binaries in the repo... I still think nuget / MAVEN would be more appropriate for this task...

masklinn•12h ago
Everything is in P4: you checkout the project to work on it, you have everything. You update, you have everything up to date. All the tools are there, so any part of the pipeline can rely on anything that's checked in. You need an older version, you just check that out and off you go. And you have a single repository to maintain.

VCS + Nuget: half the things are in the VCS, you checkout the project and then you have to hunt down a bunch of packages from a separate thing (or five), when you update the repo you have to update the things, hopefully you don't forget any of the ones you use, scripts run on a prayer that you have fetched the right things or they crash, version sync is a crapshoot, hope you're not working on multiple projects at the same time needing different versions of a utility either. Now you need 15 layers of syncing and version management on top of each project to replicate half of what just checking everything into P4 gives you for free.

dh2022•10h ago
I have no idea what environment / team you worked on, but nuget is pretty much rock solid. There are no scripts running on a prayer that everything is fetched. Version sync is not a crapshoot, because nuget versions are updated during merges, and with proper merge procedures (PR build + tests) nuget versions are always correct on the main branch.

One does not forget what nugets are used: VS projects do that bookkeeping for you. You update the VS project with the new nugets your task requires; and this bookkeeping will carry on when you merge your PR.

I have seen this model work with no issues in large codebases: VS solutions with upwards of 500,000 lines of code and 20-30 engineers.

tom_•5h ago
But if you have to do this via Visual Studio, it's no good for the people that don't use Visual Studio.

Also, where does nuget get this stuff from? It doesn't build this stuff for you, presumably, and so the binaries must come from somewhere. So, you just got latest from version control to get the info for nuget - and now nuget has to use that info to download that stuff?

And that presumably means that somebody had to commit the info for nuget, and then separately upload the stuff somewhere that nuget can find it. But wait a minute - why not put that stuff in the version control you're using already? Now you don't need nuget at all.

nyarlathotep_•7h ago
> VCS + Nuget: half the things are in the VCS, you checkout the project and then you have to hunt down a bunch of packages from a separate thing

Oh, and there are things like x509/proxy/whatever errors when on a corpo machine that has ZScaler or some such, so you have to use the internal Artifactory/thing, but that doesn't have the version you need, or you need permissions to access it... etc etc.

rwmj•11h ago
Because it's (part of) a website that hosts the tarballs, and we want to keep the whole site under version control. Not saying it's a good reason, but it is a reason.
suriya-ganesh•11h ago
This is a problem that occurs in everything from game development to ML datasets.

We built oxen to solve this problem https://github.com/Oxen-AI/Oxen (I work at Oxen.ai)

Source control for large data. Currently our biggest repository is 17 TB. Would love for you to try it out. It's open source, so you can self-host as well.

nyarlathotep_•16h ago
I've heard this about game dev before. My (probably only somewhat correct) understanding is it's more than just source code--are they checking in assets/textures etc? Is perforce more appropriate for this than, say, git lfs?
malkia•14h ago
And often binaries: .exe, .dll, even .pdb files.
nyarlathotep_•7h ago
Interesting. Seems antithetical to the 'git centered' view of being for source code only (mostly)

I think I read somewhere that game dev teams would also check in the actual compiler binary and things of that nature into version control.

Usually it's considered "bad practice" when you see, like, an entire sysroot of shared libs in a git repository.

I don't even have any feeling one way or another. Even today "vendoring" cpp libraries (typically as source) isn't exactly rare. I'm not even sure if this is always a "bad" thing in other languages. Everyone just seems to have decided that relying on a/the package manager and some sort of external store is the Right Way. In some sense, it's harder to make the case for that.

tom_•5h ago
It's only considered a bad idea because git handles it poorly. You're already putting all your code in version control - why would you not include the compiler binaries and system libraries too? Now everybody that gets the code has the right compiler to build it with as well!

The better organised projects I've worked on have done this, and included all relevant SDKs too, so you can just install roughly the right version of Visual Studio and you're good to go. Doesn't matter if you're not on quite the right point revision or haven't got round to doing the latest update (or had it forced upon you); the project will still build with the compiler and libraries you got from Perforce, same as for everybody else.

int_19h•12h ago
I'm not sure about the current state of affairs, but I've been told that git-lfs performance was still not on par with Perforce on those kinds of repos a few years ago. Microsoft was investing a lot of effort in making it work for their large repos though so maybe it's different now.

But yeah, it's basically all about having binaries in source control. It's not just game dev, either - hardware folk also like this for their artifacts.

masklinn•12h ago
Assets, textures, design documents, tools, binary dependencies, etc…

And yes, p4 just rolls with it; git lfs is a creaky hack.

foooorsyth•15h ago
git by itself is often unsuitable for XL codebases. Facebook, Google, and many other companies / projects had to augment git to make it suitable or go with a custom solution.

AOSP with 50M LoC uses a manifest-based, depth=1 tool called repo to glue together a repository of repositories. If you’re thinking “why not just use git submodules?”, it’s because git submodules have a rough UX and would require so much wrangling that a custom tool is more favorable.

Meta uses a custom VCS. They recently released sapling: https://sapling-scm.com/docs/introduction/

In general, the philosophy of distributed VCS being better than centralized is actually quite questionable. I want to know what my coworkers are up to and what they’re working on to avoid merge conflicts. DVCS without constant out-of-VCS synchronization causes more merge hell. Git’s default packfile settings are nightmarish — most checkouts should be depth==1, and they should be dynamic only when that file is accessed locally. Deeper integrations of VCS with build systems and file systems can make things even better. I think there’s still tons of room for innovation in the VCS space. The domain naturally opposes change because people don’t want to break their core workflows.
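
For the curious, a typical repo invocation looks like this (manifest URL hypothetical):

  $ repo init -u https://example.com/manifest.git -b main --depth=1
  $ repo sync -c -j8    # -c: fetch only the current branch; -j8: eight parallel jobs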

msgodel•15h ago
git submodules have a bad ux but it's certainly not worse than Android's custom tooling. I understand why they did it but in retrospect that seems like an obvious mistake to me.
WorldMaker•12h ago
It's interesting to point out that almost all of Microsoft's "augmentations" to git have been open source and many of them have made it into git upstream already and come "ready to configure" in git today ("conical" sparse checkouts, a lot of steady improvements to sparse checkouts, git commit-graph, subtle and not-so-subtle packfile improvements, reflog improvements, more). A lot of it is opt-in stuff because of backwards compatibility or extra overhead that small/medium-sized repos won't need, but so much of it is there to be used by anyone, not just the big corporations.

I think it is neat that at least one company with mega-repos is trying to lift all boats, not just their own.
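
A few of those opt-ins as they appear in stock git today (the exact versions in which each landed vary):

  $ git sparse-checkout init --cone        # the "conical" sparse checkout mentioned above
  $ git commit-graph write --reachable     # precompute commit metadata for faster history walks
  $ git config fetch.writeCommitGraph true # keep the commit-graph fresh on each fetch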

kccqzy•10h ago
Meta and Google both have been using mercurial and they have also been contributing back to upstream mercurial.
2d8a875f-39a2-4•20h ago
Always nice to read a new retelling of this old story.

TFA throws some shade at how "a single get of the office repo took some hours" then elides the fact that such an operation was practically impossible to do on git at all without creating a new file system (VFS). Perforce let users check out just the parts of a repo that they needed, so I assume most SD users did that instead of getting every app in the Office suite every time. VFS basically closes that gap on git ("VFS for Git only downloads objects as they are needed").

Perforce/SD were great for the time and for the centralised VCS use case, but the world has moved on I guess.

daemin•18h ago
Some companies have developed their own technology like VFS for use with Perforce, so you can check out the entire suite of applications but only pull the files when you try to access them in a specific way. This is a lot more important in game development, where massive binary source assets are stored alongside text files.

It uses the same technology that's built into Windows that the remote drive programs (probably) use.

Personally I kind of still want some sort of server-based VCS which can store your entire company's set of source without needing to keep the entire history locally when you check out something. But unfortunately git is still good enough to use on an ad-hoc basis between machines for me that I don't feel the need to set up a central server and CI/CD pipeline yet.

Also being able to stash, stage hunks, and interactively rebase commits are features that I like and work well with the way I work.

sixothree•15h ago
Doesn’t SVN let you check out and commit any folder or file at any depth of a project you choose? Maybe not the checkouts and commits, but the log history for a single subtree is something I miss from the SVN tooling.
gilbertbw•15h ago
Can you not achieve the log history on a subtree with `git log my/subfolder/`? Tools like TortoiseGit let you right click on a folder and view the log of changes to it.
daemin•14h ago
Yes it can, but the point is that in a git repo you store the entire history locally, so whenever you clone a repo, you clone its history on at least one branch.

So when you have a repo that's hundreds of GB in size, the entire history can be massive.

int_19h•13h ago
You can indeed. The problem with this strategy is that now you need to maintain the list of directories that needs to be checked out to build each project. And unless this is automated somehow, the documentation will gradually diverge from reality.
noitpmeder•16h ago
My firm still uses Perforce and I can't say anyone likes it at this point. You can almost see the light leave the eyes of new hires when you tell them we don't use git like the rest of the world.
kccqzy•16h ago
I cannot believe that new hires would be upset by the choice of version control software. They joined a new company after jumping through so many hoops, and it's on them to keep an open mind towards the processes and tools of the new company.
axus•16h ago
If companies don't cater to the whims of the youth, they'd have to hire... old people
xeromal•16h ago
e-gad!
Tostino•15h ago
But those cost so much more!
bongodongobob•13h ago
But they are Analysts and know corporate speak and are really good at filling their schedules with meetings! They must be so busy doing very meaningful work!
DanielHB•16h ago
I almost cried with happiness when we moved to git from SVN at my first job, after being there for 6 months.

They might not be upset in the first few weeks, but after a month or so they will be familiar with the pain.

kccqzy•11h ago
Oh a month is definitely enough time.
mattl•15h ago
I worked with someone who was surprised the company didn’t use Bitbucket and Discord. They were unhappy about both.
evilduck•15h ago
Discord I get, at least from a community or network effect, but Bitbucket? I can’t figure out why anyone but a CTO looking to save a buck would prefer Bitbucket.
mattl•14h ago
I cannot imagine many jobs use Discord over Slack/Teams unless they’re gaming related. This was not a gaming related job.
connicpu•14h ago
We use BitBucket where I work. Due to certain export regulations it's simpler for us to keep as many services as possible on-prem if they're going to contain any of our intellectual property, so BitBucket Server it is. There are other options of course, but all of the cloud solutions were off the table.
tough•14h ago
sorry for the tangent, but how do you deal with AI?
Kwpolska•13h ago
Why would you expect them to? It's really easy to live without AI.
tough•11h ago
nothing prevents them from running a GPU locally or on their own infra.

I was asking because I wonder what enterprises that want to both use AI (like LLMs) in their workflows and keep their data and pipelines 100% air-gapped and self-owned are doing right now.

Feels to me like one of the few areas where you could compete with the big labs; might be wrong

bigstrat2003•13h ago
Presumably they don't use it.
connicpu•5h ago
External AI services are banned; local or otherwise on-prem models are allowed. We're currently experimenting with some kind of Llama instance running on one of our servers, but I personally don't use it much.
const_cast•10h ago
I actually quite like the interface of Bitbucket. I think it's better, in a lot of ways, than GitLab and GitHub.

What I hate about Bitbucket is how stagnant it is.

Marsymars•14h ago
I feel like I’ve got an open mind towards processes and tools; the problem with a company using anything other than Git at this point is that unless they have a good explanation for it, it’s not going to be an indicator that the company compared the relative merits of VCS systems and chose something other than Git - it’s going to be an indicator that the company doesn’t have the bandwidth or political will to modernize legacy processes.
tough•14h ago
maybe they're on bazel?
kccqzy•11h ago
Well bazel is not a tool for version control.
tough•11h ago
damnit i was thinking jujutsu and got owned lol https://github.com/jj-vcs/jj
kccqzy•11h ago
Yeah but as a new hire, one doesn't yet know whether there is a good explanation for using a non-git tool. It takes time to figure that out.

A legacy tool might be bad, or it might be very good but just unpopular. A company that devotes political will to modernize for the sake of modernizing is the kind of craziness we get in the JS ecosystem.

jayd16•13h ago
A craftsman appreciates good tools.
kccqzy•11h ago
Is git a good tool then? Not necessarily. Some still think hg is better. Others think newer tools like jj are even better while being git compatible.
int_19h•13h ago
Perforce is sufficiently idiosyncratic that it's kinda annoying even when you remember the likes of SVN. Coming to it from Git is a whole world of pain.
inglor•11h ago
The problem is that you come to a prestigious place like Microsoft and end up using horrible outdated software.

Credit where credit is due: during my time at Excel we did improve things a lot (migration from Script# to TypeScript, migration from Source Depot to git, a shorter dev loop, better tooling, etc.), and a large chunk of development time was spent on developer tooling/happiness.

But it does suck to have to go to one of the old places and use Source Depot and `osubmit` (the "make a change" tool), and then click through 16 popups on the "happy path" to submit your patch for review (also done in a weird Windows GUI review tool).

Git was quite the improvement :D

filoleg•9h ago
> I cannot believe that new hires would be upset by the choice of version control software.

I can, if the version control software is just not up to standards.

I absolutely didn't mind using Mercurial/hg, even though I literally hadn't touched it until that point and knew nothing about it, because it is actually pretty good. I like it more than git now.

Git is a decent option that most people would be familiar with; you can't be upset about that either.

On the other hand, Source Depot sucked badly; it felt like I had to fight against it the entire time. It wasn't that I was upset because it was unfamiliar to me. In fact, the more familiar I got with it, the more I disliked it.

2d8a875f-39a2-4•15h ago
Yeah it's an issue for new devs for sure. TFA even makes the point, "A lot of people felt refreshed by having better transferable skills to the industry. Our onboarding times were slashed by half".
tom_•12h ago
Interesting to hear it was so much of a problem in terms of onboarding time. Maybe Source Depot was particularly weird, and/or MS were using it in a way that made things particularly complicated? Perforce has never felt especially difficult to use to me, and programmers never seem to have any difficulty with it. Artists and designers seem to pick it up quite quickly too. (By and large, in contrast to programmers, they are less in the habit of putting up with the git style of shit.)
chokolad•7h ago
> Interesting to hear it was so much of a problem in terms of onboarding time. Maybe Source Depot was particularly weird, and/or MS were using it in a way that made things particularly complicated?

It was not. It was literally a fork of Perforce with the executable renamed from p4 to sd.exe. The command line was pretty much identical.

Degorath•8h ago
Can't say anything about perforce as I've never used it, but I'd give my left nut to get Google's Piper instead of git at work :)
StephenAmar•6h ago
I concur. I miss citc & fig.
Arainach•1h ago
Piper's syntax is Perforce syntax.

I moved to Google from Microsoft and back when employee orientation involved going to Mountain View and going into labs to learn the basics, it was amusing to see fresh college hires confused at not-git while I sat down and said "It's Source Depot, I know this!"[1]

[1] https://www.youtube.com/watch?v=dFUlAQZB9Ng

swsieber•13h ago
I'm a bit surprised git doesn't offer a way to checkout only specific parts of the git tree to be honest. It seems like it'd be pretty easy to graft on with an intermediate service that understands object files, etc.
jjmarr•12h ago
It's existed for a while. Partial clones and LFS.

https://git-scm.com/docs/partial-clone
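Combining a partial clone with sparse checkout gets close to checking out only specific parts of the tree (a sketch; the URL, branch name, and path are placeholders):

    # fetch history without blobs and without populating the worktree
    git clone --filter=blob:none --no-checkout https://example.com/big-repo.git
    cd big-repo

    # materialize just one subtree, then check out
    git sparse-checkout set --cone apps/excel
    git checkout main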

swsieber•10h ago
Thanks!
socalgal2•7h ago
VFS does not replace Perforce. Most AAA game companies still use Perforce. In particular, they need locks on assets so two people don't edit the same asset at the same time, ending up with an unmergeable change and wasted time as one artist has to throw their work away.
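Perforce handles this with the exclusive-open (+l) filetype modifier, typically set server-side in the typemap; Git LFS has a rougher analogue in file locking (a sketch; depot paths and file names are placeholders):

    # p4 typemap entries marking binary assets exclusive-open:
    #   binary+l //depot/....psd
    #   binary+l //depot/....fbx

    # Git LFS equivalent: take an explicit lock before editing
    git lfs lock assets/hero.psd
    git lfs unlock assets/hero.psd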
0points•18h ago
Having used VSS in the 90s myself, I was surprised it wasn't even mentioned.

VSS (Visual SourceSafe) was Microsoft's own version control product, unlike Source Depot, which was licensed from Perforce.

tamlin•17h ago
Yes, I used VSS as a solo developer in the 90s. It was a revelation at the time. I encountered other VCS systems in grad school (RCS, CVS).

I started a job at MSFT in 2004 and I recall someone explaining that VSS was unsafe and prone to corruption. No idea if that was true, or just lore, but it wasn't an option for work anyway.

mmastrac•17h ago
We used to call it Visual Source Unsafe because it was corrupting repos all the time.
skipkey•16h ago
As I recall, one problem was you got silent corruption if you ran out of disk space during certain operations, and there were things that took significantly more disk space while in flight than when finished, so you wouldn’t even know.

When I was at Microsoft, Source Depot was the nicer of the two version control systems I had to use. The other, Source Library Manager, was much worse.

meepmorp•9h ago
iirc, we called it visual source shred

kinda nice to know it wasn't just our experience

sumtechguy•17h ago
The integration between SourceSafe and all of the tools was pretty cool back then. Nothing else really had that level of integration at the time. However, VSS was seriously flaky. It would corrupt randomly for no real reason. Daily backups were always being restored in my workplace. Then they picked PVCS. At least it didn't corrupt itself.

I think VSS was fine if you used it on a local machine. If you put it on a network drive, things would just flake out. It also got progressively worse as newer versions came out. Nice GUI, very straightforward to teach someone how to use it (check out a file, change it, check it in, like a library book), random corruptions: that about sums up VSS. That checkin/checkout model seems simpler for people to grasp. The virtual/branch systems most of the other ones use are kind of a mental block for many until they grok it.

marcosdumay•16h ago
> No idea if that was true

It's an absurd understatement. The only people that seriously used VSS and didn't see any corruption were the people that didn't look at their code history.

smithkl42•14h ago
I used VSS for a few years back in the late '90s and early 2000s. It was better than nothing, barely, but it was very slow and very network-intensive (think MS Access rather than SQL), it had very poor merge primitives (when you checked out a file, nobody else could change it), and yes, it was exceedingly prone to corruption. A couple of times we just had to throw away history and start over.
electroly•14h ago
SourceSafe had a great visual merge tool. You could enable multiple checkouts. VSS had tons of real issues but not enabling multiple checkouts was a pain that companies inflicted on themselves. I still miss SourceSafe's merge tool sometimes.
anonymars•6h ago
Have you used Visual Studio's git integration? (Note that you could just kick off the merge elsewhere and use VS to manage the conflicts, then commit from outside again. Etc.)
wvenable•13h ago
I was mandated to use VSS in a university course in the late 90s -- one course, one project -- and we still managed to corrupt it.
larrywright•15h ago
I used VSS in the 90s as well, it was a nightmare when working in a team. As I recall, Microsoft themselves did not use VSS internally, at least not for the majority of things.
hpratt4•11h ago
That’s correct. Before SD, Microsoft orgs (at least Office and Windows; I assume others too) used an internal tool called SLM (“slime”); Raymond Chen has blogged about it, in passing: https://devblogs.microsoft.com/oldnewthing/20180122-00/?p=97...
chiph•8h ago
VSS was picked up via the acquisition of One Tree Software in Raleigh. Their product was SourceSafe, and the "Visual" part was added when it was bundled with their other developer tools (Visual C, Visual Basic, etc). Prior to that Microsoft sold a version control product called "Microsoft Delta" which was expensive and awful and wasn't supported on NT.

One of the people who joined Microsoft via the acquisition was Brian Harry, who led the development of Team Foundation Version Control (part of Team Foundation Server - TFS) which used SQL Server for its storage. A huge improvement in manageability and reliability over VSS. I think Brian is retired now - his blog at Microsoft is no longer being updated.

From my time using VSS, I seem to recall a big source of corruption was its use of network file locking over SMB. If there was a network glitch (common in those days) you'd have to repair your repository. We set up an overnight batch job to run the repair so we could be productive in the mornings.

EvanAnderson•6h ago
> ...I seem to recall a big source of corruption was it's use of network file locking over SMB...

Shared database files (of any kind) over SMB... shudder Those were such bad days.

0points•2h ago
Oh, TIL! Thanks for adding that to the story.

Indeed my experiences of vss was also not amazing and certainly got corrupted files too.

ThinkBeat•17h ago
What were the biggest hurdles? Where did Git fall short? How did you structure the repo(s)? Were there many artifacts that went into integration with Git LFS?
airstrike•17h ago
> Today, as I type these words, I work at Snowflake. Snowflake has around ~2,000 engineers. When I was in Office, Office alone was around ~4,000 engineers.

I'm sorry, what?! 4,000 engineers doing what, exactly?

Excel turns 40 this year and has changed very little in those four decades. I can't imagine you need 4,000 engineers just to keep it backwards compatible.

In the meantime we've seen entire companies built with a ragtag team of hungry devs.

throwaway889900•14h ago
Thank goodness I don't have to use IBM's Rational Team Concert anymore. Even just thinking about it makes me shudder.
mosdl•14h ago
It was a great tool for losing changes!
danielodievich•11h ago
I want to thank the dev leads who trained this green-behind-the-ears engineer in the mysteries of Source Depot. Once I understood it, it was quite illuminating. I am glad we only had a dependency on WinCE and IE, so the clone only took 20 minutes instead of days. I don't remember your names, but I remember your willingness to step up and help onboard a new person so they could start being productive. I pay this attitude forward with new hires on my team no matter where I go.
b0a04gl•10h ago
Funny how most folks remember the git migration as a tech win, but honestly the real unlock was devs finally having control over their own flow: no more waiting on sync windows, no more asking leads for branch access. Suddenly everyone could move fast without stepping on each other. That shift did more for morale than any productivity dashboard ever could. Git didn't just fix tooling, it fixed trust in the dev loop.
bariumbitmap•9h ago
> In the early 2000s, Microsoft faced a dilemma. Windows was growing enormously complex, with millions of lines of code that needed versioning. Git? Didn’t exist. SVN? Barely crawling out of CVS’s shadow.

I wonder if Microsoft ever considered using BitKeeper, a commercial product that began development in 1998 and had its public release in 2000. Maybe centralized systems like Perforce were the norm and a DVCS like BitKeeper was considered strange or unproven?

wslh•9h ago
There was SourceSafe (VSS) around that time and TFVC afterwards.
jeffbee•7h ago
One thing I find annoying about these Perforce hate stories: yes, it's awkward to branch in Perforce. It is also the case that there is no need to ever create a branch for feature development when you use Perforce. It's like complaining that it's hard to grate cheese with a trumpet. That just isn't applicable.