frontpage.

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
1•helloplanets•1m ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•9m ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•10m ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•12m ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•12m ago•0 comments

Sony BMG copy protection rootkit scandal

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal
1•basilikum•15m ago•0 comments

The Future of Systems

https://novlabs.ai/mission/
2•tekbog•15m ago•1 comments

NASA now allowing astronauts to bring their smartphones on space missions

https://twitter.com/NASAAdmin/status/2019259382962307393
2•gbugniot•20m ago•0 comments

Claude Code Is the Inflection Point

https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point
3•throwaw12•21m ago•1 comments

Show HN: MicroClaw – Agentic AI Assistant for Telegram, Built in Rust

https://github.com/microclaw/microclaw
1•everettjf•22m ago•2 comments

Show HN: Omni-BLAS – 4x faster matrix multiplication via Monte Carlo sampling

https://github.com/AleatorAI/OMNI-BLAS
1•LowSpecEng•22m ago•1 comments

The AI-Ready Software Developer: Conclusion – Same Game, Different Dice

https://codemanship.wordpress.com/2026/01/05/the-ai-ready-software-developer-conclusion-same-game...
1•lifeisstillgood•24m ago•0 comments

AI Agent Automates Google Stock Analysis from Financial Reports

https://pardusai.org/view/54c6646b9e273bbe103b76256a91a7f30da624062a8a6eeb16febfe403efd078
1•JasonHEIN•28m ago•0 comments

Voxtral Realtime 4B Pure C Implementation

https://github.com/antirez/voxtral.c
2•andreabat•30m ago•1 comments

I Was Trapped in Chinese Mafia Crypto Slavery [video]

https://www.youtube.com/watch?v=zOcNaWmmn0A
2•mgh2•36m ago•0 comments

U.S. CBP Reported Employee Arrests (FY2020 – FYTD)

https://www.cbp.gov/newsroom/stats/reported-employee-arrests
1•ludicrousdispla•38m ago•0 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•43m ago•1 comments

Show HN: SVGV – A Real-Time Vector Video Format for Budget Hardware

https://github.com/thealidev/VectorVision-SVGV
1•thealidev•45m ago•0 comments

Study of 150 developers shows AI generated code no harder to maintain long term

https://www.youtube.com/watch?v=b9EbCb5A408
1•lifeisstillgood•45m ago•0 comments

Spotify now requires premium accounts for developer mode API access

https://www.neowin.net/news/spotify-now-requires-premium-accounts-for-developer-mode-api-access/
1•bundie•48m ago•0 comments

When Albert Einstein Moved to Princeton

https://twitter.com/Math_files/status/2020017485815456224
1•keepamovin•49m ago•0 comments

Agents.md as a Dark Signal

https://joshmock.com/post/2026-agents-md-as-a-dark-signal/
2•birdculture•51m ago•0 comments

System time, clocks, and their syncing in macOS

https://eclecticlight.co/2025/05/21/system-time-clocks-and-their-syncing-in-macos/
1•fanf2•52m ago•0 comments

McCLIM and 7GUIs – Part 1: The Counter

https://turtleware.eu/posts/McCLIM-and-7GUIs---Part-1-The-Counter.html
2•ramenbytes•55m ago•0 comments

So whats the next word, then? Almost-no-math intro to transformer models

https://matthias-kainer.de/blog/posts/so-whats-the-next-word-then-/
1•oesimania•56m ago•0 comments

Ed Zitron: The Hater's Guide to Microsoft

https://bsky.app/profile/edzitron.com/post/3me7ibeym2c2n
2•vintagedave•59m ago•1 comments

UK infants ill after drinking contaminated baby formula of Nestle and Danone

https://www.bbc.com/news/articles/c931rxnwn3lo
1•__natty__•1h ago•0 comments

Show HN: Android-based audio player for seniors – Homer Audio Player

https://homeraudioplayer.app
3•cinusek•1h ago•2 comments

Starter Template for Ory Kratos

https://github.com/Samuelk0nrad/docker-ory
1•samuel_0xK•1h ago•0 comments

LLMs are powerful, but enterprises are deterministic by nature

3•prateekdalal•1h ago•0 comments

Removed Rust to gain speed

https://www.prisma.io/blog/announcing-prisma-orm-7-0-0
71•2233•2mo ago

Comments

baranul•2mo ago
This is a decent example of not buying into, getting pulled along by, or being forced into corporate-pushed hype, or of eliminating one's options. They re-evaluated what programming language was best for their situation, which meant removing Rust and using something else. It turned out they actually gained greater user contributions, simplicity, efficiency, and even speed.
nine_k•2mo ago
> what programming language was best for their situation, which meant removing Rust and using something else

This is correct, but I'd say that the key was removing Rust and not using something else. Fewer moving parts, fewer JS runtime boundaries to cross, no need to make certain that the GC won't interfere, etc.

Also, basically any rewrite is a chance to drop entrenched decisions that proved to be not as great. Rewriting a large enough part of Prisma likely allowed them to address quite a few pieces of tech debt that were not comfortable to address in small incremental changes. Consider "Prisma requires ~98% fewer types to evaluate a schema. Prisma requires ~45% fewer types for query evaluation.": this must have required quite a bit of rework of the whole thing. Removing Rust in the process was likely almost a footnote.

thunky•2mo ago
> This is a decent example of not buying into, getting pulled along by, or being forced into corporate-pushed hype

It seems that maybe they did get hyped into Rust, because it's not clear why they believed Rust would make their JavaScript tool easier to develop, simpler, or more efficient in the first place.

satvikpendem•2mo ago
There are examples where that's true, like Biome or oxc.
koakuma-chan•2mo ago
Biome and oxc are developer tools. I don't know why in the world they would do this, but it sounds like they were using Rust at runtime to interact with the database?
solidsnack9000•2mo ago
At one time, they were targeting a much broader array of languages -- it wasn't specifically a JavaScript tool:

https://www.youtube.com/watch?v=1zSh0zYLTIE

cyberax•2mo ago
PSA: detached floating panels are pure cancer. Avoid them.

I literally can't scroll through your website.

jibal•2mo ago
So tell them ... posting it here is useless; the article was written a couple of weeks ago and they may not even know that it made it to HN.
ckwalsh•2mo ago
I'm in the "Pro-Rust" camp (not fanboy level "everything must be rewritten in rust", but "the world would be a better place if more stuff used Rust"), and I love this post.

They saw some benefits to Rust, tried it, and continued to measure. They identified that the TypeScript/Rust language boundary was slow, and noticed an effect on their contributions. After further research, they realized there was a faster way that didn't need the Rust dependency.

Good stuff, good explanation!

burnt-resistor•2mo ago
> I'm in the "Pro-Rust" camp (not fanboy level "everything must be rewritten in rust", but "the world would be a better place if more stuff used Rust")

Techno-religiosity is irrational and unprofessional, but that's some weak, eye-rolling both-sidesism.

The world would be a better place™ if more stuff used better and more formal tools and methods to prove that code is correct and bug-free. This is easier in some languages than others, but still, there are a lot of formal verification tools that aren't used enough, and methodology patterns and attitudes that are missing from crucial projects and regular use. Effective safety and security assurance of software takes effort and understanding that a huge fraction of programmers who aren't software engineers don't want to undertake or know anything about. This needs to change, and it is defensible, marketable expertise that needs to be appreciated and cannot be replaced by AI anytime soon. There's no "easy" button, but there are a lot of "lazy" "buttons" that don't function as intended.

9rx•2mo ago
I'm not sure your characterization is all that accurate.

Originally, they thought they could build a product that worked across many languages. That necessitated a "lowest common denominator" language, a niche that has always been strangely lacking in choice, to provide an integratable core. Zig had only been announced a few months earlier, so it wasn't really ready to be a contender. For all intents and purposes, C, C++, and Rust were the only options.

Once the product made it to market, it became clear that the TypeScript ecosystem was the only one buying in. Upon recognizing the business failure, the "multi-language" core didn't make sense anymore. It was a flawed business model that forced them into using Rust (it could have been C or C++ instead, but yeah), and once they gave up on that business model they understood that it would have been better to have written it in TypeScript in the first place — and it no doubt would have been, if it weren't for the lofty, pie-in-the-sky dreams of trying to make it more than the market was willing to accept. Now they got the opportunity to actually do it.

quotemstr•2mo ago
Good. Rust is fine, but it makes you pay a complexity tax for manual memory management that you just don't need most of the time. In almost all real world cases, a GC is fine. TypeScript is a memory-safe language, just like Rust, and I can't imagine a database ORM of all things needing manual memory management to get good performance. (Talking to the database, not memory management, is the bottleneck!)
amluto•2mo ago
I don’t think the problems they were dealing with had much to do with any of those properties of Rust. Their issue seems to have been that they weren’t using native JavaScript/TypeScript and that their situation was improved by using native TypeScript.

If they had been using something like Java or Go or Haskell, etc, they may well have had even more downsides.

theusus•2mo ago
> manual memory management

Rust has automatic memory management.

> Complexity tax

Could you be more specific?

honeycrispy•2mo ago
The trait/type system can get pretty complex. Advanced Rust doesn't inherit like typical OOP; you build on generics with trait constraints, and that is a much more complex and unusual thing to model in the mind. Granted, you get used to it.
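
To make that concrete, here is a minimal sketch (the trait and types are invented purely for illustration): instead of a base class, behaviour lives in a trait, and generic code constrains its type parameters with it.

    // Illustrative sketch: trait constraints instead of an inheritance hierarchy.
    trait Area {
        fn area(&self) -> f64;
    }

    struct Circle { radius: f64 }
    struct Square { side: f64 }

    impl Area for Circle {
        fn area(&self) -> f64 { std::f64::consts::PI * self.radius * self.radius }
    }

    impl Area for Square {
        fn area(&self) -> f64 { self.side * self.side }
    }

    // Generic over any type satisfying the `Area` bound; no base class involved,
    // and the call is monomorphized at compile time rather than virtually dispatched.
    fn total_area<T: Area>(shapes: &[T]) -> f64 {
        shapes.iter().map(|s| s.area()).sum()
    }

    fn main() {
        let squares = vec![Square { side: 1.0 }, Square { side: 2.0 }];
        println!("{}", total_area(&squares));
    }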
burnt-resistor•2mo ago
OOP inheritance is an anti-pattern and a hype train of the '90s/'00s, especially multiple inheritance. Especially the codebases where they create extremely verbose factories and abstract classes for every damn thing ... Java, C++, and Hack (PHP-family) shops are frequently guilty of this.

Duck typing and selective traits/protocols are the way to go™. Go, Rust, Erlang+Elixir... they're sane.

What I don't like about Rust is the inability to override blanket trait implementations, and the inability to provide multiple, very narrow, semi-blanket implementations.

Finally: People who can't/don't want to learn multiple programming language platform paradigms probably should turn in their professional software engineer cards. ;)

quotemstr•2mo ago
> Rust has automatic memory management

Sure, if you define "automatic memory management" in a bespoke way.

> Could you be more specific?

The lifetime system, one of the most complex and challenging parts of the Rust programming model, exists only so that Rust can combine manual memory management with memory safety. Relax the requirement to manage memory manually and you can safely delete lifetimes and end up with a much simpler language.

Have you ever written a Java or Python program?
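
For anyone who hasn't hit this yet, a small, contrived sketch of the kind of annotation being discussed (the classic example from the Rust book; the function itself is made up):

    // The explicit lifetime 'a ties the returned reference to the inputs, so the
    // compiler can prove it never outlives the strings it points into -- all
    // without a GC or a defensive copy.
    fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
        if x.len() > y.len() { x } else { y }
    }

    fn main() {
        let a = String::from("prisma");
        let b = String::from("orm");
        // In a GC'd language you would just return one of the strings and let the
        // collector decide when it dies; here the 'a contract does that reasoning
        // at compile time instead.
        println!("{}", longest(&a, &b));
    }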

ameliaquining•2mo ago
An important part of the story here, not mentioned in this post but noted elsewhere (https://www.prisma.io/blog/from-rust-to-typescript-a-new-cha...), is that they gave up on offering client libraries for languages other than JavaScript/TypeScript. Doing this while mostly sharing a single implementation among all languages was much of the original reason to use Rust, because Rust is a good "lowest common denominator" language for FFI and TypeScript is not; it wasn't entirely about performance. If they hadn't tried to do this, they would likely never have used Rust; if they hadn't given up on it, they would likely still be using Rust.
Sytten•2mo ago
Yeah, the whole point of Prisma 2 was to be multi-language and multi-DB with a Rust server in between you and the DB. There are a lot of advantages to that approach in the enterprise: you can do better access control, stats, connection pooling, etc. (Formal is a YC company in that space). Prisma 1 was a Scala implementation of that vision.

Anyway, end of an era. There were a couple of community bindings in Python and Java that are now dead, I assume. I was heavily invested in Prisma around 4-5 years ago; funnily enough, that is what got me started on my Rust journey.

hresvelgr•2mo ago
I'm sure you could get even greater speed by removing Prisma. All you need is a migration tool and a database connection. The most recent example in my work: removing an ORM resulted in all of our engineers, particularly the juniors, becoming Postgres wizards.
PaulRobinson•2mo ago
Congratulations, you have now increased the cognitive load to be productive on your team and increased the SQL injection attack surface for your apps!

I jest, but ORMs exist for a reason. And if I were a new senior or principal on your team I’d be worried that there was now an expectation for a junior to be a wizard at anything, even more so that thing being a rich and complex RDBMS toolchain that has more potential guns pointing at feet than anything else in the stack.

I spent many years cutting Rails apps, and while ActiveRecord was rarely my favourite part of those apps, it gave us so much batteries-included functionality that we realised it was best to embrace it. If AR was slow or we had to jump through hoops, that suggested the data model was wrong, not that we should dump AR - we'd go apply some DDD and CQRS style thinking and consider a view model and how to populate it asynchronously.

jakewins•2mo ago
I think this needs some nuance - this is definitely true in some domains.

In most of the domains I've worked in it was the other way around: using an ORM didn't mean we could skip learning SQL, it added an additional thing to learn and consider.

In recent years, writing SQLAlchemy or the Django ORM, the teams I was on would write queries in SQL and then spend the rest of the day trying to make the ORM reproduce them. At some point it became clear how silly that was and we stopped using the ORMs.

Maybe it's got to do with aggregate-heavy domains (I particularly remember windowing aggregates being a pain in SQLAlchemy?), or large datasets (again from memory: a 50-terabyte Postgres machine where the db would go down if an ORM generated anything that scanned the heap of the big data tables), or highly concurrent workloads where careful use of SELECT FOR UPDATE was needed.

tasuki•2mo ago
> In recent years, writing SQLAlchemy or the Django ORM, the teams I was on would write queries in SQL and then spend the rest of the day trying to make the ORM reproduce them.

Ah yes, good times! Not Django for me but similar general idea. I'm not a big fan of ORMs: give me a type safe query and I'm happy!

pjmlp•2mo ago
SQL injection is only a thing for those careless enough to ever let string concatenation go through pull requests.

If it isn't using query parameters, straight rejection, no ifs and buts.

Naturally, if proper code review isn't a thing, then anything goes, and using an ORM won't help much either.
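
To make the rule concrete, a sketch in Rust with sqlx (since Rust is the language under discussion; the `users` table, connection string, and crate features -- sqlx with Postgres plus a Tokio runtime -- are all assumptions for illustration). The same principle applies to any driver that offers bind parameters or placeholders.

    // Hypothetical example of "query parameters or straight rejection".
    use sqlx::postgres::PgPoolOptions;

    #[tokio::main]
    async fn main() -> Result<(), sqlx::Error> {
        let pool = PgPoolOptions::new()
            .max_connections(5)
            .connect("postgres://localhost/app") // assumed connection string
            .await?;

        let email = "alice@example.com"; // pretend this arrived from user input

        // The rejected pattern: building SQL by string concatenation.
        // let q = format!("SELECT id FROM users WHERE email = '{}'", email);

        // The accepted pattern: a bind parameter; the value travels out of band
        // and can never be interpreted as SQL.
        let (id,) = sqlx::query_as::<_, (i64,)>("SELECT id FROM users WHERE email = $1")
            .bind(email)
            .fetch_one(&pool)
            .await?;

        println!("user id: {id}");
        Ok(())
    }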

9rx•2mo ago
Brainfuck also exists for a reason. That doesn't imply that you should use it.
hresvelgr•2mo ago
> Congratulations, you have now increased the cognitive load to be productive on your team and increased the SQL injection attack surface for your apps!

Maybe I am speaking from too much experience but writing SQL is second-nature to me and I would wager my team feels similarly. Perhaps we are an anomaly. Secondly, most if not all SQL connector libraries have a query interface with all the usual injection vectors mitigated. Not saying it's impossible to break through but these are the same connector libraries even the ORMs use.

> ORMs exist for a reason. And if I were a new senior or principal on your team I’d be worried that there was now an expectation for a junior to be a wizard at anything

ORMs exist to hide the complexity of the RDBMS. Why would any engineer want to make arguably the most critical aspect of every single IT business opaque? ORMs may imply safety and ease, but in my experience they foster a culture with a tacit fear of SQL. Sounds a bit dramatic, but this has been a surprisingly consistent experience.

AnotherGoodName•2mo ago
Using an ORM and escape-hatching to raw SQL is pretty much industry-standard practice these days, and definitely better than no ORM imho. I have code that's basically a lot of

    result = orm.query({raw sql}, parameters)

It's as optimal as any other raw SQL query. Now that may make some people scream "why use an ORM at all then!!!" but in the meantime:

I have wonderful and trivially configurable db connection state management.

I have the ability to do things really simply when I want to; I can still use the ORM magic for quick prototyping or when I know the query is actually trivial object fetching.

And the mapping of results into an object that matches the shape of the query is definitely nicer with a good ORM library than with any raw SQL library I've used.
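
That last point is roughly what sqlx's `FromRow` derive gives you on the Rust side, for comparison (the struct and schema below are invented, and this assumes a `PgPool` built elsewhere): hand-written SQL goes in, a typed value comes out.

    // Sketch of "raw SQL in, typed object out". Not Prisma's API -- just the
    // analogous pattern in Rust with sqlx, shown for illustration.
    use sqlx::PgPool;

    #[derive(Debug, sqlx::FromRow)]
    struct User {
        id: i64,
        name: String,
        active: bool,
    }

    async fn active_users(pool: &PgPool) -> Result<Vec<User>, sqlx::Error> {
        // The escape hatch: a hand-written query with a bind parameter...
        sqlx::query_as::<_, User>("SELECT id, name, active FROM users WHERE active = $1")
            .bind(true)
            // ...while the row-to-struct mapping still comes from the library.
            .fetch_all(pool)
            .await
    }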

RedShift1•2mo ago
Every project I've come across that uses an ORM has terrible database design. All columns nullable, missing foreign key indexes, doing things in application code that could easily be done by triggers (fields like created, modified, ...), wrong datatypes (varchar(n) all over the place, just wwwhhhhyyy, floats for money, ...), using sentinel values (this one time, at bandcamp, I came across a datetime field that used a sentinel value and it only worked because of two datetime handling bugs (so two wrongs did make a right) and the server being in the UTC timezone), and the list goes on and on...

I think this happens because ORMs make you treat the database as a dumb datastore and hence the poor schema.

AnotherGoodName•2mo ago
Honestly, database schema management doesn't scale particularly well under any framework, and I've seen those issues start to crop up in every org once you have enough devs constantly changing the schema. It happens with ORMs and with raw SQL.

When that happens you really, really should look into the much-maligned NoSQL alternatives. Similarly to the hatred ORMs get, NoSQL data stores actually have some huge benefits, especially at the point where db schema maintenance starts to break down. I.e., who cares if someone adds a new field to the FB Newsfeed object in development when ultimately it's a key-value store fetched with GraphQL queries? The only person it'll affect is the developer who added that field; no one else will even notice the new key-value object unless they fetch it. There's no way to make SQL work at all at scale (scale in terms of the number of devs messing with the schema), but a key-value store with GraphQL works really well there.

Small orgs where you're the senior eng and can keep the schema in check on review? Use an ORM to a traditional db, escape hatch to raw SQL when needed, keep a close eye on any schema changes.

Big orgs where there are tons of teams wanting to change things at high velocity? I have no idea how to make either SQL or ORMs work in these cases. I do know from experience how to make GraphQL and a key-value store work well though, and that's where the above issues happen in my experience. It's really not an ORM-specific issue. I suggest going down the NoSQL route in those cases.

RedShift1•2mo ago
NoSQL is even worse, data gets duplicated and then forgotten, so it doesn't get updated correctly, or somebody names a field "mail" and another person names it "email" and so on...

There is zero guarantee that whatever you ask the database for contains anything valid, so your code gets littered with null and undefined checks. And if you ask for, say, a field "color", what is it going to contain? A hex value? rgb(), rgba(), an integer? So you need to check that too.

In my experience NoSQL is even worse, they are literally data dumps (as in garbage dump).

pjmlp•2mo ago
Same here. I am a big advocate of knowing your SQL, and of stored procedures; no need to waste network traffic on data that is never going to be shown in the application.
anon-3988•2mo ago
I claim that 99.9999% of software should be written in a GC language. Very, very, very few problems actually require memory management. It is simply not part of the business requirements. That said, how come the language closest to this criterion is Go, except it hasn't learned about clean water (algebraic data types)?
mwkaufma•2mo ago
Games are 0.00001% of software? Web browsers? Operating systems?
theamk•2mo ago
Large parts of web browsers (like the entire Firefox UI) are written in JavaScript already.

Operating systems _should_ use GC languages more. Sure, video needs to have absolute max performance... but there is no reason my keyboard or mouse driver should be in a non-GC language.

BenoitP•2mo ago
Most programs should be written in GC'd languages, but not this one.

Except in a few cases, GCs introduce small stop-the-world pauses. Even 15ms pauses would still be very noticeable.

xigoi•2mo ago
There are GC algorithms that don’t require stopping the world.
baranul•2mo ago
Might want to read up on Ponylang: a GC'd FOSS language without stop-the-world pauses, demonstrating that it's possible. It should also be pointed out that there are a number of proprietary solutions that claim GC with no pauses. Unfortunately, if you're coming from the more common C-family languages, Ponylang may take some getting used to, given the actor model and different syntax.
oskarkk•2mo ago
> Large parts of web browsers (like the entire Firefox UI) are written in JavaScript already

Is UI even a large part of Firefox? I imagine that the rendering engine, JS engine, networking, etc, are many times larger than UI.

kstrauser•2mo ago
Meanwhile, earlier this week we had a big conversation about the skyrocketing costs of RAM. While it's technically true that GC doesn't mean a program has to hold allocations longer than the equivalent non-GC code, I've personally never seen a GC'ed program not using multiple times as much RAM.

And then you have non-GC languages like Rust where you're hypothetically managing memory yourself, in the form of keeping the borrow checker happy, but you never see free()/malloc() in a developer's own code. It might as well be GC from the programmer's POV.

9rx•2mo ago
> And then you have non-GC languages like Rust

Rust is a GC language: https://doc.rust-lang.org/std/rc/struct.Rc.html, https://doc.rust-lang.org/std/sync/struct.Arc.html
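
For anyone following along, a tiny sketch of what those two types do (whether reference counting qualifies as "GC" is exactly the dispute below):

    // Rc<T>: shared ownership via reference counting in std; Arc<T> is the
    // atomically counted, thread-safe sibling.
    use std::rc::Rc;

    fn main() {
        let config = Rc::new(String::from("shared settings"));
        let reader_a = Rc::clone(&config); // bumps the count, no deep copy
        let reader_b = Rc::clone(&config);

        println!("owners: {}", Rc::strong_count(&config)); // 3

        drop(reader_a);
        drop(reader_b);
        println!("owners: {}", Rc::strong_count(&config)); // 1
        // The String is freed when the last Rc goes out of scope; the program
        // never calls free() on it or drops it explicitly.
    }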

Ygg2•2mo ago
By that token so is C.

GC is a very fuzzy topic, but the overall trend is that a language is a GC language if GC is opt-out, not opt-in. And more strictly, it has to be a tracing GC.

9rx•2mo ago
You can add your own custom GC in C — you can add your own custom anything to any language; it's all just 1s and 0s at the end of the day — but it is not a feature provided by the language out of the box like in Rust. Not the same token at all. This is very different.
Ygg2•2mo ago
Well, that's mostly what Arc<T> and Rc<T> in Rust are. Optional add-ons.
9rx•2mo ago
...as part of the language. Hence it being a GC language.

Is this another case of "Rustaceans" randomly renaming things? There was that whole debacle where sum types bizarrely became enums, even though enums already had an established, different meaning, with all the sad talking past everyone else that followed. This is starting to look like that again.

aw1621107•2mo ago
I can't say I've heard of a commonly used definition of "GC language" that includes C++ and excludes C. If anything, my impression is that both C and C++ are usually held up as exemplars of non-GC languages.
9rx•2mo ago
C++ didn't add GC until relatively recently, to be fair. When people from 30 years ago get an idea stuck in their head they don't usually ever change their understanding even as life continues to march forward. This isn't limited to software. If you look around you'll regularly find people repeating all kinds of things that were true in the past even though things have changed. And fair enough. There is only so much time in the day. You can't possibly keep up to date on everything.
aw1621107•2mo ago
The thing is that the usual comparisons I'm thinking of generally focused on how much the languages in question rely on GC for practical use. C++11 didn't really move the needle much, if at all, in that respect compared to the typical languages on the other side of said comparisons.

Perhaps I happen to have been around different discussions than you?

9rx•2mo ago
> focused on how much the languages in question rely on GC for practical use.

That's quite nebulous. It should be quantified. But, while we wait for that, if we assume by that metric C++ is not a GC language today, but tomorrow C++ developers all collectively decide that all heap allocations are to depend on std::shared_ptr, then it must become a GC language.

But the language hasn't changed in any way. How can an attribute of the language change without any changes?

aw1621107•2mo ago
> That's quite nebulous. It should be quantified.

Perhaps, but I'm reluctant to speak more definitively since I don't consider myself an authority/expert in the field.

> But the language hasn't changed in any way. How can an attribute of the language change without any changes?

The reason I put in "for practical use" is that, pedantically speaking, no language actually requires GC - you "just" need to provision enough hardware (see: HFT firms' (ab)use of Java by disabling the GC and resetting programs/machines at the end of the day). That's not relevant for basically everyone, though, since practically speaking you usually want to bound resource use, and some languages rely on a GC to do that.

I guess "general" or "normal" might have been a better word than "practical" in that case. I didn't intend to claim that how programmers use a language affects whether it should be considered a GC language or not.

Ygg2•2mo ago
> ...as part of the language.

Which part? It's not available in no-std without alloc crate. You can write your own Arc.

Most crates don't have to use Arc/Rc.

> Is this another case of "Rustaceans" randomly renaming things?

No. This is a case of someone not having enough experience with Rust. Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.

9rx•2mo ago
> Which part?

The part that is detailed on rust-lang.org.

> It's not available in no-std without alloc crate.

no-std disables features. It does not remove them from existence. Rust's worldly presence continues to have GC even if you choose to disable it for your particular project.

> This is a case of someone not having enough experience with Rust.

Nah. Even rust-lang.org still confuses sum types and enums to this very day. How much more experienced with Rust can you get than someone versed enough in the language to write comprehensive, official documentation? This thought of yours doesn't work.

> Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.

What surface similarity does Pascal have to OO? It only has static dispatch. You've clearly not thought that one through.

Turbo Pascal has dynamic dispatch. Perhaps you've confused different languages because they happen to share similar names? That is at least starting to gain some surface similarity to OO. But message passing, of course, runs even deeper than just dynamic dispatch.

Your idea is not well conceived. Turbo Pascal having something that starts to show some very surface-level similarity to OO, but still a long way from being the real deal, isn't the same as Rust actually having GC. It is not a case of Rust having something that sort of looks kind of like GC. It literally has GC.

Ygg2•2mo ago
> no-std disables features. It does not remove them from existence.

It's the other way around; standard library adds features. Because Rust features are designed to be additive.

Look into it. `std` library is nothing more than Rust-lang provided `core`, `alloc` and `os` crates.

> Nah.

You don't seem to know how features work, how std is made, or how often Rc is encountered in the wild. It's hard to argue when you don't know the language you are discussing.

> Even rust-lang.org still confuses sum types and enums to this very day.

Rust lang is the starting point for new Rust programmers; why in the heck would they start philosophizing about a bikesheddy naming edge case?

That's like opening your car manual to see history and debates on what types of motors preceded your own, while you're trying to get the damn thing running again.

> What surface similarity does Pascal have to OO?

The dot operator, as in `struct.method`. The guy I was arguing with unironically told me that any language using the dot operator is OO, because the dot operator is a sign of accessing an object or a struct.

Much like you, he had very inflexible thoughts on what makes or does not make something OO; it reminds me so much of you saying C++ is a GC-language.

> Your idea is not well conceived.

My idea is to capture the colloquial meaning of "GC language". The original connotation is to capture languages like C#, Java, JS, etc., that come with a (more or less) non-removable tracing garbage collector. In practice, what this term means is:

- How hard is it to remove and/or not rely on GC? Defaults matter a lot.

- How heavy is the garbage collection? Is it just RC or ARC?

- How much of the ecosystem depends on GC?

And finally, how many people are likely to agree with it? I don't care if my name for it is closest to the frequency of red, if no one else agrees.

xigoi•2mo ago
By your definition, is Nim a GC language?
Ygg2•2mo ago
So by that definition, yes: as of Nim 2.0, ORC is the default. You need to opt out of it.
cb321•2mo ago
I'm not sure this opt-in/out "philosophical razor" is as sharp as one would like it to be. I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.

For example, in Nim, at the compiler CLI level, there is opt-in/opt-out via the `--mm=whatever` flag, but, at the syntax level, Nim has both `ref T` and `ptr T` on equal syntactic footing. But then in the stdlib, `ref` types (really things derived from `seq[T]`) are used much more (since it's so convenient). Meanwhile, runtimes are often deployment properties. If every Linux distro had its libc link against -lgc for Boehm, people might say "C is a GC'd language on Linux". Minimal CRTs vary across userspaces and OS kernel/userspace deployments... "What you can rely on/assume", which I suspect is the thrust behind "optionality", just varies with context.

Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happen all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented"). That doesn't even bring in "use by common dependencies" which is an almost independent axis/dimension and starts to relate to coordination problems of "What should even be in a 'std'-lib or any lib, anyway?".

I suspect this rule is trying to make the adjective "GC'd" do more work in an absolute sense than it realistically can given the diversity of PLangs (sometimes not so visible considering only workaday corporate PLangs). It's not always easy to define things!

Ygg2•2mo ago
> I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.

I think optionality is what gives that definition weight.

Think of it this way. You come to a project like a game engine, but you find it's written in some language and discover that for your usage you need no/minimal GC. How hard is it to minimize or remove the GC? Assume that changing build flags will also cause problems elsewhere due to behavior changes.

> Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happen all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented")

Vagueness is an intrinsic quality of human language. You can't escape it.

The logic is fuzzy, but going around saying stuff like "Rust is a GC language" because it has an optional, rarely used Arc/Rc is just an off-the-charts level of wrong.

xigoi•2mo ago
Do you consider ORC a tracing GC, even though it only uses tracing for cyclic types?
aw1621107•2mo ago
If Rust is a GC language because of Rc/Arc, then C++ is a GC language because of std::shared_ptr, right?
9rx•2mo ago
Absolutely.
aw1621107•2mo ago
Interesting... To poke at that definition a bit more:

Would that also mean that C++ only became a GC language with the release of C++11? Or would C++98 also be considered a GC language due to auto_ptr?

Are no_std Rust and freestanding C++ GC languages?

Does the existence of Fil-C change anything about whether C is a GC language?

9rx•2mo ago
> Or would C++98 also be considered a GC language due to auto_ptr?

auto_ptr does not exhibit qualities of GC.

> Are no_std Rust and freestanding C++ GC languages?

These are not different languages. A developer opting out of certain features as enabled by the language does not change the language. If one was specific and said "no_std Rust", then you could fairly say that GC is not available, but that isn't applicable here. We are talking about "Rust" as written alone.

> Does the existence of Fil-C change anything about whether C is a GC language?

No. While Fil-C is heavily inspired by C, to the point that it is hard to distinguish between them, it is its own language. You can easily tell they are different languages as not all valid C is valid Fil-C.

aw1621107•2mo ago
> auto_ptr does not exhibit qualities of GC.

OK, so by the definition you're using C++ became a GC language with C++11?

> If one was specific and said "no_std Rust", then you could fairly say that GC is not available, but that isn't applicable here.

I'd imagine whether or not GC capabilities are available in the stdlib is pretty uncontroversial. Is that the criteria you're using for a GC language?

> No. While Fil-C is heavily inspired by C, to the point that it is hard to distinguish between them, it is its own language. You can easily tell they are different languages as not all valid C is valid Fil-C.

OK, fair; perhaps I should have been more abstract, since the precise implementation isn't particularly important to the point I'm trying to get to. I should have asked whether the existence of a fully-compatible garbage collecting implementation of C change anything about whether C is a GC language. Maybe Boehm with -DREDIRECT_MALLOC might have been a better example?

9rx•2mo ago
> I should have asked whether the existence of a fully-compatible garbage collecting implementation of C change anything about whether C is a GC language.

In a similar vein, tinygo allows compilation without GC[1]. That is despite the Go spec explicitly defining it as having GC. Is Go a GC language or not?

As you can see, if it were up to implementation, a GC/non-GC divide could not exist. But it does — we're talking about it. The answer then, of course, is that specification is what is significant. Go is a GC language even if there isn't a GC in implementation. C is not a GC language even if there is one in implementation. If someone creates a new Rust implementation that leaves out Rc and Arc, it would still be a GC language as the specification indicates the presence of it.

[1] It doesn't yet quite have full feature parity so you could argue it is more like the Fil-C situation, but let's imagine that it is fully compatible in the same way you are suggesting here.

aw1621107•2mo ago
You make some good points, and after thinking some more I think I agree that the existence of GC/non-GC implementations is not determinative of whether a language is typically called a GC language.

After putting some more thought into this, I want to say where I diverge from your line of thinking is that I think whether a language spec offers GC capabilities is not sufficient on its own to classify a language as a "GC language"; it's the language's dependence on said GC capabilities (especially for "normal" use) that matters.

For example, while you can compile Go without a GC, the language generally depends on the presence of one for resource management to the point that a GC-less Go is going to be relatively restricted in what it can run. Same for Java, JavaScript, Python, etc. - GC-less implementations are possible, but not really reasonable for most usage.

C/C++/Rust, on the other hand, are quite different; it's quite reasonable, if not downright common, to write programs that don't use GC capabilities at all in those languages. Furthermore, removing std::shared_ptr/Rc/Arc from the corresponding stdlibs wouldn't pose a significant issue, since writing/importing a replacement is something those languages are pretty much designed to be capable of.

baranul•2mo ago
Good point. However, similar offshoot languages of Go do have them: Vlang (as sum types) and Odin (as discriminated unions). And per your comment, only Vlang has the optional GC (used by default) and the flexible memory management philosophy (you can turn the GC off without its stdlib depending on it).