
Why ML Needs a New Programming Language

https://signalsandthreads.com/why-ml-needs-a-new-programming-language/
65•melodyogonna•2h ago

Comments

torginus•1h ago
I think Mojo's cool, and there's definitely a place for a modern applications programming language with C++-class(ish) performance, aka what Swift wanted to be but got trapped in the Apple ecosystem (Swift was designed by the same person as Mojo).

The strong AI focus seems to be a sign of the times, and not actually something that makes sense imo.

diggan•1h ago
> The strong AI focus seems to be a sign of the times, and not actually something that makes sense imo.

Are you sure about that? I think Mojo was always talked about as "the language for ML/AI", but I'm unsure if Mojo was announced before the current hype cycle; it must be 2-3 years old at this point, right?

torginus•1h ago
According to Wikipedia, it was announced in May 2023.
tomovo•1h ago
While I appreciate all his work on LLVM, Chris Lattner's Swift didn't work out so well for me, so I'm cautious about this.

Swift has some nice features. However, the super slow compilation times and cryptic error messages really erase any gains in productivity for me.

- "The compiler is unable to type-check this expression in reasonable time?" On an M3 Pro? What the hell!?

- To find an error in SwiftUI code I sometimes need to comment everything out block by block to narrow it down and find the culprit. We're getting laughs from Kotlin devs.

elpakal•33m ago
To be fair to Chris, I've only seen the message about the compiler not being able to type-check the expression in SwiftUI closure hell. I think he left (maybe partly) because of the SwiftUI influence on Swift.
fnands•1h ago
> The strong AI focus seems to be a sign of the times, and not actually something that makes sense imo.

It has been Mojo's explicit goal from the start. It has its roots in the time that Chris Lattner spent at Google working on the compiler stack for TPUs.

It was explicitly designed to be Python-like because that is where (almost) all the ML/AI is happening.

_aavaa_•1h ago
Yeah, except Mojo’s license is a non-starter.
auggierose•1h ago
Wow, just checked it out, and they distinguish (for commercial purposes) between CPU & Nvidia on one hand, and other "accelerators" (like TPU or AMD) on the other hand. For other accelerators you need to contact them for a license.

https://www.modular.com/blog/a-new-simpler-license-for-max-a...

_aavaa_•1h ago
Yes; in particular see sections 2-4 of [0].

They say they'll open-source it in 2026 [1]. But until that happens, I'm operating under the assumption that it won't.

[0]: https://www.modular.com/legal/community

[1]: https://docs.modular.com/mojo/faq/#will-mojo-be-open-sourced

actionfromafar•56m ago
Same
Cynddl•1h ago
Does anyone know what Mojo is doing that Julia cannot do? I appreciate that Julia is currently limited by its ecosystem (although it does interface nicely with Python), but I don't see how Mojo is any better, then.
jakobnissen•1h ago
Mojo to me looks significantly lower level, with a much higher degree of control.

Also, it appears to be more robust. Julia is notoriously fickle in both semantics and performance, making it unsuitable for foundational software the way Mojo strives for.

thetwentyone•1h ago
Especially because Julia has pretty user-friendly and robust GPU capabilities such as JuliaGPU[1] and Reactant[2], among other generic-Julia-code-to-GPU options.

1: https://juliagpu.org/ 2: https://enzymead.github.io/Reactant.jl/dev/

jb1991•1h ago
I get the impression that most of the comments in this thread don't understand what a GPU kernel is. High-level languages like Python and Julia are not running on the GPU themselves; they are calling into kernels usually written in C++. The goal is different with Mojo, as it says at the top of the article:

> write state of the art kernels

You don't write kernels in Julia.
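For readers unfamiliar with the kernel model the parent describes, here is a minimal CPU-only sketch in plain Python (illustrative names, no GPU libraries): a kernel is just a scalar function that each GPU thread would run for one index, and a "launch" maps it over an index grid.

```python
# CPU-only sketch of the GPU-kernel programming model. On a real GPU,
# saxpy_kernel's body runs once per thread; here we emulate the thread
# grid with an ordinary loop.
def saxpy_kernel(i, a, x, y, out):
    # The work one GPU thread would do for index i.
    out[i] = a * x[i] + y[i]

def launch(kernel, n, *args):
    # Stand-in for a GPU launch over an n-thread grid.
    for i in range(n):
        kernel(i, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * 3
launch(saxpy_kernel, 3, 2.0, x, y, out)
print(out)  # [12.0, 24.0, 36.0]
```

The point of kernel languages (CUDA C++, Triton, Mojo) is that the body of `saxpy_kernel` must compile down to bare machine code with no interpreter or garbage collector in the loop.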

jakobnissen•1h ago
I'm pretty sure Julia does JIT compilation of pure Julia to the GPU: https://github.com/JuliaGPU/GPUCompiler.jl
actionfromafar•1h ago
"you should use one of the packages that builds on GPUCompiler.jl, such as CUDA.jl, AMDGPU.jl, Metal.jl, oneAPI.jl, or OpenCL.jl"

Not sure how that organization compares to Mojo.

ssfrr•1h ago
It doesn't make sense to lump Python and Julia together in this high-level/low-level split. Julia is like Python if Numba were built in: your code gets JIT-compiled to native code, so you can (for example) write for loops to process an array without the interpreter overhead you get with Python.

People have used the same infrastructure to allow you to compile Julia code (with restrictions) into GPU kernels
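The interpreter-overhead point is visible in stock Python itself: a hand-written loop pays bytecode dispatch per element, while a single C-level call does not. That per-iteration cost is what a JIT (Numba, or Julia's compiler) removes. A small self-contained sketch:

```python
# Compare an interpreted Python loop against the C-implemented sum().
# Both compute the same result; the loop pays interpreter overhead on
# every iteration.
import timeit

data = list(range(100_000))

def py_sum(xs):
    total = 0
    for x in xs:  # one bytecode-dispatch round trip per element
        total += x
    return total

assert py_sum(data) == sum(data)  # same answer, very different cost

t_loop = timeit.timeit(lambda: py_sum(data), number=20)
t_c = timeit.timeit(lambda: sum(data), number=20)
print(f"interpreted loop: {t_loop:.3f}s  C-level sum(): {t_c:.3f}s")
```

On CPython the loop is typically one to two orders of magnitude slower; in Julia (or Numba-decorated Python) the equivalent loop compiles to native code and the gap disappears.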

arbitrandomuser•46m ago
>You don't write kernels in Julia.

The package https://github.com/JuliaGPU/KernelAbstractions.jl was specifically designed so that Julia can be compiled down to kernels.

Julia is high-level, yes, but its semantics allow it to be compiled down to machine code without a runtime interpreter. This is a core differentiating feature from Python. Julia can be used to write GPU kernels.

pjmlp•31m ago
See the new cuTile architecture for CUDA, designed from the ground up with Python in mind.
adgjlsfhk1•9m ago
Julia's GPU stack doesn't compile to C++. It compiles Julia straight to GPU assembly.
Alexander-Barth•1h ago
I guess that the interoperability with Python is a bit better. But on the other hand, PythonCall.jl (which allows calling Python from Julia) is quite good and stable. In Julia, you have quite good ML frameworks (Lux.jl and Flux.jl). I am not sure that you have Mojo-native ML frameworks which are similarly usable.
jb1991•1h ago
Isn't Mojo designed for writing kernels? That's what it says at the top of the article:

> write state of the art kernels

Julia and Python are high-level languages that call other languages where the kernels exist.

Sukera•1h ago
No, you can write the kernels directly in Julia using KernelAbstractions.jl [1].

[1] https://juliagpu.github.io/KernelAbstractions.jl/stable/

hansvm•45m ago
[0] https://danluu.com/julialang/
ubj•43m ago
> Does anyone know what Mojo is doing that Julia cannot do?

First-class support for AoT compilation.

https://docs.modular.com/mojo/cli/build

Yes, Julia has a few options for making executables but they feel like an afterthought.

MohamedMabrouk•35m ago
* Compiling arbitrary Julia code into a native standalone binary (a la Rust/C++) with all its consequences.
nromiun•1h ago
Weird that there has been no significant adoption of Mojo. It has been quite some time since it was released, and everyone is still using PyTorch. Maybe the license issue is a much bigger deal than people realize.
jb1991•1h ago
It says at the top:

> write state of the art kernels

Mojo seems to be competing with C++ for writing kernels. PyTorch and Julia are high-level languages where you don't write the kernels.

Alexander-Barth•1h ago
Actually, in Julia you can write kernels with a subset of the language:

https://cuda.juliagpu.org/stable/tutorials/introduction/#Wri...

With KernelAbstractions.jl you can actually target CUDA and ROCm:

https://juliagpu.github.io/KernelAbstractions.jl/stable/kern...

For python (or rather python-like), there is also triton (and probably others):

https://pytorch.org/blog/triton-kernel-compilation-stages/

jakobnissen•1h ago
I think Julia aspires to be performant enough that you can write the kernels in Julia, so Julia is more like Mojo + Python together.

Although I have my doubts that Julia is actually willing to make the compromises which would allow Julia to go that low level. I.e. semantic guarantees about allocations and inference, guarantees about certain optimizations, and more.

pjmlp•1h ago
You can write kernels in Python using the CUDA and oneAPI SDKs in 2025; that is one of the adoption problems regarding Mojo.
fnands•1h ago
It's still very much in a beta stage, so it's a little hard to use yet.

Mojo is effectively an internal tool that Modular have released publicly.

I'd be surprised to see any serious adoption until a 1.0 state is reached.

But as the other commenter said, it's not really competing with PyTorch; it's competing with CUDA.

pjmlp•1h ago
I personally think they overshot themselves.

First of all, some people really like Julia; regardless of how it gets discussed on HN, its commercial use has been steadily growing, and it has GPGPU support.

On the other hand, regardless of the sorry state of JIT compilers on the CPU side for Python, at least Nvidia and Intel are quite serious about Python DSLs for GPGPU programming on CUDA and oneAPI, so one gets close enough to C++ performance while staying in Python.

So Mojo isn't that appealing in the end.

nickpsecurity•8m ago
Here are some benefits it might try to offer as differentiators:

1. Easy packaging into one executable. Then, making sure that can be reproducible across versions. Getting code from prior AI papers to run can be hard.

2. Predictability vs. the Python runtime. Think concurrent, low-latency GCs or low/zero-overhead abstractions.

3. Metaprogramming. There have been macro proposals for Python. Mojo could borrow from D or Rust here.

4. Extensibility in a way where extensions don't get too tied into the internal state of Mojo like they do Python. I've considered Python to C++, Rust, or parallelized Python schemes many times. The extension interplay is harder to deal with than either Python or C++ itself.

5. Write once, run anywhere, to effortlessly move code across different accelerators. Several frameworks are doing this.

6. Heterogeneous, hot-swappable, vendor-neutral acceleration. That's what I'm calling it when you can use the same code in a cluster with a combination of Nvidia GPUs, AMD GPUs, Gaudi3s, NPUs, SIMD chips, etc.
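On point 3: what Python offers today is runtime metaprogramming (decorators, metaclasses), not the compile-time macros of D or Rust. A sketch of the distinction, using a standard memoizing decorator whose bookkeeping happens on every call, where a macro system could bake the transformation in at compile time:

```python
# Runtime metaprogramming in today's Python: a decorator wraps the
# function and does its cache lookup at call time, every time. A
# compile-time macro (D/Rust-style) would instead rewrite the code
# before it runs, with no per-call overhead.
import functools

def memoize(fn):
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args):
        if args not in cache:        # runtime check on every call
            cache[args] = fn(*args)
        return cache[args]
    return wrapper

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, computed in linear rather than exponential time
```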

pansa2•36m ago
Sounds to me like it's very incomplete:

> maybe a year, 18 months from now [...] we’ll add classes

singularity2001•6m ago
Is it really released? Last time I checked, it was not open-sourced. I don't want to rely on some proprietary vaporware stack.
melodyogonna•2m ago
It is not ready for general-purpose programming. Modular itself tried offering a Mojo API for their MAX engine, but had to give up because the language still evolved too rapidly for such an investment.

As per the roadmap[1], I expect to start seeing more adoption once phase 1 is completed.

1. https://docs.modular.com/mojo/roadmap

blizdiddy•1h ago
Mojo is the enshittification of programming. Learning a language is too much cognitive investment for VC rugpulls. You make the entire compiler and runtime GPL or you pound sand; that has been the bar for decades. If the new cohort of programmers can't hold the line, we'll all suffer.
pjmlp•1h ago
For decades, paying for compiler tools was a thing.
blizdiddy•52m ago
I'd prefer not to touch a hot stove twice. Telling me what processors I can use is Oracle-level rent-seeking, and it should be mocked just like Oracle.
pjmlp•34m ago
I am quite sure Larry thinks very fondly of such folks while vacationing on his yacht or paying the bills to land the private jet outside airport opening times.
analog31•49m ago
True, but aren't we in a better place now? I think the move to free tools was motivated by programmers, and not by their employers. I've read that it became hard to hire people if you used proprietary tools. Even the great Microsoft open-sourced their flagship C# language. And it's ironic but telling that the developers of proprietary software don't trust proprietary tools. And every developer looks at the state of the art in proprietary engineering tooling, such as CAD, and retches a little bit. I've seen many comments on HN along those lines.

And "correlation is not causality," but the occupation with the most vibrant job market until recently was also the one that used free tools. Non-developers like myself looked to that trend and jumped on the bandwagon when we could. I'm doing things with Python that I can't do with Matlab because Python is free.

Interestingly, we may be going back to proprietary tools, if our IDEs become a "terminal" for the AI coding agents, paid for by our employers.

pjmlp•36m ago
Not really, as many devs rediscover public domain, shareware, demos and open core, because it turns out there are bills to pay.

If you want the full C# experience, you will still be getting Windows, Visual Studio, or Rider.

VS Code C# support is under the same license as Visual Studio Community, and lacks several tools, like the advanced graphical debugging for parallel code and code profiling.

The great Microsoft has not open-sourced that debugger, nor many other tools in the .NET ecosystem. Also, they can afford to subsidise C# development as a gateway into Azure, being valued at 4 trillion, the 2nd biggest company in the world.

threeducks•56m ago
When I was young, I enjoyed messing around with new languages, but as time went on, I realized that there is really very little to be gained through new languages that cannot be obtained through a new library, without the massive downside of throwing away most of the ecosystem due to incompatibility. Also, CuPy, Triton, and Numba already exist right now and are somewhat mature, at least compared to Mojo.
dwattttt•54m ago
If learning a new language didn't change how you think about programming, it wasn't a language worth learning.
jakobnissen•48m ago
Usually people create languages to address issues that cannot be addressed by a library because they have different semantics on a deeper level.

Like, Rust could not be a C++ library, that does not make sense. Zig could not be a C library. Julia could not be a Python library.

There is some superficial level of abstraction where all programming languages do is interchangeable computation and therefore everything can be achieved in every language. But that superficial sameness doesn't correspond to the reality of programming.

ActionHank•38m ago
Would love to know which languages you learned that were so similar that you didn't gain much.

Just comparing, for example, C++, C#, and TypeScript: these are all C-like and have heavy MS influence, and despite that they all have deeply different fundamentals, concepts, use cases, and goals.

frou_dh•35m ago
Listening to this episode, I was quite surprised to hear that even now, in September 2025, support for classes at all is considered a medium-term goal. The "superset of Python" angle was thrown around a lot in earlier discussions of Mojo 1-2 years ago, but at this rate of progress it seems a bit of a pie-in-the-sky aspiration.
adgjlsfhk1•4m ago
Superset of Python was never a goal. It was a talking point to try to build momentum, quietly dropped once it served its purpose of getting Mojo some early attention.
Shorel•32m ago
We have C++ :)
mitch_said•22m ago
Since it's not especially fun to read a transcript, here's a top-down mind map summarizing the conversation: https://app.gwriter.io/#/mindmap/view/5b86681e-a398-4988-8cf...
postflopclarity•21m ago
Julia could be a great language for ML. It needs more mindshare and developer attention though
singularity2001•3m ago
What's the current state of time-to-first-plot and executable size? Last time, it was several seconds to get a 200 MB hello world. I'm sure they are moving in the right direction; the only question is, are they there yet?
numbers_guy•1m ago
What makes Julia "great" for ML?
MontyCarloHall•1m ago
The reason why Python dominates is that modern ML applications don't exist in a vacuum. They aren't the standalone C/FORTRAN/MATLAB scripts of yore that load in some simple, homogeneous data, crunch some numbers, and spit out a single result. Rather, they are complex applications with functionality extending far beyond the number crunching, which requires a robust preexisting software ecosystem.

For example, a modern ML application might need an ETL pipeline to load and harmonize data of various types (text, images, video, etc., all in different formats) from various sources (local filesystem, cloud storage, HTTP, etc.) The actual computation then must leverage many different high-level functionalities, e.g. signal/image processing, optimization, statistics, etc. All of this computation might be too big for one machine, and so the application must dispatch jobs to a compute cluster or cloud. Finally, the end results might require sophisticated visualization and organization, with a GUI and database.

There is no single language with a rich enough ecosystem that can provide literally all of the aforementioned functionality besides Python. Python's C/C++ FFIs (e.g. Python.h, NumPy C integration, PyTorch/Boost C++ integration) are not perfect, but are good enough that implementing the performance-critical portions of code in C/C++ is much easier compared to re-implementing entire ecosystems of packages in another language like Julia.
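As a concrete illustration of the FFI point, here is a minimal sketch using only the standard library's ctypes to call into the system C math library. Library lookup is platform-dependent; this assumes a Unix-like system, with a glibc fallback name as a hedge.

```python
# Call a C function from Python via ctypes, the simplest of the C FFIs
# the parent comment mentions. Library name resolution varies by OS,
# so we fall back to the common glibc soname if lookup fails.
import ctypes
import ctypes.util

libm_name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_name)

# Declare the C signature: double sqrt(double)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # ~1.4142135623730951
```

Higher-level FFIs (NumPy's C API, pybind11, Cython) build on the same idea: declare the C signature, marshal arguments, and let the heavy lifting run as native code.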