frontpage.

I hacked a dating app (and how not to treat a security researcher)

https://alexschapiro.com/blog/security/vulnerability/2025/04/21/startups-need-to-take-security-seriously
294•bearsyankees•2h ago•154 comments

Embeddings Are Underrated

https://technicalwriting.dev/ml/embeddings/overview.html
267•jxmorris12•4h ago•86 comments

The Barbican

https://arslan.io/2025/05/12/barbican-estate/
189•farslan•3h ago•62 comments

RIP Usenix ATC

https://bcantrill.dtrace.org/2025/05/11/rip-usenix-atc/
62•joecobb•2h ago•10 comments

HealthBench

https://openai.com/index/healthbench/
28•mfiguiere•1h ago•10 comments

Launch HN: ParaQuery (YC X25) – GPU Accelerated Spark/SQL

62•winwang•3h ago•18 comments

A community-led fork of Organic Maps

https://www.comaps.app/news/2025-05-12/3/
224•maelito•7h ago•146 comments

Byte Latent Transformer: Patches Scale Better Than Tokens

https://arxiv.org/abs/2412.09871
26•dlojudice•2h ago•8 comments

Show HN: Airweave – Let agents search any app

https://github.com/airweave-ai/airweave
64•lennertjansen•3h ago•20 comments

Legion Health (YC S21) Is Hiring Founding Engineers to Fix Mental Health with AI

https://www.workatastartup.com/jobs/75011
1•the_danny_g•2h ago

Ruby 3.5 Feature: Namespace on read

https://bugs.ruby-lang.org/issues/21311
119•ksec•5h ago•59 comments

5 Steps to N-Body Simulation

https://alvinng4.github.io/grav_sim/5_steps_to_n_body_simulation/
13•dargscisyhp•2d ago•0 comments

Demonstrably Secure Software Supply Chains with Nix

https://nixcademy.com/posts/secure-supply-chain-with-nix/
44•todsacerdoti•4h ago•9 comments

Why GADTs matter for performance (2015)

https://blog.janestreet.com/why-gadts-matter-for-performance/
22•hyperbrainer•2d ago•6 comments

Reviving a Modular Cargo Bike Design from the 1930s

https://www.core77.com/posts/136773/Reviving-a-Modular-Cargo-Bike-Design-from-the-1930s
77•surprisetalk•4h ago•67 comments

Tailscale 4via6 – Connect Edge Deployments at Scale

https://tailscale.com/blog/4via6-connectivity-to-edge-devices
56•tiernano•5h ago•17 comments

University of Texas-led team solves a big problem for fusion energy

https://news.utexas.edu/2025/05/05/university-of-texas-led-team-solves-a-big-problem-for-fusion-energy/
168•signa11•6h ago•122 comments

Universe expected to decay in 10⁷⁸ years, much sooner than previously thought

https://phys.org/news/2025-05-universe-decay-years-sooner-previously.html
111•pseudolus•9h ago•155 comments

Continuous glucose monitors reveal variable glucose responses to the same meals

https://examine.com/research-feed/study/1jjKq1/
94•Matrixik•2d ago•54 comments

Spade Hardware Description Language

https://spade-lang.org/
83•spmcl•6h ago•37 comments

How to title your blog post or whatever

https://dynomight.net/titles/
11•cantaloupe•2h ago•1 comments

Show HN: CLI that spots fake GitHub stars, risky dependencies and licence traps

https://github.com/m-ahmed-elbeskeri/Starguard
59•artski•6h ago•36 comments

I ruined my vacation by reverse engineering WSC

https://blog.es3n1n.eu/posts/how-i-ruined-my-vacation/
311•todsacerdoti•15h ago•157 comments

The Internet 1997 – 2021

https://www.opte.org/the-internet
12•smusamashah•2h ago•1 comments

Show HN: The missing inbox for GitHub pull requests

https://github.com/pvcnt/mergeable
5•pvcnt•1h ago•0 comments

OpenEoX to Standardize End-of-Life (EOL) and End-of-Support (EOS) Information

https://openeox.org/
19•feldrim•4h ago•13 comments

A Typical Workday at a Japanese Hardware Tool Store [video]

https://www.youtube.com/watch?v=A98jyfB5mws
98•Erikun•2d ago•38 comments

The FTC puts off enforcing its 'click-to-cancel' rule

https://www.theverge.com/news/664730/ftc-delay-click-to-cancel-rule
245•speckx•5h ago•141 comments

Optimizing My Hacker News Experience

https://reorientinglife.substack.com/p/optimizing-my-hacker-news-experience
37•fiveleavesleft•4d ago•18 comments

Ash (Almquist Shell) Variants

https://www.in-ulm.de/~mascheck/various/ash/
63•thefilmore•2d ago•3 comments

Spade Hardware Description Language

https://spade-lang.org/
83•spmcl•6h ago

Comments

ajross•5h ago
Haven't looked at this one, but IMHO HDLs are sort of the ultimate existence proof that "DSLs Are a Bad Design Smell".

DSLs are great and elegant and beautiful for expressing a domain solution. Once. But real solutions evolve, and as they do they get messy, and when that happens you need to address that with software tools. And DSLs are, intentionally, inadequate to the tasks of large-scale software design. So they add features[1], and we get stuff like SystemC that looks almost like real programming. Except that it's walled off from updates in the broader community, so no tools like MSAN or whatnot, and you're working from decades-stale language standards, and everything is proprietary...

Honestly my sense is that it's just time to rip the bandaid off and generate synthesizable hardware from Python or Rust or whatnot. More syntax isn't what's needed.

[1] In what can be seen as an inevitable corollary of Greenspun's Tenth Rule, I guess.

leonheld•5h ago
> Honestly my sense is that it's just time to rip the bandaid off and generate synthesizable hardware from Python or Rust or whatnot.

I worked a bit with VHDL, and the parallelism aspect is - to me - so fundamentally different from what our sequential programming languages can express that I'm not sure I'd want a layer of abstraction between this and that. How would that work?

ajross•5h ago
You mean parallelism for simulation? Generate a simulator output from your input (in VHDL if you like) and run it in an appropriate runtime.

You don't need to run Python/whatever to simulate and you don't need (and probably don't want) your semantic constraints and checks to be expressed in python/whatever syntax. But the process of moving from a parametrized design through the inevitable cross-team-design-madness and decade-stale-design-mistake-workarounds needs to be managed in a development environment that can handle it.

nsteel•3h ago
I don't think this is about simulation. Python requires an additional DSL layer in order to express parallelism. I've personally no interest in learning that, or anything like that stuck on top of some other language that's similarly unfit for HDL purposes.

Modern VHDL isn't too far off what we need. I'd rather see more improvements to that. But most crucially, we need tooling that actually supports the improvements and new features. We don't have that today, it's an absolute mess trying to use VHDL '19 with the industry's standard tools. We even avoid using '08 for fear of issues. I can't speak to how far off SV is.

oasisaimlessly•5h ago
See e.g. Migen [1], a Python HDL.

TL;DR: The hardware modules you're generating are represented as first-class objects that can be constructed from a DSL embedded within Python or explicitly from a list of primitives.

[1]: https://m-labs.hk/gateware/migen/
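(Purely as an illustration of that embedded-DSL style, a minimal Migen sketch; the module and signal names here are made up and the conversion call follows the standard Migen flow, so treat it as a sketch rather than anything from the project itself:)

    from migen import *
    from migen.fhdl.verilog import convert

    class Blinker(Module):
        # Divide the system clock down and drive an LED from the top bit.
        def __init__(self, width=26):
            self.led = Signal()
            counter = Signal(width)
            # Synchronous statement: the counter increments every clock edge.
            self.sync += counter.eq(counter + 1)
            # Combinational statement: the LED mirrors the counter's MSB.
            self.comb += self.led.eq(counter[width - 1])

    # Emit synthesizable Verilog for the design, with 'led' as a top-level port.
    m = Blinker()
    print(convert(m, ios={m.led}))

The point being that modules are ordinary Python objects, so loops, parameters, and helper functions compose the design before any Verilog exists.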

duped•3h ago
I would argue that the prevalence of HDLs proves that DSLs are a good design for problem domains that scale in complexity. The alternative is point and click CAD, which has a ceiling on the scale of complexity you can reach.

> Honestly my sense is that it's just time to rip the bandaid off and generate synthesizable hardware from Python or Rust or whatnot. More syntax isn't what's needed.

People who think the problem is that they can't synthesize a program in hardware from something like Python completely misunderstand the purpose of an HDL. You do not write a program and press a button to get that program on hardware. You write a program to generate the design and verify its correctness. It is much less like writing an imperative or functional program and more like writing macros or code generation.

Now if you want to write a Python library for generating the underlying data for programming an FPGA or taping out circuits, that's actually a good idea that people have tried out - the problem you run into, though, is network effects. Generating designs is easy; verifying and debugging them is very hard. All the money is in the tooling, and that tooling speaks HDLs.

adwn•2h ago
> I would argue that the prevalence of HDLs proves that DSLs are a good design for problem domains that scale in complexity.

The major HDLs (i.e., Verilog/SystemVerilog and VHDL) are not DSLs in any meaningful sense of the word. There exist HDLs which actually are DSLs, but they're mostly used by hobbyists and aren't gaining any significant traction in the industry.

duped•2h ago
HDLs are the textbook definition of a domain specific language (the domain being the description of hardware, either its behavior or its design or both).

monocasa•1h ago
That's like saying that software languages are all DSLs, the domain being the description of von Neumann style software.

ajross•22m ago
Lots of "DSLs" are general purpose Turing-complete environments. What distinguishes them is their specific features that target a particular usage, usually just limited to syntax that directly reflects the domain in question.

But my point upthread is that even though these are "general purpose", they're still extremely limited in Practical Expressive Power for Large Scale Development, simply by being weird things that most people don't learn.

Python and Rust and even C++ projects can draw on decades of community experience and best practices and tools and tutorials that tell you how to get stuff done in their environments (and importantly how not to do things).

Literally the smartest people in software are trying to help you write Python et al... With e.g. SystemVerilog you're limited to whatever the yahoos at Synopsys thought was a good idea. It's not the same.

adwn•1h ago
Okay, I grant that HDLs fall under the wider definition of "domain specific language". I was thinking of the narrower definition, which is apparently more precisely called an "embedded DSL" – a language which is a specialization of a general purpose language, embedded or defined within a GPL.

ajross•2h ago
> more like writing macros or code generation

Tasks which are also best done by premier software development environments and not ad hoc copies of ideas from other areas.

Surac•5h ago
There are already Verilog- and VHDL-based tools available. I think no one likes to learn a third HDL. Including specific ideas in the HDL makes it less attractive to people.

RetroTechie•3h ago
Each HDL has its own strengths, weaknesses, tool support & application areas. So each has its place as long as -for specific designs/projects- it's better than alternative HDLs. Users can define "better" for themselves.

Not to mention: existing designs already done in Verilog, VHDL or whatever. Converting such a design from one HDL to another may not be easy.

So as always: use the best tool for the job.

adwn•5h ago
I'll take a closer look later, and I welcome anything that tries to bring concepts from modern programming languages to hardware design.

But. The focus on "CPU" examples on the landing page (A 3 stage CPU supporting Add, Sub, Set and Jump, "You can easily build an ALU") is immediately discouraging. I implement and verify FPGA designs for a living, and the vast, vast majority of my work is nothing like designing a CPU. So my fear is that this new hardware description language hasn't been created by a veteran who has many years of experience in using HDLs and therefore knows what and where the real pain points are, but by someone who's ultimately a software developer – even a highly skilled and motivated software developer – and who has never designed, implemented, and verified a large FPGA design. And skilled, motivated software developers without a lot of domain-specific experience tend to solve the wrong problems.

I would be happy to be proven wrong, though.

Philpax•5h ago
If you're curious why people keep wanting to reinvent HDLs, these posts by Dan Luu might be useful:

- https://danluu.com/why-hardware-development-is-hard/

- https://danluu.com/pl-troll/

tails4e•5h ago
SystemVerilog is a much better language than Verilog. You can get pretty strongly typed behaviour now, if you use the language parts that allow that. It's like C: if you use it poorly/the 'Verilog way' it's got some serious footguns, but you can use it quite safely if you use the more modern features.

That said, I'm all for better languages, if they really are better and as expressive.

wirybeige•4h ago
To add, even before SV we've had VHDL, which is also strongly typed and has other nice features. But I do still like using SV more than VHDL :3. I'm not wholly convinced by these languages that have been popping up so far.

Symmetry•5h ago
I wonder how it compares to Bluespec?

grquantum•4h ago
I think this commits the same sin many other new HDLs do -- it just tries to awkwardly smush the paradigm of clocked logic into a sequential software language. The abstractions just don't match, which means you lose the mental connection between the code and the generated Verilog, which makes debugging stuff like timing awkward.

I'm a big Bluespec booster, and beyond the nice typing and functional programming you get I think the big advance it brings to the table is the Guarded Atomic Action paradigm, which simplifies reasoning about what the code is doing and means that it's usually not too painful to poke at the generated HDL too since there's a clear connection between the two halves. At $WORK$ we've been using Bluespec very successfully in a small team to quickly iterate on hardware designs.

I don't want to denigrate the Spade developers since it's clearly a labor of love and nicely done, but I feel that unless the underlying mental model changes there's not much benefit to any of these neo-HDLs compared to SV or VHDL.

fooblaster•3h ago
Where do you work that uses bluespec? Any open positions? Apologies for being forward.
fjfaase•5h ago
I thought that this was about the hardware description language Clash developed by some ex-colleagues, but it appeared to be something else. Clash [1] is based on the functional programming language Haskell and it can output to VHDL, Verilog, or SystemVerilog.

Although the last official release mentioned on the website is from 2021, it is still actively developed on GitHub [2]. See also contranomy [3] for a non-pipelined RV32I RISC-V core written in Clash.

[1] https://clash-lang.org/

[2] https://github.com/clash-lang/clash-compiler

[3] https://github.com/christiaanb/contranomy

chrsw•5h ago
On the surface this seems like it strikes a nice balance between addressing issues with expressing digital design intent and not completely breaking the mental model digital designers are used to in traditional HDLs.
smallpipe•4h ago
If the output SystemVerilog is unreadable I'm unlikely to use this. SV is still the lingua franca for physical tools. I'm not debugging timing on something that looks like this:

    localparam[14:0] _e_953 = 0;
    localparam[14:0] _e_958 = 1;
    assign _e_956 = \count  + _e_958;
    assign _e_955 = _e_956[14:0];
    assign _e_948 = _e_949 ? _e_953 : _e_955;
alain94040•4h ago
Fair, but it's just a tooling issue. You don't debug your Verilog anymore at the gate-level, do you?

adwn•4h ago
> just a tooling issue

The word "just" is carrying the weight of the world on its shoulders…

TeMPOraL•3h ago
It's been said that the arc of history bends toward just-ice...

Pet_Ant•3h ago
Whenever I see that I think of the Itanium where it was destined to be a success, the compiler just needed to...

smallpipe•3h ago
I do for the most sensitive path of the design, but only because it's needed. I don't want to have to look at it for day-to-day debugging, where it's just a distraction.

variaga•3h ago
When debugging/fixing timing problems, or trying to implement a functional change using only metal layers - yes, it is absolutely still necessary to debug verilog at the gate level.

Also, "just a tooling issue" is a pretty big problem when you're talking about something that wants to be adopted as part of the toolchain.

js8•4h ago
I think any HDL that is more inspired by functional languages (with better composability) is good. But yeah, there is a lot of inertia in using existing tools.

kayson•3h ago
Another new HDL: https://veryl-lang.org/

It'll be a long while before either gets enough traction to be serious competition to SystemVerilog, even if SV is, compared to modern software languages, outdated.

raluk•3h ago
How does one write a circular circuit, for example a stream of Fibonacci numbers or an IIR filter? For an IIR filter it would be nice if it had a prototype like iir(sig : T, a : vec<T, N>, b : vec<T,M>) -> T
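(Not a Spade answer, but as an illustration of the registered-feedback pattern the question is about, here is roughly how the Fibonacci-stream half might look in Migen, which came up earlier in the thread; the names and widths are made up:)

    from migen import *

    class Fib(Module):
        # Emits the Fibonacci sequence, one value per clock cycle, using two
        # state registers whose next values feed back on each other.
        def __init__(self, width=32):
            self.out = Signal(width)           # current value (resets to 0)
            prev = Signal(width, reset=1)      # previous value, seeded to 1
            self.sync += [
                self.out.eq(self.out + prev),  # next = current + previous
                prev.eq(self.out),             # shift current into previous
            ]
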
Lramseyer•1h ago
Love to see this at the top of HN! I haven't written anything with this language yet, but I have met some of the developers of this language. They're pretty great and they are doing a lot of really good work in the open source hardware community. Another project they maintain is Surfer: https://surfer-project.org/

The challenge of an HDL over a regular sequential programming (software) language is that a software language is programmed in time, whereas an HDL is programmed in both space and time. As one HDL theory expert once told me, "Too many high level HDLs try to abstract out time, when what they really need to do is expose time."

etep•1h ago
Surfer deserves to hit the front page also. Much better than GTKWave. Nice work, Spade & Surfer!

polalavik•1h ago
If a new HDL doesn't have simulation capabilities baked in, it's next to useless. See: Hardcaml and Amaranth.

The hard part has never been writing HDL, it’s verifying HDL and making the verification as organized and as easy as possible. Teams spend something like 20% of time on design and 80%+ on verification allegedly (definitely true at my shop).

Edit: I see it’s tightly integrated with cocotb which is good. But someone needs to take a verification-first approach to writing a new language for HDL. It shouldn’t be a fun after thought, it’s a bulk of the job.

VonTum•6m ago
I've been wondering how long it'd take for it to show up here. I can attest to Frans (the lead dev) being a talented and highly active developer. It's frankly quite intimidating to be a competitor of his. (https://sus.rocks)

Hopefully one day we'll break open the hardware design ecosystem. Verilog & VHDL still being de-facto industry standard is pathetic. And IMO the only reason is the white-knuckle grip Intel (Altera again?) and Xilinx have over what languages are accepted by their respective proprietary design tools.