frontpage.

Dexterous robotic hands: 2009 – 2014 – 2025

https://old.reddit.com/r/robotics/comments/1qp7z15/dexterous_robotic_hands_2009_2014_2025/
1•gmays•45s ago•0 comments

Interop 2025: A Year of Convergence

https://webkit.org/blog/17808/interop-2025-review/
1•ksec•10m ago•1 comments

JobArena – Human Intuition vs. Artificial Intelligence

https://www.jobarena.ai/
1•84634E1A607A•13m ago•0 comments

Concept Artists Say Generative AI References Only Make Their Jobs Harder

https://thisweekinvideogames.com/feature/concept-artists-in-games-say-generative-ai-references-on...
1•KittenInABox•17m ago•0 comments

Show HN: PaySentry – Open-source control plane for AI agent payments

https://github.com/mkmkkkkk/paysentry
1•mkyang•19m ago•0 comments

Show HN: Moli P2P – An ephemeral, serverless image gallery (Rust and WebRTC)

https://moli-green.is/
1•ShinyaKoyano•29m ago•0 comments

The Crumbling Workflow Moat: Aggregation Theory's Final Chapter

https://twitter.com/nicbstme/status/2019149771706102022
1•SubiculumCode•33m ago•0 comments

Pax Historia – User and AI powered gaming platform

https://www.ycombinator.com/launches/PMu-pax-historia-user-ai-powered-gaming-platform
2•Osiris30•34m ago•0 comments

Show HN: I built a RAG engine to search Singaporean laws

https://github.com/adityaprasad-sudo/Explore-Singapore
1•ambitious_potat•40m ago•0 comments

Scams, Fraud, and Fake Apps: How to Protect Your Money in a Mobile-First Economy

https://blog.afrowallet.co/en_GB/tiers-app/scams-fraud-and-fake-apps-in-africa
1•jonatask•40m ago•0 comments

Porting Doom to My WebAssembly VM

https://irreducible.io/blog/porting-doom-to-wasm/
1•irreducible•40m ago•0 comments

Cognitive Style and Visual Attention in Multimodal Museum Exhibitions

https://www.mdpi.com/2075-5309/15/16/2968
1•rbanffy•42m ago•0 comments

Full-Blown Cross-Assembler in a Bash Script

https://hackaday.com/2026/02/06/full-blown-cross-assembler-in-a-bash-script/
1•grajmanu•47m ago•0 comments

Logic Puzzles: Why the Liar Is the Helpful One

https://blog.szczepan.org/blog/knights-and-knaves/
1•wasabi991011•58m ago•0 comments

Optical Combs Help Radio Telescopes Work Together

https://hackaday.com/2026/02/03/optical-combs-help-radio-telescopes-work-together/
2•toomuchtodo•1h ago•1 comments

Show HN: Myanon – fast, deterministic MySQL dump anonymizer

https://github.com/ppomes/myanon
1•pierrepomes•1h ago•0 comments

The Tao of Programming

http://www.canonical.org/~kragen/tao-of-programming.html
2•alexjplant•1h ago•0 comments

Forcing Rust: How Big Tech Lobbied the Government into a Language Mandate

https://medium.com/@ognian.milanov/forcing-rust-how-big-tech-lobbied-the-government-into-a-langua...
3•akagusu•1h ago•0 comments

PanelBench: We evaluated Cursor's Visual Editor on 89 test cases. 43 fail

https://www.tryinspector.com/blog/code-first-design-tools
2•quentinrl•1h ago•2 comments

Can You Draw Every Flag in PowerPoint? (Part 2) [video]

https://www.youtube.com/watch?v=BztF7MODsKI
1•fgclue•1h ago•0 comments

Show HN: MCP-baepsae – MCP server for iOS Simulator automation

https://github.com/oozoofrog/mcp-baepsae
1•oozoofrog•1h ago•0 comments

Make Trust Irrelevant: A Gamer's Take on Agentic AI Safety

https://github.com/Deso-PK/make-trust-irrelevant
7•DesoPK•1h ago•4 comments

Show HN: Sem – Semantic diffs and patches for Git

https://ataraxy-labs.github.io/sem/
1•rs545837•1h ago•1 comments

Hello world does not compile

https://github.com/anthropics/claudes-c-compiler/issues/1
35•mfiguiere•1h ago•20 comments

Show HN: ZigZag – A Bubble Tea-Inspired TUI Framework for Zig

https://github.com/meszmate/zigzag
3•meszmate•1h ago•0 comments

Metaphor+Metonymy: "To love that well which thou must leave ere long"(Sonnet73)

https://www.huckgutman.com/blog-1/shakespeare-sonnet-73
1•gsf_emergency_6•1h ago•0 comments

Show HN: Django N+1 Queries Checker

https://github.com/richardhapb/django-check
1•richardhapb•1h ago•1 comments

Emacs-tramp-RPC: High-performance TRAMP back end using JSON-RPC instead of shell

https://github.com/ArthurHeymans/emacs-tramp-rpc
1•todsacerdoti•1h ago•0 comments

Protocol Validation with Affine MPST in Rust

https://hibanaworks.dev
1•o8vm•2h ago•1 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
5•gmays•2h ago•1 comments

The Medley Interlisp Project: Reviving a Historical Software System [pdf]

https://interlisp.org/documentation/young-ccece2025.pdf
118•pamoroso•7mo ago

Comments

phaer•7mo ago
They have an informative website and an online emulator at https://interlisp.org/software/access-online/
pfdietz•7mo ago
I love this quote (from Interlisp-D: Overview and Status):

"Interlisp is a very large software system and large software systems are not easy to construct. Interlisp-D has on the order of 17,000 lines of Lisp code, 6,000 lines of Bcpl, and 4,000 lines of microcode."

So large. :)

http://www.softwarepreservation.net/projects/LISP/interlisp-...

pmcjones•7mo ago
A better URL:

https://www.softwarepreservation.org/projects/LISP/interlisp...

zelphirkalt•7mo ago
Same thing could probably be some 200k or more LoC in enterprise Java.
pjmlp•7mo ago
Or Objective-C,

https://github.com/Quotation/LongestCocoa

Or Smalltalk or C++,

https://en.wikipedia.org/wiki/Design_Patterns

Or even C,

https://www.amazon.com/Modern-Structured-Analysis-Edward-You...

Point being, people like to blame Java while forgetting the history of enterprise architecture.

zelphirkalt•7mo ago
Smalltalk, though, is famous for having blocks (closures!), a very minimal syntax, and late binding. It had those for decades before Java got anything like them.

But I think it is also at least partially a culture thing: wanting to make every noun into a class and then going for three levels of design patterns, when a single function with appropriate input and output types would be sufficient, if the choice can be made at runtime. Then we run into issues with Java the language, which doesn't let you use the most elementary building block, a function. It forces you to put things into classes, and that makes people think they want objects rather than merely a static method, and then they complicate the design more and more on top of that. It's a culture of not being familiar with other languages that emphasize more basic building blocks than classes, a culture of "I can do everything with Java, why learn something else?". A similar culture exists in the C++ world and among C aficionados.
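
A rough sketch of what I mean (the class and method names are made up purely for illustration): even the smallest standalone "function" in Java has to be wrapped in a class as a static method.

    // Hypothetical example: conceptually this is just one free function,
    // but Java requires a class to hold it.
    public final class Pricing {
        private Pricing() {}  // never instantiated; the class is only a namespace

        // net price -> gross price
        public static double withTax(double net, double taxRate) {
            return net * (1.0 + taxRate);
        }
    }

    // Usage: Pricing.withTax(100.0, 0.25) evaluates to 125.0

In a language with free-standing functions that would be a one-line definition; in Java the wrapper class is mandatory, and that is exactly the nudge toward class-first design I am describing.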

pjmlp•7mo ago
Smalltalk is a pure OOP language; even the blocks you praise are objects.

Java isn't the only language that lacked closures; plenty of languages took their time getting them.

Everything else you say about Java applies equally well to Smalltalk. That is why famous books like Design Patterns exist: it was written about five years before the idea of a language like Java, and it mostly uses Smalltalk examples, with some C++ as well.

In the Smalltalk image world, everything is Smalltalk: the IDE, the platform, the OS. There isn't anything else.

Many Java frameworks like JUnit, and industry trends like XP and Agile, have their roots in Smalltalk consulting projects, using IDEs like VisualAge for Smalltalk.

J2EE started its life as an Objective-C framework at Sun, during their collaboration with NeXT, called Distributed Objects Everywhere.

In a similar vein, NeXT ported their WebObjects framework from Objective-C to Java, even before the Apple acquisition, with pretty much the same kind of abstraction ideas.

topspin•7mo ago
> So large. :)

Consider this: at any moment in history, the prevailing state of the art in computing and information systems is characterized as huge, very large, massive, etc. Some years later, the equivalent is a portable device with a battery, and we forgive and snicker at those naïve souls who had no idea at the time.

It's still true today. Whatever thing you have in mind: vast software systems running clouds, megawatt-powered NVidia GPU clusters, mighty LLMs... given 10-20 years, the equivalent will be an ad-subsidized toy you'll impulse-purchase on Black Friday.

Your instinct will be to reject this as absurd. Keep in mind that this is the same impulse experienced by those who came before us.

jshaqaw•7mo ago
Retro lisp machines are cool. Kudos to the team. Love it.

That said… we need the “lisp machine” of the future more than we need a recreation.

rjsw•7mo ago
What does a Lisp Machine of the future look like?

There is Mezzano [1] as well as the Interlisp project described in the linked paper and another project resurrecting the LMI software.

[1] https://github.com/froggey/Mezzano

imglorp•7mo ago
LMI here https://github.com/dseagrav/ld
amszmidt•7mo ago
Mostly dead. Current Lisp Machine shenanigans related to MIT/LMI are at https://tumbleweed.nu/lm-3 ...

Currently working on an accurate model of the MIT CADR in VHDL, and merging the various System source trees into one that should work for both the Lambda and the CADR.

diggan•7mo ago
> Currently working on an accurate model of the MIT CADR in VHDL

Sounds extremely interesting; any links/feeds where one could follow the progress?

The dream of running lisp on hardware made for lisp lives on, against all odds :)

amszmidt•7mo ago
Current work is at http://github.com/ams/cadr4

And of course .. https://tumbleweed.nu/lm-3 .

rjsw•7mo ago
Maybe try replacing the ALU with one written directly in Verilog; I suspect this will run a lot faster than building it up from 74181+74182 components.
amszmidt•7mo ago
From what I see -- that is not the case.

The current state is _very_ fast in simulation, to the point where writing a behavioral model of the '181/'182 is uninteresting right now (there are other things to figure out).

~100 microcode instructions take about 0.1 seconds to run.

rjsw•7mo ago
I was thinking more of a behavioral model of the whole ALU, just so that the FPGA tools can map it onto a collection of the smaller ALUs built into each slice.

What clock speed does your latest design synthesize at?

therealcamino•7mo ago
At the top of the readme it says "There will be no attempt at making this synthesizable (at this time)!".
rjsw•7mo ago
There was already a design of the CADR for FPGAs [1] that does synthesize (and boot); I don't know why amszmidt needed to start again from scratch, or whether his design is a modification of the earlier one.

A similar comment applies to lm-3. Maybe it is built on a fork of the previous repo; it is hard to tell.

[1] https://github.com/lisper/cpus-caddr

eadmund•7mo ago
> What does a Lisp Machine of the future look like?

Depends on what one means by that.

Dedicated hardware? I doubt that we’ll ever see that again, although of course I could be wrong.

A full OS? That’s more likely, but only just. If it had some way to run Windows, macOS or Linux programs (maybe just emulation?) then it might have a chance.

As a program? Arguably Emacs is a Lisp Machine for 2025.

Provocative question: would a modern Lisp Machine necessarily use Lisp? I think that it probably has to be a language like Lisp, Smalltalk, Forth or Tcl. It’s hard to put into words what these very different languages share that languages such as C, Java and Python lack, but I think that maybe it reduces down to elegant dynamism?

amszmidt•7mo ago
> Provocative question: would a modern Lisp Machine necessarily use Lisp?

Seeing that not even the "Original Gangster" Lisp Machine used Lisp ...

Both the Lambda and the CADR are RISCy machines with very little specific to Lisp (the CADR was designed specifically to just run generic VM instructions; one cool hack on the CADR was to run PDP-10 instructions).

By Emacs you definitely mean GNU Emacs -- there are other implementations of Emacs. To most people, what the Lisp Machine was (is?) was a full operating system with editor, compiler, debugger, and very easy access to all levels of the system. Lisp wasn't really the interesting thing; Smalltalk and Oberon share the same idea.

0xpgm•7mo ago
> Dedicated hardware? I doubt that we’ll ever see that again, although of course I could be wrong.

With specialized hardware now being built for AI, the emergence of languages like Mojo that take advantage of hardware architecture, and what I interpret as a renewed interest in FPGAs, perhaps specialized hardware is making a comeback.

If I understand computing history correctly, chip manufacturers like Intel optimized their chips for C compilers to take advantage of the economies of scale created by C/Unix popularity. This came at the cost of killing off Lisp/Smalltalk-specialized hardware that gave these high-level languages decent performance.

Alan Kay famously said that people who are serious about their software should make their own hardware.

chillpenguin•7mo ago
Smalltalk was the lisp machine of the future. Of course, now even Smalltalk is a thing of the past.
lproven•7mo ago
> we need the “lisp machine” of the future

Totally agree.

Here's my idea: stick a bunch of NVRAM DIMMs into a big server box, along with some ordinary SDRAM. So you get a machine where the first, say, 16GB of RAM is ordinary RAM, and the 512GB or 1TB of RAM above that in the memory map is persistent RAM. It keeps its contents when the machine is shut off.

That is it. No drives at all. No SSD. All its storage is directly in the CPU memory map.

Modify Interim or Mezzano to boot off a USB key into RAM and store a resume image in the PMEM part of the memory map, so you can suspend, turn off the power, and resume where you were when the power comes back.

https://github.com/froggey/Mezzano

https://github.com/mntmn/interim

Now try to crowbar SBCL into this, and as many libraries and frameworks as can be sucked in. All of Medley/Interlisp, and some kind of convertor so SBCL can run Interlisp.

You now have an x86-64 LispM, with a whole new architectural model: no files, no disks, no filesystem. It's all just RAM. Workspace at the bottom, disposable. OS and apps higher up where it's nonvolatile.

I fleshed this out a bit here:

https://archive.fosdem.org/2021/schedule/event/new_type_of_c...

And here...

https://www.theregister.com/2024/02/26/starting_over_rebooti...

kps•7mo ago
~~wavy lines~~ I've never used this in anger, but I owned a second-hand Xerox Daybreak for a while to play around with. Later, there was some freely available project (I've now forgotten which) that used Interlisp running on an emulator on a DEC Alpha, and so I added some minor bits to NetBSD's Ultrix compatibility.
rjsw•7mo ago
The Lisp Machine emulator that was tied to DEC Alpha was Symbolics OpenGenera.
kps•7mo ago
Made me look: Xerox LFG Grammar Writer's Workbench: https://web.archive.org/web/20170907021542/https://www2.parc...

Interlisp screenshot: https://web.archive.org/web/20160616231118/http://www2.parc....

Evidently the emulator was later ported to Linux as well.

pamoroso•7mo ago
Did you use LFG Grammar Writer's Workbench for linguistics work or research back then? Ron Kaplan is looking to revive the system and make it run on Medley Interlisp again.
kps•7mo ago
Personally, neither — I just used it as a way to get a copy of Interlisp to play with.