OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
529•klaussilveira•9h ago•146 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
859•xnx•15h ago•518 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
72•matheusalmeida•1d ago•13 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
180•isitcontent•9h ago•21 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
182•dmpetrov•10h ago•79 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
294•vecti•11h ago•130 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
69•quibono•4d ago•12 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
343•aktau•16h ago•168 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
338•ostacke•15h ago•90 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
434•todsacerdoti•17h ago•226 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
237•eljojo•12h ago•147 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
13•romes•4d ago•2 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
373•lstoll•16h ago•252 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
6•videotopia•3d ago•0 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
41•kmm•4d ago•3 comments

Show HN: ARM64 Android Dev Kit

https://github.com/denuoweb/ARM64-ADK
14•denuoweb•1d ago•2 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
220•i5heu•12h ago•162 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
91•SerCe•5h ago•75 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
62•phreda4•9h ago•11 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
162•limoce•3d ago•82 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
38•gfortaine•7h ago•10 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
127•vmatsiiako•14h ago•53 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
18•gmays•4h ago•2 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
261•surprisetalk•3d ago•35 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1029•cdrnsf•19h ago•428 comments

FORTH? Really!?

https://rescrv.net/w/2026/02/06/associative
55•rescrv•17h ago•18 comments

Show HN: Smooth CLI – Token-efficient browser for AI agents

https://docs.smooth.sh/cli/overview
83•antves•1d ago•60 comments

WebView performance significantly slower than PWA

https://issues.chromium.org/issues/40817676
18•denysonique•6h ago•2 comments

Zlob.h: 100% POSIX and glibc compatible globbing lib that is fast and better

https://github.com/dmtrKovalenko/zlob
5•neogoose•2h ago•1 comment

I'm going to cure my girlfriend's brain tumor

https://andrewjrod.substack.com/p/im-going-to-cure-my-girlfriends-brain
109•ray__•6h ago•54 comments

Does anyone remember websites?

http://tttthis.com/rememberwebsites.php/
44•lr0•3mo ago

Comments

snitzr•3mo ago
https://www.spacejam.com/1996/
devin•3mo ago
If this is true to the original, I am surprised to see this was table-oriented layout and not a bitmap image with clickable x,y coordinates.
hosh•3mo ago
Table-oriented layouts were a thing back then too.
zahlman•3mo ago
It was common to make tables and use them to assemble a bitmap, where each cell had zero border/margin/padding and an exact size, and contained a "slice" of the image. Web authoring tools (and Photoshop) even had explicit support for generating this sort of thing, as I recall. This was I guess simpler to automate than defining clickable regions of a single image, and it allowed for the individual pieces of the image to be requested in parallel on slow connections (adding another dimension of progressive loading).
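The slicing technique described above can be sketched in code. This is a hypothetical illustration (the image name, grid size, and cell dimensions are made up): each cell of a borderless table holds one slice of a larger image, which is how the authoring tools of the era emitted their output.

```python
def sliced_image_table(name, rows, cols, cell_w, cell_h):
    """Emit 1990s-style HTML: a zero-border, zero-spacing table whose
    cells each hold one slice of a larger image, named by grid position
    (e.g. logo_r0_c0.gif, logo_r0_c1.gif, ...)."""
    lines = ['<table border="0" cellspacing="0" cellpadding="0">']
    for r in range(rows):
        lines.append("<tr>")
        for c in range(cols):
            lines.append(
                f'<td><img src="{name}_r{r}_c{c}.gif" '
                f'width="{cell_w}" height="{cell_h}" border="0"></td>'
            )
        lines.append("</tr>")
    lines.append("</table>")
    return "\n".join(lines)

print(sliced_image_table("logo", 2, 3, 100, 50))
```

Because every cell is a separate `<img>`, each slice arrives as its own HTTP request — which is exactly the parallel-loading behavior the comment mentions.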
oxguy3•3mo ago
Ah, nothing like trying to save the logo from such a website, then discovering the image you saved is partially cut off and includes the navbar behind it instead of having a transparent background.
devin•3mo ago
Yeah, I remember this. Macromedia Fireworks had a slice tool that I used quite a bit. You'd basically make an image which was your website, and then do all the layout with zero-border tables. But for me, this was what I was doing circa 2004 before CSS was dominant. Earlier software from the '96 era like FrontPage I think would use bitmaps whole cloth, but maybe I'm misremembering.
ranger_danger•3mo ago
Looking at the code, it has definitely been modified from the original... there is now CSS as well as a google ad tracker... but visually it's probably almost exactly the same.
zahlman•3mo ago
> You’ve enabled HTTPS-Only Mode for enhanced security, and a HTTPS version of tttthis.com is not available.

This, too, is nostalgic, in a way.

mapontosevenths•3mo ago
It's silly to try and encrypt everything, and arguably it can be worse for the author's privacy.

Sometimes a blog post on a plain HTTP web site doesn't need to be encrypted.

bee_rider•3mo ago
What’s the argument that it can be worse for the author’s privacy?

In general, I think we should encrypt everything. The more encrypted stuff floating around, the less it stands out, and the better for everybody’s privacy. Of course, nowadays encrypted content is quite common. But it didn’t become that way without effort!

mapontosevenths•3mo ago
Thanks to Let's Encrypt it's now at least possible to get a valid certificate anonymously, but it's a pain that requires renewal every 60 to 90 days and puts you at their mercy.

If they decide they don't like your brand of free speech it's lights out and they are the only game in town.

Yes, I know you can automate renewal if you have shell access, but you'll probably have to remember to do it manually if you use shared hosting that doesn't provide a cert for you.

That's a lot of work, and a lot of risk, to secure a message that's meant to be publicly broadcast in the first place.

I imagine it to be a bit like encrypting OTA television. Sure, you could stop a pirate broadcast from impersonating your station by encrypting it, but that's not actually a threat model that applies to normal people most of the time, and it makes everything far more complex.

Can your ISP MITM you? Yep, and if they do you should cancel your service then sue them into the ground.

Tepix•3mo ago
If you don't encrypt everything, malicious actors like some ISPs can inject nasty things like pervasive tracking or zero-day exploits.
cpa•3mo ago
It helps against this kind of stuff (2015) https://blog.fox-it.com/2015/04/20/deep-dive-into-quantum-in...
simpaticoder•3mo ago
I disagree. The primary threat model for unencrypted HTTP connections is a MITM attack: a middle box (a proxy or router) modifies the response payload to inject malicious content or alter the page. Even on an ordinary blog or personal website, an attacker who injects a script can gain compute, violate privacy, or turn the blog's readers into a (minor) DDoS source.

Another type of attack would modify the content of the site to suit the attacker's purpose - either to hurt the author and/or their message. Consider the damage an attacker could do by injecting CSAM onto a person's blog. The victim's life would be ruined long before the wheels of justice turned (if they turn at all). The one mitigating factor is that you'd need reliable control over a relatively stable middle-box to execute this attack, but that's quite feasible. Last but not least, don't underestimate the way software grows. Sooner or later someone is going to implement HTTP Basic authentication over plain HTTP and, needless to say, that's a bad idea.

Look, I don't like it either. I remember when you could telnet into a server and interact with it. That was good for pedagogy and building a mental model of the protocol. But we have to deal with how things are, not how we want them to be.

xantronix•3mo ago
openssl s_client -connect host:port
ranger_danger•3mo ago
Unfortunately this isn't 1999, and bad actors are everywhere. Even ISPs themselves (cough Comcast) have been injecting unsolicited new code into people's webpages for many years now.
username223•3mo ago
It was kind of awesome that you could telnet to port 80, type "GET / HTTP/1.0", press return a couple of times, and receive a web page. Then shitty hotel wifi that injected ads happened, so we had to encrypt traffic that had absolutely no sensitive information.
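That telnet exchange can be sketched with a raw socket, just to show that HTTP/1.0 really is two lines of text. This is an illustrative sketch, not any particular library's API; `example.org` is a stand-in host, and `split_response` is a hypothetical helper.

```python
import socket

def http10_get(host, path="/", port=80):
    """Do exactly what the telnet session did: send 'GET / HTTP/1.0',
    a blank line, and read until the server closes the connection."""
    request = f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"
    with socket.create_connection((host, port)) as sock:
        sock.sendall(request.encode("ascii"))
        chunks = []
        while data := sock.recv(4096):
            chunks.append(data)
    return b"".join(chunks)

def split_response(raw):
    """Split a raw HTTP response into (status line, headers dict, body)."""
    head, _, body = raw.partition(b"\r\n\r\n")
    status, *header_lines = head.decode("iso-8859-1").split("\r\n")
    headers = dict(line.split(": ", 1) for line in header_lines if ": " in line)
    return status, headers, body

# e.g.: status, headers, body = split_response(http10_get("example.org"))
```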
mapontosevenths•3mo ago
Our solution was technical, the problem was political. It should have been solved using criminal penalties for the criminals who did this.
epapsiou•3mo ago
I remember. Stumbled upon
morcus•3mo ago
This seems like it has at least partial overlap with the "small web": see https://kagi.com/smallweb
xavierstein•3mo ago
I think that Homestar Runner did a great short[1] about this for their 25th anniversary.

[1] https://homestarrunner.com/toons/backtoawebsite

mojuba•3mo ago
The early internet was like settlers' shacks, built uncontrollably and unsystematically, whereas the modern web is all skyscrapers, residential and business. Uniform apartments, uniform offices, all looking very similar, differing only in subtle interior design details here and there.

Should we go back to the shack era? Of course not. But maybe we should start a new era of land exploration and start over. It shouldn't necessarily be Internet 3.0, might be something else completely. AR/VR? Possibly although that has already failed once.

Llamamoe•3mo ago
The only thing missing from your analogy is the fact that the shacks were filled with personal diaries and curios, while the skyscrapers are mostly chock-full of homogenous sewage slurry.
bee_rider•3mo ago
Also the shacks weren’t really particularly shabby or anything, they were just more like well-enough-constructed single family homes.

Old websites before scripting became popular were pretty much solid in that boring-tech way. Hardware and networks were not as reliable, but the sites themselves could be fine via simplicity.

Modern overdesigned sites are sort of like modern apartment buildings: shitty build quality under fake plastic marble and wood.

cykros•3mo ago
Well, they did literally often have "this page is under construction" banners on them. And <blink> tags, lol.

I'd take it all back over the Squarespace hellscape the web has become.

beckthompson•3mo ago
If you've visited old mining operations / shacks that's pretty common! There are always some weird choices and cool things to see
bee_rider•3mo ago
> Should we go back to the shack era? Of course not.

This isn’t obvious, at least, we can’t write the idea off with an “of course not.”

mojuba•3mo ago
Keep in mind the early websites were mostly built by an enthusiast minority, technical or not, but willing to learn HTML and Netscape Composer. You can't expect all of humanity to be as enthusiastic. The skyscraper era, no matter how much we all hate it, makes the web more democratic: it gives everyone some standardized space (Facebook, YouTube, etc.) with algorithmized discovery, which is the parking and elevators, if you want to continue the analogy.
gdubs•3mo ago
Hard to live through what social media has done to society over the past decade without at least entertaining the idea that the higher barrier to entry of being online was maybe not a bad thing.
DaveZale•3mo ago
yes, for sure! It was a different time. Early website authors were pioneers. They had something worth sharing and they thought it worthwhile enough to learn some coding. Nobody was trying to push ads and monetize, and there was no ubiquitous tracking or cookies
mojuba•3mo ago
I don't disagree, but notice how it's about the second decade of Web 2.0, not the first one. Profit-driven algorithms are a separate era in their own right. I.e. you can't blame the skyscrapers themselves for your shitty life, you just need to demand more regulation.
_DeadFred_•3mo ago
If the skyscraper is designed with elevators that try to keep me in and away from the first floor so I don't leave I can definitely complain.
krapp•3mo ago
If we're talking about the internet before Eternal September, maybe, but putting up a site on Geocities or Tripod or using Dreamweaver certainly was not a high barrier to entry.
pessimizer•3mo ago
I wouldn't agree that the higher barrier to entry was a good thing, but I also would say that the barrier to entry was actually pretty low, with angelfire, geocities, etc. Dreamweaver + other wysiwyg, and the lack of a necessity of a giant js framework with bundling and tree-shaking.

The problem is that the barrier to entry got too low, so it was necessary for large companies to interpose themselves between producers and audiences, starting with google (becoming something other than a grep for the web, and instead becoming the editor and main income source for the web) and expanding outwards into facebook.

Remember that we started with walled gardens like AOL and Compuserve, and the web (and the end of those companies) was people desperate to break out of them. Now people have been herded in again since the indexers bought the ad companies.

bee_rider•3mo ago
Facebook and YouTube are top-down managed systems, and I think it is a real disservice to the idea of democracy to call this sort of thing “more democratic.” They are democratic like a mall is, which is to say, not.
bryanrasmussen•3mo ago
>AR/VR? Possibly although that has already failed once.

I'm pretty sure it's already failed 3 times.

assimpleaspossi•3mo ago
I like to compare today's web to radio in the late 1800s and early 1900s.

Back then, if you could piece together a transmitter and throw an antenna up, you were a broadcaster and many broadcast whatever they felt like. Just like today's internet.

Social media is the CB radio of the 1970s and 80s when anyone could buy a small rig and do all kinds of weird and wild things for cheap.

But, eventually, something had to rein in all that, and the FCC along with international laws and standards came up to calm all that down. In the same way, I think the internet will eventually become licensed and regulated.

pessimizer•3mo ago
> But, eventually, something had to rein in all that and the FCC along with international laws and standards came up to calm all that down.

No, it actually stayed pretty lively until the 90s, when the government decided that there could be huge monopolies in media, all the stations were bought up by like 6 guys, and were automated to play Disney music 24 hours a day.

Not such a neat story, right?

krapp•3mo ago
The rationale behind the FCC is that it's regulating a limited resource (spectrum space.) The web is not a limited resource (although bandwidth is, but that's a different debate.) The web is also international, and we're already seeing conflicts where one country tries to force their regulations onto another. That metaphor just doesn't work where the web is concerned.

I agree that the web in the US, and specifically large social media platforms, will probably be regulated because that seems to be one of the few things both parties agree on for their own reasons. But more so because the government wants to control information and surveil citizens. I think the balkanization of the web as a whole into smaller, closed networks is probably inevitable.

But what's most depressing of all is how many people in tech and on HN would be thrilled if one needed a license to publish on the internet just because that would implicitly push most people off of the web and leave it for a privileged elite.

As bad as social media can be (and I think its harm is often oversold for political ends) having a space where anyone can publish and communicate and create freely, where different platforms can exist and cater to different needs, where media isn't entirely controlled and gatekept by corporations, is critically important. More important than any other communications paradigm before it, including the printing press.

It's really going to be sad when we burn it all down, because it seems unlikely anyone is going to make something as free and open as the web ever again.

c22•3mo ago
The FCC licenses radio broadcasters because the spectrum is finite. Which finite aspects of the internet do you see driving such eventual practice?
ptero•3mo ago
> Should we go back to the shack era? Of course not.

I am not sure. Different people want different things. I run a Hetzner cloud instance where I toss up a simple webpage with locally hosted travel photos for friends and family. And a Jupyter server (with a very weak password) on the same instance for myself and any friend when we want something more powerful than a calculator.

And this messy, improperly organized, breaking all design patterns way works just fine for me. So I'm fine with a shack for personal communication and as a personal space. My 2c.

thaw13579•3mo ago
I think a better analogy is large corp built & owned vs small artisan businesses & single family homes. Why not have both?
cal85•3mo ago
The last sentence can’t be true. If you go looking for them, they’re easy to find. The problem is you don’t.
MountDoom•3mo ago
Sort of. There are many confounding factors here. For one, they're harder to find because the number of personal websites doesn't scale as quickly as commercial content and SEO spam. It's also a bit of a vicious cycle, because if your website is less likely to be read by anyone, why bother writing it in the first place?

But for the most part, the very people bemoaning the current state of affairs then go back to scrolling through TikTok / Instagram / Facebook / Reddit.

cal85•3mo ago
Or, when you do go looking, it doesn’t feel the same. Why?
prewett•3mo ago
Because you were 25 years younger. When you are, say, 20 years old, people >= 20 years old are likely to be interesting, which was pretty much everyone with a web page. When you are 45 years old, writings from 20 year olds are much less interesting, on average.
LarsDu88•3mo ago
Does anyone remember geocities, tripod, webrings, and Amazon affiliate links?

Pepperidge farm remembers...

DaveZale•3mo ago
I am currently building on neocities.org - the spiritual successor to the old GeoCities
DaveZale•3mo ago
I am compiling a website on neocities.org ... for about two months now, and it won't be complete for another year. It's basically a place for photos, maps and descriptions of a local community xeriscape garden with about 400 specimens.

Others will take photos and videos of the place throughout the year, and post to social media, where they instantly get a couple dozen thumbs up, and gloat about it. That is not my intention. I want a coffee table book.

krapp•3mo ago
Websites do still exist. People do know what they are. There are more of them on the web than there ever have been. Nothing is stopping anyone from creating a website if they want. Nothing is stopping you.
pessimizer•3mo ago
They will not be indexed by search engines, though, so you had better email all your friends so you'll have a few visitors.
chneu•3mo ago
Yes they will?

I run/host a bunch of personal websites for friends.

I do nothing special to get them indexed and they are all on search engines.

techjamie•3mo ago
I've created a few sites and never explicitly told search engines about them, and they got picked up just fine surprisingly quickly.
TypicalHog•3mo ago
https://wiby.me/
tuukkah•3mo ago
I think e.g. Mastodon with IndieWeb is a way to fight against the enshittification and to bring back the good from the early years. "The IndieWeb is a people-focused alternative to the “corporate web”. We are a community of independent and personal websites based on the principles of: owning your domain and using it as your primary online identity, publishing on your own site first (optionally elsewhere), and owning your content." https://indieweb.org/
jmclnx•3mo ago
This comes up every so often, and I always post something like this :)

It still exists with the Gemini and Gopher protocols:

https://www.linux-magazine.com/Issues/2021/245/The-Rise-of-t...

https://en.wikipedia.org/wiki/Gemini_(protocol)

https://en.wikipedia.org/wiki/Gopher_(protocol)

https://geminiprotocol.net/

https://wiki.sdf.org/doku.php?id=gemini_site_setup_and_hosti...

https://sdf.org/?tutorials/gopher

I have moved my site to Gemini (and Gopher); maintaining both is far easier than what I had to go through with the web/HTML.

techjamie•3mo ago
I can't speak for Gemini, but when I found out about Gopher I read the specification and made a very simple server implementation for myself. If you're looking for a quick project to play with, it's not a bad one to try.

It didn't support everything, mostly just basic browsing and linking. But it was cool to build something mostly compliant to a spec that quickly.
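A server like the one described above fits in a few dozen lines. This is a minimal sketch in the spirit of that comment, not the commenter's actual code: Gopher (RFC 1436) clients send one selector line and get back either a file or a tab-delimited menu terminated by a lone dot. The host, port, menu entries, and file contents here are all made up, and a real server would sanitize selectors.

```python
import socketserver

HOST, PORT = "localhost", 7070  # the canonical Gopher port is 70

# Item types: '0' = text file, '1' = submenu, 'i' = non-selectable info line.
MENU = [
    ("i", "A tiny gopher hole", "", "error.host", 0),
    ("0", "About this server", "/about.txt", HOST, PORT),
]
FILES = {"/about.txt": "Hand-rolled after reading the spec.\r\n"}

def render_menu(items):
    # Menus are CRLF-separated, tab-delimited lines, terminated by a lone dot.
    lines = [f"{t}{title}\t{sel}\t{host}\t{port}"
             for t, title, sel, host, port in items]
    return "\r\n".join(lines) + "\r\n.\r\n"

class GopherHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # The client sends a single selector line; empty means "root menu".
        selector = self.rfile.readline().decode("ascii", "replace").strip()
        if selector in FILES:
            reply = FILES[selector] + ".\r\n"
        else:
            reply = render_menu(MENU)
        self.wfile.write(reply.encode("ascii"))

# To serve: socketserver.ThreadingTCPServer((HOST, PORT), GopherHandler).serve_forever()
```

That really is the whole protocol for basic browsing and linking, which is why it makes such a quick weekend project.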

silexia•3mo ago
My website joelx.com has been available for 18 years now. Lots of articles.
bigjobby•3mo ago
I mean, there was a lot of absolute toss also.

I like the current state of webpages. I'm confident they will load quickly and render properly. I'm happy to read streamlined posts, and I find the bigger sites well organised and much easier to navigate than sites of old.