
Does anyone remember websites?

http://tttthis.com/rememberwebsites.php/
38•lr0•3h ago

Comments

snitzr•3h ago
https://www.spacejam.com/1996/
devin•3h ago
If this is true to the original, I am surprised to see this was a table-oriented layout and not a bitmap image with clickable x,y coordinates (an image map).
hosh•3h ago
Table-oriented layouts were a thing back then too.
zahlman•2h ago
It was common to make tables and use them to assemble a bitmap, where each cell had zero border/margin/padding and an exact size, and contained a "slice" of the image. Web authoring tools (and Photoshop) even had explicit support for generating this sort of thing, as I recall. This was I guess simpler to automate than defining clickable regions of a single image, and it allowed for the individual pieces of the image to be requested in parallel on slow connections (adding another dimension of progressive loading).
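A minimal sketch of that slicing workflow in Python, assuming Pillow is installed; the input file logo.png and the 100-pixel tile size are hypothetical stand-ins for whatever the authoring tool was given:

    # Cut an image into tiles and emit the zero-border table that
    # reassembles them, in the spirit of the old "Save for Web" slicers.
    from PIL import Image

    TILE = 100  # slice size in pixels (hypothetical)
    img = Image.open("logo.png")
    w, h = img.size

    rows = []
    for y in range(0, h, TILE):
        cells = []
        for x in range(0, w, TILE):
            tile = img.crop((x, y, min(x + TILE, w), min(y + TILE, h)))
            name = f"slice_{y}_{x}.png"
            tile.save(name)
            # Exact dimensions and no border/spacing, so slices butt together.
            cells.append(f'<td><img src="{name}" width="{tile.width}" height="{tile.height}" alt=""></td>')
        rows.append("<tr>" + "".join(cells) + "</tr>")

    print('<table border="0" cellspacing="0" cellpadding="0">' + "".join(rows) + "</table>")

Each slice is then a separate request the browser can make in parallel, which is the progressive-loading effect described above.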
oxguy3•2h ago
Ah, nothing like trying to save the logo from such a website, then discovering the image you saved is partially cut off and includes the navbar behind it instead of having a transparent background.
devin•1h ago
Yeah, I remember this. Macromedia Fireworks had a slice tool that I used quite a bit. You'd basically make an image which was your website, and then do all the layout with zero-border tables. But for me, this was what I was doing circa 2004, before CSS was dominant. Earlier software from the '96 era like FrontPage would, I think, use bitmaps wholesale, but maybe I'm misremembering.
zahlman•3h ago
> You’ve enabled HTTPS-Only Mode for enhanced security, and a HTTPS version of tttthis.com is not available.

This, too, is nostalgic, in a way.

mapontosevenths•3h ago
It's silly to try and encrypt everything, and arguably it can be worse for the author's privacy.

Sometimes a blog post on a plain HTTP website doesn't need to be encrypted.

bee_rider•3h ago
What’s the argument that it can be worse for the author’s privacy?

In general, I think we should encrypt everything. The more encrypted stuff floating around, the less it stands out, and the better for everybody’s privacy. Of course, nowadays encrypted content is quite common. But it didn’t become that way without effort!

Tepix•3h ago
If you don't encrypt everything, a malicious actor like some ISPs can inject nasty things like pervasive tracking or zero-day exploits.
cpa•3h ago
It helps against this kind of stuff (2015) https://blog.fox-it.com/2015/04/20/deep-dive-into-quantum-in...
simpaticoder•2h ago
I disagree. The primary threat model for unencrypted HTTP connections is a MITM attack: a middle box (a proxy or router) modifies the response payload to inject malicious content or alter what's already there. Even for an ordinary blog or personal website, an attacker who injects a script can steal compute from the blog's users, violate their privacy, or turn them into a (minor) DDoS source.

Another type of attack would modify the content of the site to suit the attacker's purpose, either to hurt the author or their message. Consider the damage an attacker could do by injecting CSAM onto a person's blog: the victim's life would be ruined long before the wheels of justice turned (if they turned at all). The one mitigating factor is that you'd need reliable control over a relatively stable middle box to execute this attack, but that's quite feasible. Last but not least, don't underestimate the way software grows: sooner or later someone is going to implement HTTP basic authentication over plain HTTP and, needless to say, that's a bad idea.

Look, I don't like it either. I remember when you could telnet into a server and interact with it. That was good for pedagogy and building a mental model of the protocol. But we have to deal with how things are, not how we want them to be.

ranger_danger•2h ago
Unfortunately this isn't 1999, and bad actors are everywhere. Even ISPs themselves (cough Comcast) have been injecting unsolicited new code into people's webpages for many years now.
username223•3h ago
It was kind of awesome that you could telnet to port 80, type "GET / HTTP/1.0", press return a couple of times, and receive a web page. Then shitty hotel wifi that injected ads happened, so we had to encrypt traffic that had absolutely no sensitive information.
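The same exchange still works from a raw socket, which is a quick way to see how little was going on. A minimal sketch in Python; example.com stands in for the real host:

    # Speak HTTP/1.0 over a plain TCP socket, as you once could via telnet.
    import socket

    with socket.create_connection(("example.com", 80)) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        response = b""
        while chunk := s.recv(4096):
            response += chunk

    # Status line, headers, and the start of the body, all readable as text.
    print(response.decode("latin-1")[:500])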
epapsiou•3h ago
I remember. Stumbled upon
morcus•3h ago
This seems like it has at least partial overlap with the "small web": see https://kagi.com/smallweb
xavierstein•3h ago
I think that Homestar Runner did a great short[1] about this for their 25th anniversary.

[1] https://homestarrunner.com/toons/backtoawebsite

mojuba•3h ago
The early internet was like settlers' shacks, built uncontrollably and unsystematically, whereas the modern web is all skyscrapers, residential and business: uniform apartments and uniform offices, all looking very similar, differing only in subtle interior-design details here and there.

Should we go back to the shack era? Of course not. But maybe we should start a new era of land exploration and start over. It needn't be Internet 3.0; it might be something else completely. AR/VR? Possibly, although that has already failed once.

Llamamoe•3h ago
The only thing missing from your analogy is the fact that the shacks were filled with personal diaries and curios, while the skyscrapers are mostly chock-full of homogenous sewage slurry.
bee_rider•3h ago
Also the shacks weren't really particularly shabby or anything; they were just more like well-enough-constructed single-family homes.

Old websites before scripting became popular were pretty much solid in that boring-tech way. Hardware and networks were not as reliable, but the sites themselves could be fine via simplicity.

Modern overdesigned sites are sort of like modern apartment buildings: shitty build quality under fake plastic marble and wood.

beckthompson•2h ago
If you've visited old mining operations / shacks, that's pretty common! There are always some weird choices and cool things to see.
bee_rider•3h ago
> Should we go back to the shack era? Of course not.

This isn't obvious; at the very least, we can't write the idea off with an “of course not.”

mojuba•2h ago
Keep in mind that early websites were mostly built by an enthusiast minority, technical or not, who were willing to learn HTML and Netscape Composer. You can't expect all of humanity to be as enthusiastic. The skyscraper era, no matter how much we all hate it, makes the web more democratic: it gives everyone some standardized space (Facebook, YouTube, etc.) with algorithmic discovery, which is the parking and elevators if you want to continue the analogy.
gdubs•2h ago
Hard to live through what social media has done to society over the past decade without at least entertaining the idea that the higher barrier to entry of being online was maybe not a bad thing.
DaveZale•2h ago
yes, for sure! It was a different time. Early website authors were pioneers. They had something worth sharing and they thought it worthwhile enough to learn some coding. Nobody was trying to push ads and monetize, and there was no ubiquitous tracking or cookies
mojuba•2h ago
I don't disagree, but notice how it's about the second decade of Web 2.0, not the first one. Profit-driven algorithms are a separate era in their own right. I.e., you can't blame the skyscrapers themselves for your shitty life; you just need to demand more regulation.
_DeadFred_•19m ago
If the skyscraper is designed with elevators that try to keep me in and away from the first floor so I don't leave I can definitely complain.
krapp•2h ago
If we're talking about the internet before Eternal September, maybe, but putting up a site on Geocities or Tripod or using Dreamweaver certainly was not a high barrier to entry.
pessimizer•2h ago
I wouldn't agree that the higher barrier to entry was a good thing, but I would also say that the barrier to entry was actually pretty low, with Angelfire, GeoCities, etc., Dreamweaver and other WYSIWYG tools, and no need for a giant JS framework with bundling and tree-shaking.

The problem is that the barrier to entry got too low, so it was necessary for large companies to interpose themselves between producers and audiences, starting with Google (becoming something other than a grep for the web, and instead becoming the editor and main income source for the web) and expanding outwards into Facebook.

Remember that we started with walled gardens like AOL and Compuserve, and the web (and the end of those companies) was people desperate to break out of them. Now people have been herded in again since the indexers bought the ad companies.

bee_rider•1h ago
Facebook and YouTube are top-down managed systems, and I think it is a real disservice to the idea of democracy to call this sort of thing “more democratic.” They are democratic like a mall is, which is to say, not.
bryanrasmussen•2h ago
>AR/VR? Possibly although that has already failed once.

I'm pretty sure it's already failed 3 times.

assimpleaspossi•2h ago
I like to compare today's web to radio in the late 1800s and early 1900s.

Back then, if you could piece together a transmitter and throw an antenna up, you were a broadcaster and many broadcast whatever they felt like. Just like today's internet.

Social media is the CB radio of the 1970s and 80s when anyone could buy a small rig and do all kinds of weird and wild things for cheap.

But, eventually, something had to rein in all that, and the FCC along with international laws and standards came up to calm it all down. In the same way, I think the internet will eventually become licensed and regulated.

pessimizer•2h ago
> But, eventually, something had to rein in all that, and the FCC along with international laws and standards came up to calm it all down.

No, it actually stayed pretty lively until the 90s, when the government decided that there could be huge monopolies in media, all the stations were bought up by like 6 guys, and were automated to play Disney music 24 hours a day.

Not such a neat story, right?

krapp•2h ago
The rationale behind the FCC is that it regulates a limited resource (spectrum space). The web is not a limited resource (although bandwidth is, but that's a different debate). The web is also international, and we're already seeing conflicts where one country tries to force its regulations onto another. That metaphor just doesn't work where the web is concerned.

I agree that the web in the US, and specifically large social media platforms, will probably be regulated, because that seems to be one of the few things both parties agree on, each for their own reasons, but even more because the government wants to control information and surveil citizens. I think the balkanization of the web as a whole into smaller, closed networks is probably inevitable.

But what's most depressing of all is how many people in tech and on HN would be thrilled if one needed a license to publish on the internet just because that would implicitly push most people off of the web and leave it for a privileged elite.

As bad as social media can be (and I think its harm is often oversold for political ends), having a space where anyone can publish, communicate, and create freely, where different platforms can exist and cater to different needs, and where media isn't entirely controlled and gatekept by corporations is critically important. More important than any other communications paradigm before it, including the printing press.

It's really going to be sad when we burn it all down, because it seems unlikely anyone is going to make something as free and open as the web ever again.

c22•2h ago
The FCC licenses radio broadcasters because the spectrum is finite. Which finite aspects of the internet do you see driving such eventual practice?
ptero•2h ago
> Should we go back to the shack era? Of course not.

I am not sure. Different people want different things. I run a Hetzner cloud instance where I toss up a simple webpage with locally hosted travel photos for friends and family, and a Jupyter server (with a very weak password) on the same instance for myself and any friend who wants something more powerful than a calculator.

And this messy, improperly organized, breaks-every-design-pattern way of working is just fine for me. So I'm fine with a shack for personal communication and as a personal space. My 2c.

cal85•3h ago
The last sentence can’t be true. If you go looking for them, they’re easy to find. The problem is you don’t.
MountDoom•3h ago
Sort of. There are many confounding factors here. For one, they're harder to find because the number of personal websites doesn't scale as quickly as commercial content and SEO spam. It's also a bit of a vicious cycle: if your website is less likely to be read by anyone, why bother writing it in the first place?

But for the most part, the very people bemoaning the current state of affairs then go back to scrolling through TikTok / Instagram / Facebook / Reddit.

cal85•2h ago
Or, when you do go looking, it doesn’t feel the same. Why?
prewett•2h ago
Because you were 25 years younger. When you are, say, 20 years old, people >= 20 years old are likely to be interesting, which was pretty much everyone with a web page. When you are 45 years old, writings from 20-year-olds are much less interesting, on average.
LarsDu88•3h ago
Does anyone remember geocities, tripod, webrings, and Amazon affiliate links?

Pepperidge farm remembers...

DaveZale•2h ago
I am currently building on neocities.org, the modern-day revival of the old GeoCities.
DaveZale•2h ago
I have been compiling a website on neocities.org for about two months now, and it won't be complete for another year. It's basically a place for photos, maps, and descriptions of a local community xeriscape garden with about 400 specimens.

Others will take photos and videos of the place throughout the year, and post to social media, where they instantly get a couple dozen thumbs up, and gloat about it. That is not my intention. I want a coffee table book.

krapp•2h ago
Websites do still exist. People do know what they are. There are more of them on the web than there ever have been. Nothing is stopping anyone from creating a website if they want. Nothing is stopping you.
pessimizer•2h ago
They will not be indexed by search engines, though, so you had better email all your friends so you'll have a few visitors.
chneu•2h ago
Yes they will?

I run/host a bunch of personal websites for friends.

I do nothing special to get them indexed and they are all on search engines.

techjamie•24m ago
I've created a few sites and never explicitly told search engines about them, and they got picked up just fine surprisingly quickly.
TypicalHog•2h ago
https://wiby.me/
tuukkah•2h ago
I think e.g. Mastodon with IndieWeb is a way to fight against the enshittification and to bring back the good from the early years. "The IndieWeb is a people-focused alternative to the “corporate web”. We are a community of independent and personal websites based on the principles of: owning your domain and using it as your primary online identity, publishing on your own site first (optionally elsewhere), and owning your content." https://indieweb.org/
jmclnx•2h ago
This comes up every so often, and I always post something like this :)

It still exists with the Gemini protocol and Gopher:

https://www.linux-magazine.com/Issues/2021/245/The-Rise-of-t...

https://en.wikipedia.org/wiki/Gemini_(protocol)

https://en.wikipedia.org/wiki/Gopher_(protocol)

https://geminiprotocol.net/

https://wiki.sdf.org/doku.php?id=gemini_site_setup_and_hosti...

https://sdf.org/?tutorials/gopher

I have moved my site to Gemini (and Gopher); maintaining both is far easier than what I had to go through with the web/HTML.

techjamie•25m ago
I can't speak for Gemini, but when I found out about Gopher I read the specification and made a very simple server implementation for myself. If you're looking for a quick project to play with, it's not a bad one to try.

It didn't support everything, mostly just basic browsing and linking. But it was cool to build something mostly compliant to a spec that quickly.
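For a sense of how small the protocol is, here is a minimal sketch of such a server using Python's standard library; the localhost/7070 values and the single hard-coded menu item are placeholders, and there is deliberately no path sanitization (a sketch, not production code):

    # A toy Gopher server: serve a one-item menu for the empty selector,
    # and raw files from the current directory for anything else.
    import socketserver

    HOST, PORT = "localhost", 7070

    class GopherHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # A Gopher request is just a selector line terminated by CRLF.
            selector = self.rfile.readline().decode("utf-8").strip()
            if selector == "":
                # Empty selector = root menu; item type 0 is a text file.
                menu = f"0Hello world\thello.txt\t{HOST}\t{PORT}\r\n.\r\n"
                self.wfile.write(menu.encode("utf-8"))
            else:
                try:
                    with open(selector, "rb") as f:
                        self.wfile.write(f.read())
                except OSError:
                    self.wfile.write(b"3Not found\terror\tnone\t0\r\n.\r\n")

    with socketserver.TCPServer((HOST, PORT), GopherHandler) as server:
        server.serve_forever()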

silexia•1m ago
My website joelx.com has been available for 18 years now. Lots of articles.

GNU Health

https://www.gnuhealth.org/about-us.html
142•smartmic•2h ago•31 comments

The <output> Tag

https://denodell.com/blog/html-best-kept-secret-output-tag
558•todsacerdoti•9h ago•132 comments

Microsoft Amplifier

https://github.com/microsoft/amplifier
98•JDEW•2h ago•76 comments

Vibing a non-trivial Ghostty feature

https://mitchellh.com/writing/non-trivial-vibing
93•skevy•3h ago•33 comments

Show HN: Gnokestation Is an Ultra Lightweight Web Desktop Environment

https://gnokestation.netlify.app
9•edmundsparrow•42m ago•3 comments

Testing two 18 TB white label SATA hard drives from datablocks.dev

https://ounapuu.ee/posts/2025/10/06/datablocks-white-label-drives/
29•thomasjb•5d ago•13 comments

AMD and Sony's PS6 chipset aims to rethink the current graphics pipeline

https://arstechnica.com/gaming/2025/10/amd-and-sony-tease-new-chip-architecture-ahead-of-playstat...
241•zdw•13h ago•263 comments

The World Trade Center under construction through photos, 1966-1979

https://rarehistoricalphotos.com/twin-towers-construction-photographs/
119•kinderjaje•4d ago•48 comments

Superpowers: How I'm using coding agents in October 2025

https://blog.fsck.com/2025/10/09/superpowers/
156•Ch00k•10h ago•96 comments

Crypto-Current (2021)

https://zerophilosophy.substack.com/p/crypto-current
5•keepamovin•5d ago•3 comments

Windows Subsystem for FreeBSD

https://github.com/BalajeS/WSL-For-FreeBSD
150•rguiscard•10h ago•41 comments

A Quiet Change to RSA

https://www.johndcook.com/blog/2025/10/06/a-quiet-change-to-rsa/
55•ibobev•4d ago•18 comments

How to Check for Overlapping Intervals

https://zayenz.se/blog/post/how-to-check-for-overlapping-intervals/
26•birdculture•2h ago•7 comments

I built physical album cards with NFC tags to teach my son music discovery

https://fulghum.io/album-cards
502•jordanf•21h ago•175 comments

Building a JavaScript Runtime from Scratch using C

https://devlogs.xyz/blog/building-a-javaScript-runtime
23•redbell•3d ago•15 comments

A Library for Fish Sounds

https://nautil.us/a-library-for-fish-sounds-1239697/
23•pistolpete5•4d ago•4 comments

Wilson's Algorithm

https://cruzgodar.com/applets/wilsons-algorithm/
10•FromTheArchives•4h ago•1 comment

(Re)Introducing the Pebble Appstore

https://ericmigi.com/blog/re-introducing-the-pebble-appstore/
239•duck•20h ago•43 comments

How hard do you have to hit a chicken to cook it? (2020)

https://james-simon.github.io/blog/chicken-cooking/
150•jxmorris12•16h ago•90 comments

Daniel Kahneman opted for assisted suicide in Switzerland

https://www.bluewin.ch/en/entertainment/nobel-prize-winner-opts-for-suicide-in-switzerland-261946...
403•kvam•10h ago•356 comments

Tangled, a Git collaboration platform built on atproto

https://blog.tangled.org/intro
275•mjbellantoni•20h ago•71 comments

Programming in the Sun: A Year with the Daylight Computer

https://wickstrom.tech/2025-10-10-programming-in-the-sun-a-year-with-the-daylight-computer.html
140•ghuntley•18h ago•47 comments

Let's Take Esoteric Programming Languages Seriously

https://feelingof.com/episodes/078/
63•strombolini•3d ago•14 comments

Does our “need for speed” make our wi-fi suck?

https://orb.net/blog/does-speed-make-wifi-suck
236•jamies•23h ago•278 comments

Show HN: I invented a new generative model and got accepted to ICLR

https://discrete-distribution-networks.github.io/
610•diyer22•1d ago•82 comments

AV2 video codec delivers 30% lower bitrate than AV1, final spec due in late 2025

https://videocardz.com/newz/av2-video-codec-delivers-30-lower-bitrate-than-av1-final-spec-due-in-...
233•ksec•9h ago•140 comments

Learn Turbo Pascal – a video series originally released on VHS

https://www.youtube.com/watch?v=UOtonwG3DXM
91•AlexeyBrin•6h ago•32 comments

Synthetic aperture radar autofocus and calibration

https://hforsten.com/synthetic-aperture-radar-autofocus-and-calibration.html
160•nbernard•3d ago•9 comments

Firefox is the best mobile browser

https://kelvinjps.com/blog/firefox-best-mobile-browser/
167•kelvinjps10•4h ago•94 comments

Show HN: A Digital Twin of my coffee roaster that runs in the browser

https://autoroaster.com/
120•jvkoch•5d ago•35 comments