
Igalia, Servo, and the Sovereign Tech Fund

https://www.igalia.com/2025/10/09/Igalia,-Servo,-and-the-Sovereign-Tech-Fund.html
224•robin_reala•4h ago•33 comments

Ryanair flight landed at Manchester airport with six minutes of fuel left

https://www.theguardian.com/business/2025/oct/10/ryanair-flight-landed-at-manchester-airport-with...
147•mazokum•1h ago•118 comments

Show HN: I invented a new generative model and got accepted to ICLR

https://discrete-distribution-networks.github.io/
320•diyer22•7h ago•34 comments

I'm in Vibe Code Hell

https://blog.boot.dev/education/vibe-code-hell/
75•wagslane•53m ago•25 comments

Notes on Switching to Helix from Vim

https://jvns.ca/blog/2025/10/10/notes-on-switching-to-helix-from-vim/
22•chmaynard•2h ago•4 comments

The Molecular Basis of Long Covid Brain Fog

https://www.yokohama-cu.ac.jp/english/news/20251001takahashi.html
43•onnnon•1h ago•13 comments

Ask HN: What's the best hackable smart TV?

36•xrd•4d ago•33 comments

NanoMi: Open-source transmission electron microscope

https://sites.google.com/view/nanomi-org?usp=sharing
23•pillars•2d ago•1 comment

All-Natural Geoengineering with Frank Herbert's Dune

https://www.governance.fyi/p/all-natural-geoengineering-with-frank
25•toomuchtodo•2h ago•6 comments

A story about bypassing Air Canada's in-flight network restrictions

https://ramsayleung.github.io/en/post/2025/a_story_about_bypassing_air_canadas_in-flight_network_...
113•samray•8h ago•88 comments

Ohno Type School

https://ohnotype.co/blog/ohno-type-school-a
127•tobr•4d ago•49 comments

Boring Company cited for almost 800 environmental violations in Las Vegas

https://www.propublica.org/article/elon-musk-boring-company-violations-fines-vegas-loop
16•maxeda•29m ago•2 comments

My approach to building large technical projects (2023)

https://mitchellh.com/writing/building-large-technical-projects
249•mad2021•12h ago•35 comments

Nobel Peace Prize 2025: María Corina Machado

https://www.nobelprize.org/prizes/peace/2025/summary/
464•pykello•7h ago•465 comments

Weave (YC W25) is hiring a founding AI engineer

https://www.ycombinator.com/companies/weave-3/jobs/SqFnIFE-founding-ai-engineer
1•adchurch•4h ago

Origami Patterns Solve a Major Physics Riddle

https://www.quantamagazine.org/origami-patterns-solve-a-major-physics-riddle-20251006/
25•westurner•4d ago•1 comment

Python 3.14 is here. How fast is it?

https://blog.miguelgrinberg.com/post/python-3-14-is-here-how-fast-is-it
653•pjmlp•1d ago•477 comments

Examples Are the Best Documentation

https://rakhim.exotext.com/examples-are-the-best-documentation
304•Bogdanp•21h ago•115 comments

OpenGL is getting mesh shaders as well, via GL_EXT_mesh_shader

https://www.supergoodcode.com/mesh-shaders-in-the-current-year/
66•pjmlp•4h ago•57 comments

Show HN: Lights Out: my 2D Rubik's Cube-like Game

https://raymondtana.github.io/projects/pages/Lights_Out.html
9•raymondtana•12h ago•1 comment

QA-use-MCP: MCP for E2E testing

https://www.npmjs.com/package/@desplega.ai/qa-use-mcp
3•tarasyarema•4d ago•1 comment

Parallelizing Cellular Automata with WebGPU Compute Shaders

https://vectrx.substack.com/p/webgpu-cellular-automata
43•ibobev•7h ago•5 comments

An MVCC-like columnar table on S3 with constant-time deletes

https://www.shayon.dev/post/2025/277/an-mvcc-like-columnar-table-on-s3-with-constant-time-deletes/
28•shayonj•4d ago•3 comments

I Switched from Htmx to Datastar

https://everydaysuperpowers.dev/articles/why-i-switched-from-htmx-to-datastar/
250•ksec•9h ago•179 comments

PSA: Always use a separate domain for user content

https://www.statichost.eu/blog/google-safe-browsing/
113•ericselin•3h ago•102 comments

Fascism Can't Mean Both a Specific Ideology and a Legitimate Target

https://www.astralcodexten.com/p/fascism-cant-mean-both-a-specific
12•feross•26m ago•3 comments

A small number of samples can poison LLMs of any size

https://www.anthropic.com/research/small-samples-poison
1076•meetpateltech•1d ago•396 comments

Static Bundle Object: Modernizing Static Linking

https://medium.com/@eyal.itkin/static-bundle-object-modernizing-static-linking-f1be36175064
24•ingve•2d ago•15 comments

Show HN: I've built a tiny hand-held keyboard

https://github.com/mafik/keyer
377•mafik•1d ago•102 comments

LLMs are mortally terrified of exceptions

https://twitter.com/karpathy/status/1976077806443569355
285•nought•23h ago•136 comments

PSA: Always use a separate domain for user content

https://www.statichost.eu/blog/google-safe-browsing/
113•ericselin•3h ago

Comments

duxup•3h ago
It feels like unless you're one of the big social media companies, accepting user content is slowly becoming a larger and larger risk.
jacquesm•3h ago
It always was. You're one upload and a complaint to your ISP/Google/AWS/MS away from having your account terminated.
blenderob•2h ago
But something has definitely changed over the past few years. Back in the day, it felt completely normal for individuals to spin up and run their own forums. Small communities built and maintained by regular people. How many new truly independent, individual-run forums can you name today? Hardly any. Instead we keep hearing about long-time community sites shutting down because individuals can no longer handle the risks of user content. I've seen so many posts right here on HN announcing those kinds of shutdowns.
Imustaskforhelp•2h ago
I feel like yes, forums are being closed because they have migrated to the likes of Discord.

I have mixed opinions about Discord and, if I can be honest, mixed opinions about forums as well.

My preference is to take things like forums and move them over to xmpp/(IRC?)/(Signal?)/(Matrix, most preferred).

There are bridges for Matrix <-> IRC as well, if that's something that interests you; there are bridges for everything, but I prefer Matrix with Cinny, and I generally think that due to its decentralized nature it might be better than centralized forums as well.

morkalork•2h ago
Is it consolidation of services? Waaaaay back in the day, imageboards like 4chan were "one complaint away from being shut down" but 24-hours later they'd be up again on another rag-tag hosting provider. Nowadays it's like one complaint to cloudflare or AWS and the site is dead dead.
pixl97•2h ago
>How many new truly independent, individual-run forums can you name today?

Almost none, but it's due to a lot of complicated factors and not just the direct risk of user content.

Take moderation, even of content that won't get you banned by your ISP. It sucks. Nobody in their right mind would want to do it. There are countless bots and trolls that will flood your forums for whatever cause they champion.

Then there are the DDoS floods because you pissed off said bots and trolls. These can make the forums unaffordable and piss off your ISP.

But even if nothing goes wrong, popularity is a risk in itself. In the past there was stuff like the Slashdot effect where your site would go down for a while. But now if your small site became popular on tiktok for some reason 20 million people could show up. Even if your site can stand up to that, how will you moderate it? How will you pay for the bandwidth?

Oh, and will you get any advertisers because of said user content? How are you going to pay for the site?

Oh, also you're competing with massive sites for eyeballs, how are you going to get actual users?

wahnfrieden•2h ago
Any services successfully offloading UGC to other moderated platforms? E.g. developer tools relying on GitHub instead of storing source/assets in the service itself, so that Microsoft can take care of most moderation needs. But are there consumer apps that do things like this?
jacquesm•2h ago
I think imgur and disqus are good examples of that, there are probably quite a few.
dylan604•2h ago
You're equally just one fake report to an automated system away from having your account shut down. So, yes, your actions have consequences, but more worrying to me is the ability of someone with a grudge to cause consequences for you as well.
jacquesm•1h ago
This is a direct consequence of centralization of services. We're doing this to ourselves.
fukka42•3h ago
Still not sure why it's legal for Google to slander companies like this. They often have no proof or it's a false positive, meanwhile they're screaming about how malicious you are.
jacquesm•3h ago
Because Google has absolutely nothing to lose and you do; besides that, they can outlast anybody except nation states in court.
fukka42•2h ago
How does this answer the question of legality?
jacquesm•2h ago
Questions of legality are answered by a judge, not by a forum
fukka42•2h ago
Could you please review the HN guidelines? Thanks.

https://news.ycombinator.com/newsguidelines.html#comments

ericselin•3h ago
Good question, it probably shouldn't be legal. Enforcing the law on these behemoths is another problem, though... :)
acoustics•3h ago
Notably, this post did not examine whether any of the sites it was hosting on this domain were malicious/misleading.
fukka42•2h ago
I'm not asking about this specific case. There are plenty of examples of Google wrongly accusing others of being malicious with massive business impact
progbits•3h ago
Hosts phishing sites, gets blocked by anti phishing mechanism. Works as expected from my point of view.

Get yourself on public suffix list or get better moderation. But of course just moaning about bad google is easier.

ericselin•2h ago
You are right, of course. I'm not sure if those of you who disagree with me think that Safe Browsing did its job (which it did!), that Safe Browsing is a good thing (which it maybe is, but which I slightly disagree with), or that it's ok that Google monitors everything everyone does.

The last point is actually the one I'm trying to make.

shadowgovt•2h ago
There should be a concept, sort of an inverse of tragedy of the commons, for the positive feedback loop of many users providing big data to a company that can use that data to benefit many users.

From spam blocking that builds heuristics fed by the spam people manually flag in Gmail, to Safe Browsing using attacks on users' Chrome as a signal, to their voice recognition engine leapfrogging the industry standard a few years back because they trained it on the low-quality signal from GOOG411 calls, Google keeps building product by harvesting user data... And users keep signing up because the resulting product is good.

This puts a lot of power in their hands but I don't think it's default bad... If it becomes bad, users leave and Google starts to lose their quality signal, so they're heavily incentivized to provide features users want to retain them.

This does make it hard to compete with them. In the US at least, antitrust enforcement has generally been about user harm, not actually market harm. If a company has de facto control but customers aren't getting screwed, that's fine, because ultimately the customer matters (and nobody else is owed a shot at being a Google).

mkishi•2h ago
It's hard to get that point because you're conflating two different stories.

Folks around here are generally uneasy about tracking in general too, but remove big brother monitoring from Safe Browsing and this story could still be the same: whole domain blacklisted by Google, only due to manual reporting instead.

"Oh, but a human reviewer would've known `*.statichost.eu` isn't managed by us"—not in a lot of cases, not really.

freehorse•2h ago
You are right, but then again, nobody flags facebook because of the scamming taking place on some facebook pages.
tartarus4o•2h ago
"Might makes right" as they say.

There is no real way a normal person even can flag facebook.

gkbrk•1h ago
If youtube.com doesn't end up on the Safe Browsing blacklist because of phishing videos, but your own website can easily end up there, it's a pretty clear case of Google abusing their power.
AlienRobot•55m ago
What is a phishing video?
jeroenhd•15m ago
YouTube hosts millions of videos telling people that they are the government/your bank and that you should move money/contact a scam center/buy cryptocurrency. Even worse is the fact that you can pay to turn these videos into ads that will roll in front of other videos.

On the whole of YouTube, it's a tiny sliver of a percentage, but because YouTube has grown too large to moderate, it's still hosting these videos.

If Google applied the same rules they apply to the safe browsing list, they'd probably get YouTube flagged multiple times a week.

jeroenhd•7m ago
YouTube doesn't allow you to put your credentials into a text box and hit send. Google Sites, on the other hand, does pose a risk, but it'll likely be treated the same as any other domain on the PSL.

In my experience, safe browsing does theoretically allow you to report scams and phishing in terms of user generated content, but it won't apply unless there's an actual interactive web page on the other end of the link.

There is the occasional false positive but many good sites that end up on that list are there because their WordPress plugin got hacked and somewhere on their site they are actually hosting malware.

I've contacted the owners of hacked websites hosting phishing and malware content several times, and most of the time I've been accused of being the actual hacker or I've been told that I'm lying. I've given up trying to be the good guy; these days I just report the websites to Google and Microsoft to protect the innocent.

Google's lack of transparency about which exact URLs are hosting bad material does play a role there.

NitpickLawyer•3h ago
The PSA is good, the article is meh. There is too much misdirected anger towards google here, IMO. I agree it sucks to be the false positive, but it'd suck even more to unknowingly be part of phishing campaigns.

On top of that, it is also recommended to serve user content from another domain for security reasons. It's much easier to avoid entire classes of exploits this way. For the site admins: treat it as a learning experience instead of lashing out at goog. In the long run you'll be better off, having learned a good lesson.

bluesmoon•2h ago
Exactly! For a web dev in 2025 to still not know security best practices that have been around for 20+ years is a failure on the part of the dev.
hk__2•2h ago
I’m sure I don’t know ALL the "security best practices that have been around for 20+ years" and this is perfectly fine as long as I’m able to react quickly. See also https://xkcd.com/1053/.
blenderob•2h ago
It's fine if you personally didn't know that. But if I'm paying for a service, I expect the provider to understand basic security best practices that have been industry standard for 20+ years. And if they don't, they should be hiring people who do.

XKCD 1053 is not a valid excuse for what amounts to negligence in a production service.

bluesmoon•3h ago
Github discovered the same thing a long, long time ago, which is why you now have the github.io domain.
Macha•2h ago
In Github's case, I think it was also because a lot of security boundaries were using the TLD, which let x.github.com potentially grab cookies of y.github.com or, worse, github.com itself.

https://news.ycombinator.com/item?id=5500612
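A minimal sketch of the cookie mechanics behind that move (Node + TypeScript, hypothetical hostnames): any page served from one subdomain can set a cookie scoped to the whole registrable domain, and browsers will then send it to every sibling subdomain, unless that domain is a public suffix.

```ts
// Why sibling subdomains on a shared domain are dangerous: a malicious page
// on evil.example-host.com can plant a cookie for ALL of *.example-host.com.
// Browsers refuse Domain=<public suffix> (e.g. Domain=github.io), which is
// exactly what moving user content to a PSL-listed domain buys you.
import { createServer } from "node:http";

createServer((_req, res) => {
  // Domain=example-host.com makes this cookie visible to every subdomain.
  res.setHeader(
    "Set-Cookie",
    "sid=attacker-chosen; Domain=example-host.com; Path=/",
  );
  res.end("cookie planted");
}).listen(8080);
```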

kbolino•2h ago
Putting user content on another domain and adding that domain to the public suffix list is good advice.

So good, in fact, that it should have been known to an infrastructure provider in the first place. There's a lot of vitriol here that is ultimately misplaced away from the author's own ignorance.

ericselin•2h ago
This is of course true! It just takes an incident like this to get one's head out of one's ass and actually do it. :)
hiatus•2h ago
One can only imagine the other beginner mistakes made by this operator.
shadowgovt•2h ago
Everyone learns somehow.
kbolino•2h ago
Well, you're responding to him, so questions or suggestions are probably better than speculation.

My comment about vitriol was more directed at the HN commenters than Eric himself. Really, I think a discussion about web infrastructure is more interesting than a hatefest on Google. Thankfully, the balance seems to have shifted since I posted my top-level comment.

hiatus•2h ago
> Well, you're responding to him, so questions or suggestions are probably better than speculation.

I suspect the author is unaware of their other blindspots. It's not 2001 anymore. Holding yourself out as a hosting provider comes with some baseline expectations.

lcnPylGDnU4H9OF•45m ago
> baseline expectations

Do you have more details? That sounds interesting.

kbolino•2h ago
The good news is, once known, a lesson like this is hard to forget.

The PSL is one of those load-bearing pieces of web infrastructure that is esoteric and thanklessly maintained. Maybe there ought to be a better way, both in the sense of a direct alternative (like DNS), and in the sense of a better security model.

chrismorgan•32m ago
There’s some value in the public suffix list being shared, with mild sanity checking before accepting entries: it maintains a distinction between site (which includes all subdomains) and origin (which doesn’t). Safe Browsing wants to block sites, but if you can designate your domain a public suffix without oversight, you can bypass that so that it will only manage to block your subdomains individually (until they adjust their heuristics to something much more complicated and less reliable than what we have now).
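To make the site-versus-origin point concrete, here is a sketch using the `psl` npm package (a JavaScript interface to the Public Suffix List, assumed available) to show which registrable domain, and hence which "site", a blocker sees. Hostnames reflect the list as of this thread, when statichost's domain was not yet included.

```ts
// The PSL determines the "site" boundary: everything below the registrable
// domain is treated as one site by Safe Browsing and by browser cookie rules.
import psl from "psl";

// No PSL entry: every customer subdomain collapses into one site.
console.log(psl.get("user1.statichost.eu")); // "statichost.eu"
console.log(psl.get("user2.statichost.eu")); // "statichost.eu" -- same site,
                                             // so a block on one hits both

// github.io is a PSL public suffix: each user page is its own site.
console.log(psl.get("alice.github.io"));     // "alice.github.io"
```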
neon_erosion•1h ago
Exactly, this has been documented knowledge for many years now, even decades. Github and other large providers of user-generated content have public-facing documentation on the risks and ways to mitigate them. Any hosting provider that chooses to ignore those practices is putting themselves, and their customers, at risk.
dawnerd•1h ago
To be fair I’ve been in the space for close to 20 years now, worked on some of the largest sites and this is the first I’m hearing of the public suffix list.
lcnPylGDnU4H9OF•43m ago
> There's a lot of vitriol here that is ultimately misplaced away from the author's own ignorance.

For what it's worth, this makes it sound like you think the vitriol should be aimed at the author's ignorance rather than the circumstances which led to it, presuming you meant the latter.

kbolino•30m ago
I do think the author's ignorance was a bigger problem--both in the sense that he should have known better and in the sense that the PSL needs to be more discoverable--than anything Google('s automated systems) did.

However, I'm now reflecting on what I said as "be careful what you wish for", because the comments on this HN post have done a complete 180 since I wrote it, to the point of turning into a pile-on in the opposite direction.

lcnPylGDnU4H9OF•1m ago
> also in the sense that the PSL needs to be more discoverable

Well, this is a problem that caused the author's ignorance but you present it as though it's the other way around. That's all I meant. Not disagreeing with "should have known better" either.

yafinder•27m ago
For something that you think is a de facto standard, the public suffix list seems kinda raw to me for now.

I checked it for two popular public suffixes that came to mind: 'livejournal.com' and 'substack.com'. Both weren't there.

Maybe I'm mistaken and it's not a bug and these suffixes shouldn't be included, but I can't think of a reason why.

jeroenhd•20m ago
I don't know about LiveJournal, but I don't believe you can host any interactive content on substack (without hacking substack at least). You can't sign up and host a phishing site, for instance.

User-uploaded content (which does pose a risk) is all hosted on substackcdn.com.

The PSL is more for "anyone can host anything in a subdomain of any domain on this list" rather than "this domain contains user-generated content". If you're allowing people to host raw HTML and JS then the PSL is the right place to go, but if you're just offering a user post/comment section feature, you're probably better off getting an early alert if someone has managed to breach your security and hacked your system into hosting phishing.

jeroenhd•24m ago
The PSL is something you find out about after it goes wrong.

It's a weird thing, to be honest: a Github repo, mentioned nowhere in any standard, that browsers use to treat some subdomains differently.

Information like this doesn't just manifest itself in your brain once you start hosting stuff, and if I hadn't known about its existence I wouldn't have thought to look for a project like this either. I certainly wouldn't have expected it to be both open for everyone and built into every modern internet-capable computer or anti-malware service.

sarathyweb•2h ago
Does anyone know if adding our domains to the Public Suffix List will prevent incidents like this?
SquareWheel•2h ago
It's generally good advice, but I don't see that Safe Browsing did anything wrong in this case. First, it sounds like they actually were briefly hosting phishing sites:

> All sites on statichost.eu get a SITE-NAME.statichost.eu domain, and during the weekend there was an influx of phishing sites.

Second, they should be using the public suffix list (https://publicsuffix.org/) to avoid having their entire domain tagged. How else is Google supposed to know that subdomains belong to different users? That's what the PSL is for.

From my reading, Safe Browsing did its job correctly in this case, and they restored the site quickly once the threat was removed.

ericselin•2h ago
I'm not saying that Google or Safe Browsing in particular did anything wrong per se. My point is primarily that Google has too much power over the internet. I know that in this case what actually happened is because of me not putting enough effort into fending off bad guys.

The new separate domain is pending inclusion in the PSL, yes.

Edit: the "effort" I'm talking about above refers to more real time moderation of content.

dormento•2h ago
Exactly.

> Second, they should be using the public suffix list (https://publicsuffix.org/) to avoid having their entire domain tagged.

NO, Google should be "mindful" (I know companies are not people but w/e) of the power it unfortunately has. Also, Cloudflare. All my homies hate Cloudflare.

shadowgovt•2h ago
It is mindful.

... by using the agreed-upon tool to track domains that treat themselves as TLDs for third-party content: the public suffix list. Microsoft Edge and Firefox also use the PSL and their mechanisms for protecting users would be similarly suspicious that attacks originating from statichost.eu were originating from the owners of that domain and not some third-party that happened to independently control foo.statichost.eu.

sokoloff•2h ago
> My point is primarily that Google has too much power over the internet.

That is probably true, but in this case I think most people would think that they used that power for good.

It was inconvenient for you and the legitimate parts of what was hosted on your domain, but it was blocking genuinely phishing content that was also hosted on your domain.

shadowgovt•2h ago
There are two aspects to the Internet: the technical and the social.

In the social, there is always someone with most of the power (distributed power is an unstable equilibrium), and it's incumbent upon us, the web developers, to know the current status quo.

Back in the day, if you weren't testing on IE6 you weren't serving a critical mass of your potential users. Nowadays, the nameplates have changed but the same principles hold.

stickfigure•2h ago
"Google does good thing, therefore Google has too much power over the internet" is not a convincing point to make.

This safety feature saves a nontrivial number of people from life-changing mistakes. Yes we publishers have to take extra care. Hard to see a negative here.

ericselin•2h ago
I respectfully disagree with your premise. In this specific case, yes, "Google does good thing" in a sense. That is not why I'm saying Google has too much power. "Too much" is relative, and whether they do good or bad is debatable, of course, but it's hard to argue that they don't have a gigantic influence on the whole internet, no? :)

Helping people avoid potentially devastating mistakes is of course a good thing.

thetimman8•1h ago
You're not wrong. You just picked a poor example which illustrates the opposite of the point you're making.
ericselin•1h ago
Fair enough! :)
neon_erosion•1h ago
What point are you trying to make here? You hosted phishing sites on your primary domain, which was then flagged as unsafe. You chose not to use the tools that would have marked those sites as belonging to individual users, and the system worked as designed.
ericselin•1h ago
Please note that this tool (PSL) is not available until you have a significant user base. Which probably means a significant amount of spam as well.
zamadatix•43m ago
Where'd you see/hear that? It hasn't been my experience at least - but maybe I've just been lucky or undercounting the sites.

There are required steps to follow but none are "have x users" or "see a lot of spam". It's mostly "follow proper DNS steps and guidelines in the given format" with a little "show you're doing this for the intended reason rather than to circumvent something the PSL is not meant for/for something the public can't get to anyways" (e.g. tricking rate limits, internal only or single user personal sites) added on top.

ericselin•12m ago
https://github.com/publicsuffix/list/wiki/Guidelines#validat...

"Projects that are smaller in scale or are temporary or seasonal in nature will likely be declined. Examples of this might be private-use, sandbox, test, lab, beta, or other exploratory nature changes or requests. It should be expected that despite whatever site or service referred a requestor to seek addition of their domain(s) to the list, projects not serving more then thousands of users are quite likely to be declined."

Maybe the rules have changed, or maybe you were lucky? :)

neon_erosion•1h ago
How does flagging a domain that was actively hosting phishing sites demonstrate that Google has too much power? They do, but this is a terrible example, undermining any point you are trying to make.
jeroenhd•30m ago
The thing about Google is that they regularly get this stuff wrong, and there is no recourse when they do.

I think most people working in tech know the extent to which Google can screw over a business when they make a mistake, but the gravity of the situation becomes much clearer when it actually happens to you.

This time it's a phishing website, but what if the same happens five years down the line because of an unflattering page about a megalomaniac US politician?

kyledrake•2h ago
Google has some sort of internal flag for determining that the origin is different on some platforms. We don't get a complete takedown of Neocities every time there's a spam site reported. It is likely that they were not on that list but perhaps have been manually added to whatever that internal list is at this point.

The public suffix list (https://publicsuffix.org/) is good and if I were to start from scratch I would do it that way (with a different root domain) but it's not absolutely required, the search engines can and do make exceptions that don't just exclusively use the PSL, but you'll hit a few bumps in the road before that gets established.

Ultimately Google needs to have a search engine that isn't full of crap, so moving user content to a root domain on the PSL that is infested with phishing attacks isn't going to save you. You need to do prolific and active moderation to root out this activity or you'll just be right back on their shit list. Google could certainly improve this process by providing better tooling (a safe browsing report/response API would be extremely helpful) but ultimately the burden is on platforms to weed out malicious activity and prevent it from happening, and it's a 24/7 job.

BTW the PSL is a great example of the XKCD "one critical person doing thankless unpaid work" comic, unless that has changed in recent years. I am a strong advocate of having the PSL management become an annual fee-driven structure (https://groups.google.com/g/publicsuffix-discuss/c/xJZHBlyqq...); the maintainer deserves compensation for his work, and requiring the fee would allow the many abandoned domains on the list to drop off of it.
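On the moderation point, there is at least a read-only check available today. Below is a sketch of polling Google's Safe Browsing Lookup API (v4, the threatMatches:find endpoint) against your own customers' URLs, so a flagged site surfaces to you before the whole domain gets tagged. The client id, env var, and example URL are placeholders.

```ts
// Sketch: proactively check hosted URLs against the Safe Browsing v4
// Lookup API. Requires an API key (here read from SAFE_BROWSING_KEY).
const KEY = process.env.SAFE_BROWSING_KEY;

async function checkUrls(urls: string[]) {
  const res = await fetch(
    `https://safebrowsing.googleapis.com/v4/threatMatches:find?key=${KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        client: { clientId: "example-host-moderation", clientVersion: "1.0" },
        threatInfo: {
          threatTypes: ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
          platformTypes: ["ANY_PLATFORM"],
          threatEntryTypes: ["URL"],
          threatEntries: urls.map((url) => ({ url })),
        },
      }),
    },
  );
  const body = await res.json();
  // An empty response body means no matches; otherwise body.matches lists hits.
  return body.matches ?? [];
}

checkUrls(["https://portal-abc.example-host.com/"]).then(console.log);
```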

IX-103•2h ago
If you're not using separate domains then I hope you don't have any kind of sensitive information stored in cookies. You can't rely on the path restrictions for cookies because they're easily bypassed.
kyledrake•2h ago
You can set cookies that strictly stay on the root domain and don't cross to subdomain origins, and vice versa (https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/Coo...). We've been doing this for 12 years without issue.

Strict cookies crossing root to subdomains would be a major security bug in browsers. It's always been a (valid) theoretical concern but it's never happened on a large scale to the point I've had to address it. There is likely regression testing on all the major browsers that will catch a situation where this happens.
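For reference, a sketch of that scoping (Node + TypeScript): omitting the Domain attribute makes a cookie host-only, so browsers return it only to the exact host that set it, and the __Host- prefix makes browsers enforce that (Secure, Path=/, no Domain attribute).

```ts
// Sketch: a control-panel session cookie that never leaks to customer
// subdomains. Host-only plus the __Host- prefix; in practice this must be
// served over HTTPS for Secure to be honored.
import { createServer } from "node:http";

createServer((_req, res) => {
  res.setHeader(
    "Set-Cookie",
    "__Host-session=abc123; Secure; HttpOnly; Path=/; SameSite=Lax",
  );
  res.end("logged in");
}).listen(8443);
```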

veeti•2h ago
It can happen to anyone and cause a reputational risk. Once upon a time $workplace had a Zoho Form that would be blacklisted by Google Safe Browsing or Microsoft Edge for arbitrary periods of time, presumably because someone used Zoho to make a phishing site, leading to some very confused calls.
freehorse•2h ago
Sounds like a very convenient mistake to make against your competitors? It does not sound believable that they would not know what Zoho was, or that they would not know it makes no sense to flag the whole Zoho domain.
haktan•2h ago
If user1.statichost.page gets blacklisted now will it affect user2.statichost.page as well?
kijin•2h ago
Yes, unless they submit statichost.page to the public suffix list.
kijin•2h ago
It's also good from a security perspective.

Anyone who can upload HTML pages to subdomain.domain.com can read and write cookies for *.domain.com, unless you declare yourself a public suffix and enough time has passed for all the major browsers to have updated themselves.

I've seen web hosts in the wild who could have their control panel sessions trivially stolen by any customer site. Reported the problem to two different companies. One responded fairly quickly, but the other one took several years to take any action. They eventually moved customers to a separate domain, so the control panel is now safe. But customers can still execute session fixation attacks against one another.
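For illustration, a browser-side sketch of that session-fixation vector (hypothetical hostnames): any customer page on a sibling subdomain can plant a domain-wide cookie that the control panel on another subdomain will later receive.

```ts
// Runs on a customer page at evil-customer.example-host.com. Unless
// example-host.com is a public suffix, this cookie is shared with every
// sibling subdomain, including panel.example-host.com, where logging in
// may bind the victim's session to an attacker-known id.
document.cookie =
  "panel_session=attacker-known-id; Domain=.example-host.com; Path=/; Secure";

// The same page can also read any non-HttpOnly domain-wide cookie:
console.log(document.cookie);
```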

ericselin•2h ago
(Author here) This is all true. The main assumption on my part is that anything remotely important or even sensitive should be and is hosted on a domain that is _not_ companysubdomain.domain.com but instead www.company.com.
freehorse•2h ago
I don't see how a separate domain would solve the main issue here. If something on that separate domain was flagged, it would still affect all user content on that domain. If your business is about serving such user content, the main service of your business would be down, even though your main domain would still be up.
ericselin•2h ago
You are right, it would still affect all users. Until the pending PSL inclusion is complete, that is. But it now separates my own resources, such as the website and dashboard of statichost.eu, from that.
toast0•2h ago
A separate domain may not prevent users' content from being blocked, but it may prevent blocking of the administrative interfaces, which would help affected customers get their content, and the service could more easily put up a banner advising users of the situation, etc.
thehyperflux•2h ago
Google services simply behaved the way I would expect them to here. Who knows... they may even have saved some users from coming to harm.
ericselin•2h ago
That is a great point. When I see these sites I'm always seeing a dozen red flags, and maybe the biggest one is that it's showing a "NatWest" banking site or something and is hosted on "portal-abc.statichost.eu". But the whole point is of course saving users from coming to harm, and if it did - great!
johnwheeler•2h ago
Seems like a reasonable trade-off; I mean, six hours is not the worst thing in the world. What if you were hosting something mission-critical? Were you?
oefrha•2h ago
Honestly, this is extremely basic stuff in hosting, not only due to safe browsing, but also (and more importantly) cookie safety, etc. If a hosting provider didn't know (already bad enough) and turns to whining after being hit, then

> Static site hosting you can trust

is more like amateur hour static site hosting you can’t trust. Sorry.

ericselin•1h ago
The thing is, you cannot just add any domain to the PSL. You need a significant number of users before they will include your domain. Until recently, there was really no point in even submitting, since the domain would have been rejected as too small. An increase in user base, an increase in malicious content, and the ability to add your domain to the PSL all happen sort of simultaneously.

I'm also trusting my users to not expose their cookies for the whole *.statichost.eu domain. And all "production" sites use a custom domain, which avoids all of this anyway.

neon_erosion•1h ago
There are well-documented solutions to this that don't rely on the PSL. Choosing to ignore all of that advice while hosting user content is a very irresponsible choice, at best.
shadowgovt•2h ago
Not sure who changed the HN headline, but I appreciate the change. Especially since the concept in the headline is buried at the bottom of the post.

Post author is throwing a lot of sand at Google for a process that has (a) been around for, what, over a decade now and (b) works. The fact of the matter is this hosting provider was too open, several users of the provider used it to put up content intended to attack users, and as far as Google (or anyone else on the web) is concerned, the TLD is where the buck stops for that kind of behavior. This is one of the reasons why you host user-generated content off your TLD, and several providers have gotten the memo; it is unfortunate statichost.eu had not yet.

I'm sorry this domain admin had to learn an industry lesson the hard way, but at least they won't forget it.

gwbas1c•2h ago
> To be fair, many or even most sites on the Google Safe Browsing blacklist are probably unworthy. But I’m pretty sure this was not the first false positive.

The bigger issue is that the internet needs governance. And, in the absence of regulation, someone has stepped in and done it in a way that the author didn't like.

Perhaps we could start by requiring that Google provide ways to contact a living, breathing human. (Not an AI bot that they claim is equivalent.)

dylan604•2h ago
why do you assume that the living, breathing human hired by theGoogs will be competent at handling all of the crazy that will be flung at them by the living, breathing human on the other end of the line? One single person cannot handle that. Naturally, you need a team of living, breathing humans. You might even have them in triage-level groups like level 1 support, level 2 support and so on, where each level is a more trained/experienced living, breathing human. Eventually, you'll have an entire department of people of varying degrees of skill. Oh, wait, I'm sorry, I thought it was the year 2000.

Hopefully, this helps you understand why your living, breathing human is such a farcical idea for theGoogs to consider.

gwbas1c•51m ago
Well, Google did appoint itself the "internet police," and the general job of the police is to deal with screwballs.

So you can't take one part of the responsibility and abdicate the other part!

dynm•2h ago
This is a bit of a tangent, but the whole concept of "domain reputation" can be infuriating. For example, my blog has been marked as suspicious by spamhaus.org: https://check.spamhaus.org/results?query=dynomight.net

As a result, some ISPs apparently block the domain. Why is it listed? I have no idea. There are no ads, there is no user content, and I've never sent any email from the domain. I've tried contacting spamhaus, but they instantly closed the ticket with a nonsensical response to "contact my IT department" and then blocked further communication. (Oddly enough, my personal blog does not have an IT department.)

Just like it's slowly become quasi-impossible for an individual to host their own email, I fear the same may happen with independent websites.
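For anyone wanting to check their own domain the way mail servers do, here is a sketch of a Spamhaus DBL lookup: a DNS A query for <domain>.dbl.spamhaus.org, where NXDOMAIN means "not listed" and 127.0.1.x answers encode the listing reason per Spamhaus's documentation. Note that Spamhaus often refuses queries from large public resolvers, so results can differ from what an MTA sees.

```ts
// Sketch: query the Spamhaus Domain Block List via DNS.
import { promises as dns } from "node:dns";

async function dblCheck(domain: string): Promise<string[] | "not listed"> {
  try {
    // Listed domains resolve to 127.0.1.x codes describing the reason.
    return await dns.resolve4(`${domain}.dbl.spamhaus.org`);
  } catch (err: any) {
    if (err.code === "ENOTFOUND" || err.code === "ENODATA") return "not listed";
    throw err; // SERVFAIL/REFUSED often means the resolver itself is blocked
  }
}

dblCheck("dynomight.net").then(console.log);
```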

MoreQARespect•2h ago
From reading that, my guess would be that the IP you got from your hosting provider had some spammy history before you started hosting your blog on it.

Either that or your DNS provider hosts a lot of spam.

dynm•1h ago
Hmmm, I use https://njal.la/ for DNS. Could spamhaus really just auto-mark every njalla user as suspicious?
Retr0id•2h ago
I don't like nor trust google, but "Use your own judgement and hard-earned Internet street smarts" doesn't work either, because the median internet user does not have anything resembling internet street smarts.
ArnoVW•1h ago
As a CISO I am happy with many of the protections that Google creates. They are in a unique position, and probably the only ones to be able to do it.

However, I think the issue is that with great power comes great responsibility.

They are better than most organisations, and working with many constraints that we cannot always imagine.

But several times a week we get a false "this mail is phishing" incident, where a mail from a customer or prospect is put in "Spam", with a red security banner saying it contains "dangerous links". Generally it is caused by domain reputation issues that block all mail that uses an e-mail scanning product. These products wrap URLs so they can scan when the mail is read, and thus when they fail to detect a virus, they become de facto purveyors of viruses, and their entire domain is tagged as dangerous.

I have raised this to Google in May (!) and have been exchanging mail on a nearly daily basis. Pointing out a new security product that has been blacklisted, explaining the situation to a new agent, etc.

Not only does this mean that they are training our staff that security warnings are generally false, but it means we are missing important mail from prospects and customers. Our customers are generally huge corporations, missing a mail for us is not like missing one mail for a B2C outfit.

So far the issue is not resolved (we are in Oct now!) and recently they have stopped responding. I appreciate our organisation is not the US Government, but still, we pay upwards of 20K$ / year for "Google Workspace Enterprise" accounts. I guess I was expecting something more.

If someone within Google reads this: you need to fix this.

seethishat•1h ago
I'm old. I've been doing security for a very long time. Started back in the 1990s. Here's what I have learned over the last 30 years...

Half (or more) of security alerts/warnings are false positives. Whether it's the vulnerability scanner complaining about some non-existent issues (based on the version of Apache alone... which was back-ported by the package maintainer), or an AI report generated by interns at Deloitte fresh out of college, or someone reporting www.example.com to Google Safe Browsing as malicious, etc. At least half of the things they report on are wrong.

You sort of have to have a clue (technically) and know what you are doing to weed through all the bullshit. Tools that block access based on these things do more harm than good.

sire-vc•1h ago
I am a solo developer. I recently created a new web app for a client. Google has marked it as phishing, so they can't use it. Obviously I can't do anything about it except report the error and wait. I'm worried that if I move it to a new domain, that one will get marked as well. Not sure what to do TBH.
procaryote•1h ago
Is it phishing?
sire-vc•1h ago
No, however it does include a Microsoft Entra/Azure AD/Microsoft 365 login for that client's tenant. It is also a newly registered domain, so I can understand why it looks suspicious. The most frustrating thing is that this is all a machine, i.e. no one I can speak to, nothing I can do to fix it. My fate has been decided by an algorithm.
seanw265•1h ago
I’ve got a random subdomain hosting a little internal tool. About twice a year, Google Safe Browsing decides it’s phishing and flags it. Sometimes they flag the whole domain for good measure.

Search Console always points to my internal login page, which isn’t public and definitely isn’t phishing.

They clear it quickly when I appeal, and since it’s just for me, I’ve mostly stopped worrying about it.

watermelon0•1h ago
I encountered something similar. I have `*.domain.tld` pointed to an internal IP address, and over the past few years it happened a few times where some subdomain would be flagged as dangerous by Google Safe Browsing.
ericselin•1h ago
Since there's a lot of discussion about the Public Suffix list, let me point out that it's not just a webform where you can add any domain. There's a whole approval process where one very important criterion is that the domain to be added has a large enough user base. When you have a large enough user base, you generally have scammers as well. That's what happened here.

It basically goes: growing user base -> growing amount of malicious content -> ability to submit domain to PSL. In that order, more or less.

In terms of security, for me, there's no issue with being on the same domain as my users. My cookies are scoped to my own subdomain, and HTTPS only. For me, being blocked was the only problem, one that I can honestly admit was way bigger than I thought.

Hence, the PSA. :)