frontpage.

Show HN: Env-shelf – Open-source desktop app to manage .env files

https://env-shelf.vercel.app/
1•ivanglpz•2m ago•0 comments

Show HN: Almostnode – Run Node.js, Next.js, and Express in the Browser

https://almostnode.dev/
1•PetrBrzyBrzek•2m ago•0 comments

Dell support (and hardware) is so bad, I almost sued them

https://blog.joshattic.us/posts/2026-02-07-dell-support-lawsuit
1•radeeyate•3m ago•0 comments

Project Pterodactyl: Incremental Architecture

https://www.jonmsterling.com/01K7/
1•matt_d•3m ago•0 comments

Styling: Search-Text and Other Highlight-Y Pseudo-Elements

https://css-tricks.com/how-to-style-the-new-search-text-and-other-highlight-pseudo-elements/
1•blenderob•5m ago•0 comments

Crypto firm accidentally sends $40B in Bitcoin to users

https://finance.yahoo.com/news/crypto-firm-accidentally-sends-40-055054321.html
1•CommonGuy•6m ago•0 comments

Magnetic fields can change carbon diffusion in steel

https://www.sciencedaily.com/releases/2026/01/260125083427.htm
1•fanf2•6m ago•0 comments

Fantasy football that celebrates great games

https://www.silvestar.codes/articles/ultigamemate/
1•blenderob•6m ago•0 comments

Show HN: Animalese

https://animalese.barcoloudly.com/
1•noreplica•7m ago•0 comments

StrongDM's AI team build serious software without even looking at the code

https://simonwillison.net/2026/Feb/7/software-factory/
1•simonw•7m ago•0 comments

John Haugeland on the failure of micro-worlds

https://blog.plover.com/tech/gpt/micro-worlds.html
1•blenderob•8m ago•0 comments

Show HN: Velocity - Free/Cheaper Linear Clone but with MCP for agents

https://velocity.quest
2•kevinelliott•8m ago•2 comments

Corning Invented a New Fiber-Optic Cable for AI and Landed a $6B Meta Deal [video]

https://www.youtube.com/watch?v=Y3KLbc5DlRs
1•ksec•10m ago•0 comments

Show HN: XAPIs.dev – Twitter API Alternative at 90% Lower Cost

https://xapis.dev
2•nmfccodes•10m ago•0 comments

Near-Instantly Aborting the Worst Pain Imaginable with Psychedelics

https://psychotechnology.substack.com/p/near-instantly-aborting-the-worst
2•eatitraw•16m ago•0 comments

Show HN: Nginx-defender – realtime abuse blocking for Nginx

https://github.com/Anipaleja/nginx-defender
2•anipaleja•17m ago•0 comments

The Super Sharp Blade

https://netzhansa.com/the-super-sharp-blade/
1•robin_reala•18m ago•0 comments

Smart Homes Are Terrible

https://www.theatlantic.com/ideas/2026/02/smart-homes-technology/685867/
1•tusslewake•20m ago•0 comments

What I haven't figured out

https://macwright.com/2026/01/29/what-i-havent-figured-out
1•stevekrouse•20m ago•0 comments

KPMG pressed its auditor to pass on AI cost savings

https://www.irishtimes.com/business/2026/02/06/kpmg-pressed-its-auditor-to-pass-on-ai-cost-savings/
1•cainxinth•20m ago•0 comments

Open-source Claude skill that optimizes Hinge profiles. Pretty well.

https://twitter.com/b1rdmania/status/2020155122181869666
3•birdmania•20m ago•1 comment

First Proof

https://arxiv.org/abs/2602.05192
5•samasblack•23m ago•2 comments

I squeezed a BERT sentiment analyzer into 1GB RAM on a $5 VPS

https://mohammedeabdelaziz.github.io/articles/trendscope-market-scanner
1•mohammede•24m ago•0 comments

Kagi Translate

https://translate.kagi.com
2•microflash•25m ago•0 comments

Building Interactive C/C++ workflows in Jupyter through Clang-REPL [video]

https://fosdem.org/2026/schedule/event/QX3RPH-building_interactive_cc_workflows_in_jupyter_throug...
1•stabbles•26m ago•0 comments

Tactical tornado is the new default

https://olano.dev/blog/tactical-tornado/
2•facundo_olano•27m ago•0 comments

Full-Circle Test-Driven Firmware Development with OpenClaw

https://blog.adafruit.com/2026/02/07/full-circle-test-driven-firmware-development-with-openclaw/
1•ptorrone•28m ago•0 comments

Automating Myself Out of My Job – Part 2

https://blog.dsa.club/automation-series/automating-myself-out-of-my-job-part-2/
1•funnyfoobar•28m ago•1 comment

Dependency Resolution Methods

https://nesbitt.io/2026/02/06/dependency-resolution-methods.html
1•zdw•29m ago•0 comments

Crypto firm apologises for sending Bitcoin users $40B by mistake

https://www.msn.com/en-ie/money/other/crypto-firm-apologises-for-sending-bitcoin-users-40-billion...
1•Someone•29m ago•0 comments

Time Series Forecasting with Graph Transformers

https://kumo.ai/research/time-series-forecasting/
131•turntable_pride•7mo ago

Comments

ziofill•7mo ago
I can't stand websites that override scrolling
pealco•7mo ago
Most of my time interacting with this site was spent in developer tools, trying to figure out where the scrolling behavior was coming from. (Couldn't figure it out.) I can't understand why people are still doing this in 2025.
almosthere•7mo ago
Most likely the developer is using a Windows computer.
bestest•7mo ago
Enter this in the console:

    document.body.onwheel = (e) => e.stopPropagation();

rossant•7mo ago
I came here to say this. Don't mess with my scrollbar. Ever.
monkeydust•7mo ago
wow, didn't realize that until I saw this comment, now I can't unrealize it and I'm angry
cwmoore•7mo ago
“Here, sign this.”

    accept all cookies
cye131•7mo ago
I'm not a fan of this blog post because it presents graph transformers, a method that isn't accepted as good or standard time series methodology, as though it were the norm. Transformers perform poorly on time series, and graph deep learning performs poorly on tasks that lack real behavioral/physical edges (physical space, molecules, social graphs, etc.), so it's unclear why combining them would produce anything useful for "business applications" of time series like sales forecasting.

For those interested in transformers with time series, I recommend reading this paper: https://arxiv.org/pdf/2205.13504. There is also plenty of other research showing that transformer-based time series models generally underperform much simpler alternatives like boosted trees.

After looking further it seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses, which seems like a conflict of interest to me.
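To make the "much simpler alternatives" point concrete: the linked arXiv paper (2205.13504) argues that a direct linear map from the lookback window to the forecast is a surprisingly strong baseline (its DLinear/NLinear models). Here is a minimal sketch of that idea in plain numpy; all function names are my own, and this is an illustration of the baseline family, not the paper's exact models:

```python
import numpy as np

def fit_linear_forecaster(series, lookback):
    """Fit a direct linear map from the last `lookback` points to the next
    point via ordinary least squares, in the spirit of the paper's
    DLinear/NLinear baselines."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = np.array(series[lookback:])
    # Minimum-norm least-squares weights w such that X @ w approximates y.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def predict_next(series, w):
    """One-step-ahead forecast from the most recent window."""
    lookback = len(w)
    return float(np.dot(series[-lookback:], w))

# Toy check: a noiseless linear trend is extrapolated correctly.
series = [float(t) for t in range(20)]
w = fit_linear_forecaster(series, lookback=4)
print(round(predict_next(series, w), 3))  # next trend value, 20.0
```

A model this simple trains in microseconds and is a reasonable floor to beat before reaching for a transformer.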

tough•7mo ago
thoughts on TimesFM?

> After looking further it seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses, which seems like a conflict of interest to me.

is this a general rule of thumb, that one shouldn't use the same organization to both publish research and pursue commercialization?

orochimaaru•7mo ago
Not really. There is no rule against it. You can have a team that researches, publishes, patents, and shares the patents with commercial scalers. It’s easier with ML than with manufacturing.
shirokiba•7mo ago
Would you be so kind as to recommend some resources on modern, promising methods for time series forecasting? I'm starting a position doing this work soon and would like to learn more about it if you'd be willing to share
srean•7mo ago
Read the M series of forecasting competitions and the papers that came out of those exercises. Read Keogh. Also keep a healthy respect for, and understanding of, the traditional methods rather than getting distracted by whatever happens to be shiny now.
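For a taste of the traditional methods referenced above: simple exponential smoothing is one of the workhorse baselines throughout the M-competition literature. A minimal sketch (pure Python, illustrative only; real uses would fit alpha to the data rather than hard-code it):

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing:
        level_t = alpha * y_t + (1 - alpha) * level_{t-1}
    The one-step-ahead forecast is simply the last smoothed level."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

print(ses_forecast([10, 10, 10, 10]))   # constant series: forecast stays at 10
print(ses_forecast([0, 10], alpha=0.5)) # halfway between old level and new point
```

Despite having a single parameter, methods in this family (and their trend/seasonal extensions) have repeatedly proven hard to beat in the M competitions.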
lamename•7mo ago
Wow a sane person among all the hype. Great to see you!
srean•7mo ago
Lol. Yeah, the hype train blinds.
ethan_smith•7mo ago
Recent work like Informer (AAAI '21) and Autoformer (NeurIPS '21) has shown competitive performance against statistical methods by addressing the quadratic complexity and long-range dependency issues that plagued earlier transformer architectures on time series tasks.
rusty1s•7mo ago
Hey, one of the authors here—happy to clarify a few things.

> Transformers perform poorly on time series.

That’s not quite the point of our work. The model isn’t about using Transformers for time series per se. Rather, the focus is on how to enrich forecasting models by combining historical sequence data with external information, which is often naturally structured as a graph. This approach enables the model to flexibly incorporate a wide range of useful signals, such as:

* Weather forecasts for a region

* Sales from similar products or related categories

* Data from nearby locations or stations

* Finer-grained recent interactions/activities

* Price changes and promotional campaigns

* Competitor data (e.g., pricing, availability)

* Aggregated regional or market-level statistics

The architecture is modular: we don't default to a Transformer for the past sequence component (and in fact use a simpler architecture). The Graph Transformer/Graph Neural Network then extends the past sequence component by aggregating from additional sources.
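The shape of that modular design can be sketched in a few lines of numpy. Everything here is an illustrative assumption on my part (the function names, the linear-plus-tanh encoder, and the mean-pooling aggregation, which stands in for a graph transformer's attention), not the actual Kumo architecture: each series gets an encoding from its own history, and a graph step mixes in the encodings of related series before the forecast head.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_history(window, W_enc):
    # Per-series "past sequence" encoder: here just a linear map + tanh.
    return np.tanh(W_enc @ window)

def graph_step(H, adj):
    # Mean-aggregate neighbor encodings (a stand-in for graph attention),
    # then concatenate with each node's own encoding.
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    neighbor_mean = (adj @ H) / deg
    return np.concatenate([H, neighbor_mean], axis=1)

def forecast(windows, adj, W_enc, w_out):
    H = np.stack([encode_history(w, W_enc) for w in windows])  # (n_nodes, d)
    Z = graph_step(H, adj)                                     # (n_nodes, 2d)
    return Z @ w_out                                           # one forecast per node

n_nodes, lookback, d = 3, 8, 4
windows = rng.normal(size=(n_nodes, lookback))   # each row: one series' history
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # small line graph
W_enc = rng.normal(size=(d, lookback))
w_out = rng.normal(size=2 * d)

print(forecast(windows, adj, W_enc, w_out).shape)  # (3,): one prediction per node
```

The point of the structure, as described above, is that the graph step lets signals like weather, related-product sales, or nearby stations flow into each node's forecast without the sequence encoder itself needing to be a transformer.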

> It seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses which seems like a conflict of interest to me.

That’s a bold claim. All of our academic work is conducted in collaboration with university partners, is peer-reviewed, and has been accepted at top-tier conferences. Sharing blog posts that explain the design decisions behind our models isn’t a conflict of interest—it's part of making our internals more transparent.

fumeux_fume•7mo ago
Lol, a bold claim. It's a rational assumption that any business publishing "academic work" is selling you the upside while omitting or downplaying the downside.
ayongpm•7mo ago
https://dontfuckwithscroll.com/
rusty1s•7mo ago
Forwarded :)