
Bringing Polars to .NET

https://github.com/ErrorLSC/Polars.NET
1•CurtHagenlocher•1m ago•0 comments

Adventures in Guix Packaging

https://nemin.hu/guix-packaging.html
1•todsacerdoti•2m ago•0 comments

Show HN: We had 20 Claude terminals open, so we built Orcha

1•buildingwdavid•2m ago•0 comments

Your Best Thinking Is Wasted on the Wrong Decisions

https://www.iankduncan.com/engineering/2026-02-07-your-best-thinking-is-wasted-on-the-wrong-decis...
1•iand675•2m ago•0 comments

Warcraftcn/UI – UI component library inspired by classic Warcraft III aesthetics

https://www.warcraftcn.com/
1•vyrotek•3m ago•0 comments

Trump Vodka Becomes Available for Pre-Orders

https://www.forbes.com/sites/kirkogunrinde/2025/12/01/trump-vodka-becomes-available-for-pre-order...
1•stopbulying•4m ago•0 comments

Velocity of Money

https://en.wikipedia.org/wiki/Velocity_of_money
1•gurjeet•7m ago•0 comments

Stop building automations. Start running your business

https://www.fluxtopus.com/automate-your-business
1•valboa•11m ago•1 comment

You can't QA your way to the frontier

https://www.scorecard.io/blog/you-cant-qa-your-way-to-the-frontier
1•gk1•12m ago•0 comments

Show HN: PalettePoint – AI color palette generator from text or images

https://palettepoint.com
1•latentio•13m ago•0 comments

Robust and Interactable World Models in Computer Vision [video]

https://www.youtube.com/watch?v=9B4kkaGOozA
2•Anon84•17m ago•0 comments

Nestlé couldn't crack Japan's coffee market. Then they hired a child psychologist

https://twitter.com/BigBrainMkting/status/2019792335509541220
1•rmason•18m ago•0 comments

Notes for February 2-7

https://taoofmac.com/space/notes/2026/02/07/2000
2•rcarmo•20m ago•0 comments

Study confirms experience beats youthful enthusiasm

https://www.theregister.com/2026/02/07/boomers_vs_zoomers_workplace/
2•Willingham•27m ago•0 comments

The Big Hunger by Walter M. Miller, Jr. (1952)

https://lauriepenny.substack.com/p/the-big-hunger
2•shervinafshar•28m ago•0 comments

The Genus Amanita

https://www.mushroomexpert.com/amanita.html
1•rolph•33m ago•0 comments

We have broken SHA-1 in practice

https://shattered.io/
9•mooreds•33m ago•2 comments

Ask HN: Was my first management job bad, or is this what management is like?

1•Buttons840•35m ago•0 comments

Ask HN: How to Reduce Time Spent Crimping?

2•pinkmuffinere•36m ago•0 comments

KV Cache Transform Coding for Compact Storage in LLM Inference

https://arxiv.org/abs/2511.01815
1•walterbell•40m ago•0 comments

A quantitative, multimodal wearable bioelectronic device for stress assessment

https://www.nature.com/articles/s41467-025-67747-9
1•PaulHoule•42m ago•0 comments

Why Big Tech Is Throwing Cash into India in Quest for AI Supremacy

https://www.wsj.com/world/india/why-big-tech-is-throwing-cash-into-india-in-quest-for-ai-supremac...
2•saikatsg•42m ago•0 comments

How to shoot yourself in the foot – 2026 edition

https://github.com/aweussom/HowToShootYourselfInTheFoot
2•aweussom•43m ago•0 comments

Eight More Months of Agents

https://crawshaw.io/blog/eight-more-months-of-agents
4•archb•45m ago•0 comments

From Human Thought to Machine Coordination

https://www.psychologytoday.com/us/blog/the-digital-self/202602/from-human-thought-to-machine-coo...
1•walterbell•45m ago•0 comments

The new X API pricing must be a joke

https://developer.x.com/
1•danver0•46m ago•0 comments

Show HN: RMA Dashboard – fast SAST results for monorepos (SARIF and triage)

https://rma-dashboard.bukhari-kibuka7.workers.dev/
1•bumahkib7•46m ago•0 comments

Show HN: Source code graphRAG for Java/Kotlin development based on jQAssistant

https://github.com/2015xli/jqassistant-graph-rag
1•artigent•51m ago•0 comments

Python Only Has One Real Competitor

https://mccue.dev/pages/2-6-26-python-competitor
4•dragandj•53m ago•0 comments

Tmux to Zellij (and Back)

https://www.mauriciopoppe.com/notes/tmux-to-zellij/
1•maurizzzio•54m ago•1 comment

Time Series Forecasting with Graph Transformers

https://kumo.ai/research/time-series-forecasting/
131•turntable_pride•7mo ago

Comments

ziofill•7mo ago
I can't stand websites that override scrolling
pealco•7mo ago
Most of my time interacting with this site was spent in developer tools, trying to figure out where the scrolling behavior was coming from. (Couldn't figure it out.) I can't understand why people are still doing this in 2025.
almosthere•7mo ago
Most likely the developer is using a Windows computer.
bestest•7mo ago
Enter this in the console:

// keep wheel events from bubbling past <body>, so the page's
// scroll-hijack handlers on document/window never fire
document.body.onwheel = (e) => e.stopPropagation();

rossant•7mo ago
I came here to say this. Don't mess with my scrollbar. Ever.
monkeydust•7mo ago
wow, didn't realize that until I saw this comment; now I can't unrealize it and I'm angry
cwmoore•7mo ago
“Here, sign this.”

    accept all cookies
cye131•7mo ago
I'm not a fan of this blog post: it presents graph transformers, which are not an accepted or standard time-series methodology, as though they were the norm. Transformers perform poorly on time series, and graph deep learning performs poorly on tasks that don't have real behavioral/physical edges (physical space, molecules, social graphs, etc.), so it's unclear why combining them would produce anything useful for "business applications" of time series like sales forecasting.

For those interested in transformers with time series, I recommend reading this paper: https://arxiv.org/pdf/2205.13504. There is also plenty of other research showing that transformers-based time series models generally underperform much simpler alternatives like boosted trees.
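
To make "much simpler alternatives" concrete, here's a minimal sketch of a lag-feature boosted-tree baseline (my own illustration, not from the paper; the stand-in series, window size, and model settings are all placeholders):

    # Baseline: gradient-boosted trees on lagged values of a univariate series.
    # Illustrative only; `y` is a synthetic stand-in and n_lags is arbitrary.
    import numpy as np
    from sklearn.ensemble import HistGradientBoostingRegressor

    def make_lag_features(y, n_lags=28):
        # Each row is a trailing window; the target is the next observation.
        X = np.stack([y[i - n_lags:i] for i in range(n_lags, len(y))])
        t = y[n_lags:]
        return X, t

    y = np.sin(np.linspace(0, 40, 1000)) + 0.1 * np.random.randn(1000)
    X, t = make_lag_features(y)
    split = int(0.8 * len(X))

    model = HistGradientBoostingRegressor(max_iter=300)
    model.fit(X[:split], t[:split])
    print("holdout MAE:", np.mean(np.abs(model.predict(X[split:]) - t[split:])))

A baseline along these lines, plus calendar and price features in practice, is the kind of thing transformer-based models are usually benchmarked against.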

After looking further it seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses, which seems like a conflict of interest to me.

tough•7mo ago
thoughts on TimesFM?

> After looking further it seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses, which seems like a conflict of interest to me.

Is there a general rule of thumb that the same organization shouldn't both publish research and pursue commercialization?

orochimaaru•7mo ago
Not really. There's no rule against it. You can have a team that researches, publishes, patents, and shares the patents with commercial scalers. It's easier with ML than with manufacturing.
shirokiba•7mo ago
Would you be so kind as to recommend some resources on modern, promising methods for time series forecasting? I'm starting a position doing this work soon and would like to learn more, if you'd be willing to share.
srean•7mo ago
Read up on the M series of forecasting competitions and the papers that came out of those exercises. Read Keogh. Also, keep a healthy respect for and understanding of the traditional methods rather than getting distracted by whatever happens to be shiny right now.
lamename•7mo ago
Wow a sane person among all the hype. Great to see you!
srean•7mo ago
Lol. Yeah, the hype train blinds.
ethan_smith•7mo ago
Recent work like Informer (AAAI'21) and Autoformer (NeurIPS'21) has shown competitive performance against statistical methods by addressing the quadratic complexity and long-range dependency issues that plagued earlier transformer architectures on time series tasks.
rusty1s•7mo ago
Hey, one of the authors here—happy to clarify a few things.

> Transformers perform poorly on time series.

That’s not quite the point of our work. The model isn’t about using Transformers for time series per se. Rather, the focus is on how to enrich forecasting models by combining historical sequence data with external information, which is often naturally structured as a graph. This approach enables the model to flexibly incorporate a wide range of useful signals, such as:

* Weather forecasts for a region

* Sales from similar products or related categories

* Data from nearby locations or stations

* More fine-grained recent interactions/activities

* Price changes and promotional campaigns

* Competitor data (e.g., pricing, availability)

* Aggregated regional or market-level statistics

The architecture is modular: we don't default to a Transformer for the past sequence component (and in fact use a simpler architecture). The Graph Transformer/Graph Neural Network then extends the past sequence component by aggregating from additional sources.
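
To make that shape concrete, here's a toy sketch (illustrative only, not our actual implementation; the class name, module choices, and dimensions are placeholders):

    # Toy sketch: a simple encoder for the target's own history, plus attention
    # over embeddings of related context series (weather, similar products, ...).
    # Illustrative only; not the production model.
    import torch
    import torch.nn as nn

    class GraphAugmentedForecaster(nn.Module):
        def __init__(self, hidden=64, horizon=7):
            super().__init__()
            # Past-sequence component: deliberately simple, no Transformer.
            self.seq_encoder = nn.GRU(input_size=1, hidden_size=hidden,
                                      batch_first=True)
            # Aggregation over neighbor/context embeddings.
            self.neighbor_attn = nn.MultiheadAttention(hidden, num_heads=4,
                                                       batch_first=True)
            self.head = nn.Linear(2 * hidden, horizon)

        def forward(self, past, neighbor_emb):
            # past: (batch, seq_len, 1); neighbor_emb: (batch, n_neighbors, hidden)
            _, h = self.seq_encoder(past)   # h: (1, batch, hidden)
            q = h.transpose(0, 1)           # (batch, 1, hidden)
            ctx, _ = self.neighbor_attn(q, neighbor_emb, neighbor_emb)
            return self.head(torch.cat([q, ctx], dim=-1).squeeze(1))

    model = GraphAugmentedForecaster()
    print(model(torch.randn(8, 30, 1), torch.randn(8, 5, 64)).shape)  # (8, 7)

The point is that the graph component is additive: you can swap out the sequence encoder, or drop the neighbor aggregation entirely, and still have a working forecaster.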

> It seems like this startup is both trying to publish academic research promoting these models as well as selling it to businesses which seems like a conflict of interest to me.

That’s a bold claim. All of our academic work is conducted in collaboration with university partners, is peer-reviewed, and has been accepted at top-tier conferences. Sharing blog posts that explain the design decisions behind our models isn’t a conflict of interest—it's part of making our internals more transparent.

fumeux_fume•7mo ago
Lol, a bold claim. It's a rational assumption that any business publishing "academic work" is selling you the upside while omitting or downplaying the downside.
ayongpm•7mo ago
https://dontfuckwithscroll.com/
rusty1s•7mo ago
Forwarded :)