
Show HN: Gemini Pro 3 hallucinates the HN front page 10 years from now

https://dosaygo-studio.github.io/hn-front-page-2035/news
515•keepamovin•3h ago•221 comments

PeerTube is recognized as a digital public good by Digital Public Goods Alliance

https://www.digitalpublicgoods.net/r/peertube
71•fsflover•1h ago•8 comments

Mistral Releases Devstral 2 (72.2% SWE-Bench Verified) and Vibe CLI

https://mistral.ai/news/devstral-2-vibe-cli
221•pember•3h ago•78 comments

Hands down one of the coolest 3D websites

https://bruno-simon.com/
128•razzmataks•2h ago•36 comments

Kaiju – General purpose 3D/2D game engine in Go and Vulkan with built-in editor

https://github.com/KaijuEngine/kaiju
88•discomrobertul8•3h ago•38 comments

LLM from scratch, part 28 – training a base model from scratch on an RTX 3090

https://www.gilesthomas.com/2025/12/llm-from-scratch-28-training-a-base-model-from-scratch
367•gpjt•6d ago•82 comments

My favourite small hash table

https://www.corsix.org/content/my-favourite-small-hash-table
54•speckx•3h ago•8 comments

Clearspace (YC W23) Is Hiring a Founding Designer

https://www.ycombinator.com/companies/clearspace/jobs/yamWTLr-founding-designer-at-clearspace
1•roycebranning•1h ago

Launch HN: Mentat (YC F24) – Controlling LLMs with Runtime Intervention

https://playground.ctgt.ai
14•cgorlla•1h ago•7 comments

AWS Trainium3 Deep Dive – A Potential Challenger Approaching

https://newsletter.semianalysis.com/p/aws-trainium3-deep-dive-a-potential
38•Symmetry•4d ago•9 comments

AI needs more power than the grid can deliver – supersonic tech can fix that

https://boomsupersonic.com/flyby/ai-needs-more-power-than-the-grid-can-deliver-supersonic-tech-ca...
12•simonebrunozzi•2h ago•8 comments

The Joy of Playing Grandia, on Sega Saturn

https://www.segasaturnshiro.com/2025/11/27/the-joy-of-playing-grandia-on-sega-saturn/
145•tosh•8h ago•83 comments

Transformers know more than they can tell: Learning the Collatz sequence

https://www.arxiv.org/pdf/2511.10811
79•Xcelerate•6d ago•30 comments

Constructing the World's First JPEG XL MD5 Hash Quine

https://stackchk.fail/blog/jxl_hashquine_writeup
69•luispa•1w ago•16 comments

Show HN: AlgoDrill – Interactive drills to stop forgetting LeetCode patterns

https://algodrill.io
114•henwfan•7h ago•78 comments

Show HN: Detail, a Bug Finder

https://detail.dev/
15•drob•38m ago•4 comments

If you're going to vibe code, why not do it in C?

https://stephenramsay.net/posts/vibe-coding.html
65•sramsay•1h ago•69 comments

30 Year Anniversary of WarCraft II: Tides of Darkness

https://www.jorsys.org/archive/december_2025.html#newsitem_2025-12-09T07:42:19Z
86•sjoblomj•9h ago•61 comments

Icons in Menus Everywhere – Send Help

https://blog.jim-nielsen.com/2025/icons-in-menus/
759•ArmageddonIt•22h ago•306 comments

Ask HN: Should "I asked $AI, and it said" replies be forbidden in HN guidelines?

282•embedding-shape•2h ago•156 comments

How Private Equity Is Changing Housing

https://www.theatlantic.com/ideas/2025/12/private-equity-housing-changes/685138/
20•harambae•47m ago•15 comments

Oliver Sacks Put Himself into His Case Studies. What Was the Cost?

https://www.newyorker.com/magazine/2025/12/15/oliver-sacks-put-himself-into-his-case-studies-what...
29•barry-cotter•4h ago•6 comments

A deep dive into QEMU: The Tiny Code Generator (TCG), part 1 (2021)

https://airbus-seclab.github.io/qemu_blog/tcg_p1.html
63•costco•1w ago•2 comments

Donating the Model Context Protocol and Establishing the Agentic AI Foundation

https://www.anthropic.com/news/donating-the-model-context-protocol-and-establishing-of-the-agenti...
8•meetpateltech•1h ago•2 comments

Brent's Encapsulated C Programming Rules (2020)

https://retroscience.net/brents-c-programming-rules.html
55•p2detar•6h ago•27 comments

ZX Spectrum Next on the Internet: Xberry Pi ESP01 and Pi Zero Upgrades

https://retrogamecoders.com/zx-spectrum-next-on-the-internet-xberry-pi-esp01-and-pi-zero-upgrades/
46•ibobev•7h ago•0 comments

Epsilon: A WASM virtual machine written in Go

https://github.com/ziggy42/epsilon
124•ziggy42•1w ago•30 comments

Animalcules and Their Motors

https://www.asimov.press/p/flagella
4•surprisetalk•6d ago•0 comments

Kroger acknowledges that its bet on robotics went too far

https://www.grocerydive.com/news/kroger-ocado-close-automated-fulfillment-centers-robotics-grocer...
242•JumpCrisscross•18h ago•271 comments

The Gamma Language

https://lair.masot.net/gamma/
26•RossBencina•3d ago•4 comments

Beyond Elk: Lightweight and Scalable Cloud-Native Log Monitoring

https://greptime.com/blogs/2025-04-24-elasticsearch-greptimedb-comparison-performance
25•xzhuang1984•7mo ago

Comments

firesteelrain•7mo ago
Any reason to use this like in Azure over their cloud native options such as with AKS that has fluentd built into the ama-pod? It already sends logs to Azure Monitor/LogA. Azure Managed Grafana can take in Kusto queries. AMA can monitor VMs. Further you can use DCE/DCRs for custom logs. Azure provides Azure native ElasticSearch too. It seems to own this market.

You can predictably control and forecast costs with these models.

killme2008•7mo ago
Agree. Leveraging capabilities provided by cloud vendors is always a good idea. However, as the scale grows, cost inevitably becomes an issue. Third-party solutions often offer cost advantages because they support multi-cloud deployments and are optimized for specific scenarios.
chreniuc•7mo ago
How does it compare to openobserve?
atombender•7mo ago
How does Greptime handle dynamic schemas where you don't know most of the shape of the data upfront?

Where I work, we have maybe a hundred different sources of structured logs: our own applications, Kubernetes, databases, CI/CD software, lots of system processes. There's no common schema other than the basics (timestamp, message, source, Kubernetes metadata). Apps produce all sorts of JSON fields, and we have thousands and thousands of fields across all these apps.

It'd be okay to define a small core subset, but we'd need a sensible "catch all" rule for the rest. All fields need to be searchable, but it's of course OK if performance is a little worse for non-core fields, as long as you can go into the schema and explicitly add it in order to speed things up.

Also, how does Greptime scale with that many fields? Does it do fine with thousands of columns?

I imagine it would be a good idea to have one table per source. Is it easy/performant to search multiple tables (union ordered by time) in a single query?
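The "union ordered by time" the comment asks about is conceptually just a k-way merge of time-sorted streams. A minimal sketch in plain Python (the source names and log lines here are hypothetical, not GreptimeDB output):

```python
import heapq

# Hypothetical per-source log tables, each already sorted by timestamp.
app_logs = [
    {"ts": "2025-05-01T10:00:00", "source": "app", "msg": "started"},
    {"ts": "2025-05-01T10:00:05", "source": "app", "msg": "ready"},
]
db_logs = [
    {"ts": "2025-05-01T10:00:02", "source": "db", "msg": "connection opened"},
]

# A cross-source, time-ordered query is a lazy k-way merge of the
# already-sorted streams; heapq.merge does exactly that.
merged = list(heapq.merge(app_logs, db_logs, key=lambda r: r["ts"]))

for row in merged:
    print(row["ts"], row["source"], row["msg"])
```

Whether a given database can push this merge down efficiently across many tables is the real performance question.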

killme2008•7mo ago
Thanks for your question. GreptimeDB, like MongoDB, is schemaless. When ingesting data via OTEL or its gRPC SDKs, it automatically creates tables by inferring the schema and dynamically adds new columns as needed.

Secondly, I prefer wide tables to consolidate all sources for easy management and scalability. With GreptimeDB's columnar storage based on Parquet, unused columns don't incur storage costs.
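The schema-inference behavior described above can be sketched generically: infer the column set from the data and widen it as new fields appear, instead of rejecting unknown fields. This is an illustration of the idea, not GreptimeDB's actual implementation:

```python
import json

# Heterogeneous log lines: no common schema beyond a few core fields.
lines = [
    '{"ts": "2025-05-01T10:00:00", "msg": "login", "user_id": 42}',
    '{"ts": "2025-05-01T10:00:01", "msg": "slow query", "duration_ms": 930}',
]

# Schemaless ingestion sketch: the "table" grows a column the first time
# a field is seen, with a type inferred from the value.
columns = {}
rows = []
for line in lines:
    record = json.loads(line)
    for field, value in record.items():
        columns.setdefault(field, type(value).__name__)
    rows.append(record)

print(sorted(columns))  # union of all fields seen so far
```

With columnar storage, rows that lack a given column simply store nothing for it, which is why a wide table with sparse columns stays cheap.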

atombender•7mo ago
Thanks, that seems promising. So much of the documentation is schema-oriented, I didn't see that it supported dynamic schemas.

I find it interesting that Greptime is completely time-oriented. I don't think you can create tables without a time PK? The last time I needed log storage, I ended up picking ClickHouse, because it has no such restrictions on primary keys. We use non-time-based tables all the time, as well as dictionaries. So it seems Greptime is a lot less flexible?

killme2008•7mo ago
Yes, GreptimeDB requires a time index column for optimized storage and querying. It's not a constraint of a primary key, but just an independent table constraint.

Could you elaborate on why you find this inconvenient? I assumed logs, for example, would naturally include a timestamp.

atombender•7mo ago
It's less convenient because it makes the database less general-purpose. The moment you need to go beyond time-based data, you have to reach for other tools.

ClickHouse is such a wonderful database precisely because it's so incredibly flexible. While most data I interact with is time-based, I also store lots of non-time-based data there to complement the time-based tables. The rich feature set of table engines, materialized views, and dictionaries means you have a lot of different tools to pick from to design your solution. For example, to optimize ETL lookups, I use a lot of dictionaries, which are not time-based.

As an example, let's say I'm ingesting logs into Greptime and some log lines have a customer_id. I would like the final table, or at least a view, to be cross-referenced with the customer so that it can include the customer's name. I suppose one would have to continually ingest customer data into a Greptime table with today's date, and then join on today's date?
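The enrichment described above is a dictionary-style lookup join: log lines carry a customer_id, and a small non-time-based dimension table supplies the name. A plain-Python sketch (the customers and log lines are made up for illustration):

```python
# Dimension data (hypothetical): customer_id -> name, living outside the
# time-series store, analogous to a ClickHouse dictionary.
customers = {7: "Acme Corp", 9: "Globex"}

log_lines = [
    {"ts": "2025-05-01T10:00:00", "msg": "checkout failed", "customer_id": 7},
    {"ts": "2025-05-01T10:00:03", "msg": "retry succeeded", "customer_id": 9},
]

# Enrichment at query/view time: attach the customer's name to each line.
enriched = [
    {**line, "customer_name": customers.get(line["customer_id"], "unknown")}
    for line in log_lines
]

for row in enriched:
    print(row["ts"], row["customer_name"], row["msg"])
```

The awkwardness the commenter points at is exactly this: if every table must be time-indexed, the `customers` side has no natural home, and you end up re-ingesting dimension data with a timestamp just to make it joinable.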

killme2008•7mo ago
Fair point. Joining time-series data with business data is often necessary. While GreptimeDB currently supports external tables for Parquet and CSV files, we plan to expand this support to include datasources like MySQL and PG in the future.
client4•7mo ago
For logs I'd be more likely to choose https://www.gravwell.io as it's log agnostic and I've seen it crush 40TB a day, whereas it looks like Greptime is purpose-tuned for metrics and telemetry data.
dijit•7mo ago
is gravwell open source?

(it seems greptime is.)

reconnecting•7mo ago
I'm always skeptical toward software companies with an outdated year in the footer.
killme2008•7mo ago
Thanks for pointing it out! The footer has been updated.
reconnecting•7mo ago
Thank you for your prompt attention to this matter. Until next year, then.
killme2008•7mo ago
We'll find a way to fix it forever :D
emmanueloga_•7mo ago
a "no brown M&Ms" razor!
reconnecting•7mo ago
From a website perspective, finding the current year can be challenging, but there's always a way to hack around it. For example, by parsing another website to get the year.
ByteBard1979•7mo ago
What scenario would I use best?
qmarchi•7mo ago
Am I the only one that got, "This article smells like it was written by an AI told to 'compare these two products'"?

Something around the sentence structure just is offputting.

killme2008•7mo ago
The author is not a native speaker; I promise it's not an AI article, though it did get some minor review from AI :)
up2isomorphism•7mo ago
This space is so crowded, I think any new startup is very unlikely to survive, unless it solves its own business case first.
killme2008•7mo ago
Yes, so many startups are trying to solve the log issue in the current stack.

In my personal observation, the vast majority of startups are still focused on the product layer and use ClickHouse directly for storage. However, ClickHouse’s tightly coupled storage and compute architecture makes it difficult to scale, and this becomes a real problem as workloads grow. GreptimeDB, on the other hand, is more focused on being an all-in-one observability database. Our log UI, however, still has quite a gap compared to products like Kibana.

This space is very crowded. I think it’s unlikely that any new startup will succeed here unless it can first solve its own business use case exceptionally well.

Would love to hear your thoughts.

atombender•7mo ago
Reading the web site, I just noticed the open-source version does not have "Log query endpoints".

Does that mean you have to use SQL (or the visual SQL builder) to query logs, and you don't get access to a log query language the way Kibana gives you KQL and Lucene syntax?

If so, I think it's a little disingenuous to write an article comparing the ELK stack, which is open source and comes with a perfectly usable query UI, to Greptime's equivalent, which is not.

killme2008•7mo ago
In fact, we have an open-source query language, but it's still experimental, so we don't present it on the website. The description of the enterprise feature is not precise. Sorry for the confusion.

GreptimeDB also open-sources the log view UI if you read the article.

I agree with you that the ELK stack is very powerful, and GreptimeDB is still young; we have lots of work to do. Thank you.

atombender•7mo ago
Thanks, sounds interesting. It's actually not at all clear from the article that the UI, as presented, is open source. I'm looking for an ELK replacement (in an enterprise setting), so it sounds like Greptime is something I might be able to use.
killme2008•7mo ago
Thanks for your feedback. We fixed the descriptions of the log query endpoints; hopefully it's clearer now. Glad you're considering giving it a try, and we look forward to your feedback.