
Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
228•isitcontent•14h ago•25 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
329•vecti•16h ago•143 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
286•eljojo•16h ago•167 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
72•phreda4•13h ago•14 comments

Show HN: Smooth CLI – Token-efficient browser for AI agents

https://docs.smooth.sh/cli/overview
90•antves•1d ago•66 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•1h ago•1 comment

Show HN: ARM64 Android Dev Kit

https://github.com/denuoweb/ARM64-ADK
16•denuoweb•1d ago•2 comments

Show HN: Slack CLI for Agents

https://github.com/stablyai/agent-slack
47•nwparker•1d ago•11 comments

Show HN: Compile-Time Vibe Coding

https://github.com/Michael-JB/vibecode
10•michaelchicory•3h ago•1 comment

Show HN: Artifact Keeper – Open-Source Artifactory/Nexus Alternative in Rust

https://github.com/artifact-keeper
150•bsgeraci•1d ago•63 comments

Show HN: Gigacode – Use OpenCode's UI with Claude Code/Codex/Amp

https://github.com/rivet-dev/sandbox-agent/tree/main/gigacode
17•NathanFlurry•22h ago•7 comments

Show HN: Slop News – HN front page now, but it's all slop

https://dosaygo-studio.github.io/hn-front-page-2035/slop-news
10•keepamovin•4h ago•2 comments

Show HN: Horizons – OSS agent execution engine

https://github.com/synth-laboratories/Horizons
23•JoshPurtell•1d ago•5 comments

Show HN: Daily-updated database of malicious browser extensions

https://github.com/toborrm9/malicious_extension_sentry
14•toborrm9•19h ago•7 comments

Show HN: Fitspire – a simple 5-minute workout app for busy people (iOS)

https://apps.apple.com/us/app/fitspire-5-minute-workout/id6758784938
2•devavinoth12•6h ago•0 comments

Show HN: I built a RAG engine to search Singaporean laws

https://github.com/adityaprasad-sudo/Explore-Singapore
4•ambitious_potat•7h ago•4 comments

Show HN: Micropolis/SimCity Clone in Emacs Lisp

https://github.com/vkazanov/elcity
172•vkazanov•2d ago•49 comments

Show HN: Sem – Semantic diffs and patches for Git

https://ataraxy-labs.github.io/sem/
2•rs545837•8h ago•1 comment

Show HN: BioTradingArena – Benchmark for LLMs to predict biotech stock movements

https://www.biotradingarena.com/hn
25•dchu17•18h ago•12 comments

Show HN: Falcon's Eye (isometric NetHack) running in the browser via WebAssembly

https://rahuljaguste.github.io/Nethack_Falcons_Eye/
4•rahuljaguste•13h ago•1 comment

Show HN: Local task classifier and dispatcher on RTX 3080

https://github.com/resilientworkflowsentinel/resilient-workflow-sentinel
25•Shubham_Amb•1d ago•2 comments

Show HN: FastLog: 1.4 GB/s text file analyzer with AVX2 SIMD

https://github.com/AGDNoob/FastLog
5•AGDNoob•10h ago•1 comment

Show HN: Gohpts tproxy with arp spoofing and sniffing got a new update

https://github.com/shadowy-pycoder/go-http-proxy-to-socks
2•shadowy-pycoder•10h ago•0 comments

Show HN: A password system with no database, no sync, and nothing to breach

https://bastion-enclave.vercel.app
11•KevinChasse•19h ago•16 comments

Show HN: I built a directory of $1M+ in free credits for startups

https://startupperks.directory
4•osmansiddique•11h ago•0 comments

Show HN: GitClaw – An AI assistant that runs in GitHub Actions

https://github.com/SawyerHood/gitclaw
9•sawyerjhood•19h ago•0 comments

Show HN: A Kubernetes Operator to Validate Jupyter Notebooks in MLOps

https://github.com/tosin2013/jupyter-notebook-validator-operator
2•takinosh•11h ago•0 comments

Show HN: 33rpm – A vinyl screensaver for macOS that syncs to your music

https://33rpm.noonpacific.com/
3•kaniksu•12h ago•0 comments

Show HN: Chiptune Tracker

https://chiptunes.netlify.app
3•iamdan•13h ago•1 comment

Show HN: Craftplan – I built my wife a production management tool for her bakery

https://github.com/puemos/craftplan
568•deofoo•5d ago•166 comments

Show HN: Server-rendered multiplayer games with Lua (no client code)

https://cleoselene.com/
85•brunovcosta•1mo ago
Hey folks — here’s a small experiment I hacked together over the weekend:

https://cleoselene.com/

In short, it’s a way to build multiplayer games with no client-side game logic. Everything is rendered on the server, and the game itself is written as simple Lua scripts.

I built this to explore a few gamedev ideas I’ve been thinking about while working on Abstra:

- Writing multiplayer games as if they were single-player (no client/server complexity)
- Streaming game primitives instead of pixels, which should be much lighter
- Server-side rendering makes cheating basically impossible
- Game secrets never leave the server
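The "streaming game primitives instead of pixels" idea can be sketched like this (a hypothetical command format invented for illustration, not the project's actual wire protocol):

```python
import json

# One hypothetical frame: a handful of draw commands mirroring the
# Canvas 2D API. The format here is made up for illustration.
frame_commands = [
    {"op": "clearRect", "args": [0, 0, 640, 480]},
    {"op": "fillStyle", "args": ["#fff"]},
    {"op": "fillRect", "args": [120, 80, 16, 16]},   # player ship
    {"op": "fillRect", "args": [300, 200, 16, 16]},  # enemy ship
]

encoded = json.dumps(frame_commands).encode("utf-8")

# For comparison: one uncompressed 640x480 RGBA frame.
raw_pixels = 640 * 480 * 4  # 1,228,800 bytes

print(len(encoded), raw_pixels)  # a few hundred bytes vs. ~1.2 MB
```

A generic client just replays each command against a real canvas; no game logic ever ships to the browser.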

This isn’t meant to be a commercial project — it’s just for fun and experimentation for now.

If you want to try it out, grab a few friends and play here: https://cleoselene.com/astro-maze/

Comments

ghxst•1mo ago
IMO eliminating as much client side authority as possible is a very good foundation for MMOs where the latency is acceptable or factored into all aspects of the game (looking at old school runescape as an example). Very cool project!
brunovcosta•1mo ago
Thank you!
duduzeta•1mo ago
Cool!! I'm trying to test here, but other ships keep attacking me and I don't know how to shoot :s
brunovcosta•1mo ago
Amazing! hahaha.. Tip: Arrows + Z (shoot)
kibbi•1mo ago
In my case, Z for shooting works only rarely. Usually nothing happens. How does the game code query the key?
iku•1mo ago
I think, after shooting, it has to recharge. When recharged, the ship momentarily blink-expands in yellow. This means it is ready to fire again.

But sometimes I've been left without a recharge and without shooting, and I don't know why.

brunovcosta•1mo ago
exactly as @iku commented.. there is a cooldown time between shots.. it pulses when you're ready and resets when you try before it's loaded!

It seems that I should add better visual feedback haha

cmrdporcupine•1mo ago
"We stream drawing primitives instead of heavy pixels or complex state objects."

This is cool ... but I suspect just pushing video frames like Stadia etc. did is just as efficient these days, a lot less complicated to implement, and needs no special client. Decent compression, hardware decode on almost every machine, hardware encode possible on the server side, and excellent browser support.

MarsIronPI•1mo ago
On the other hand, you could take a list of primitives from, say, the JS Canvas API, come up with a format that can encode all of them and use that as your protocol. Bam, with that you get one client for any game that uses JS Canvas.
brunovcosta•1mo ago
That's exactly my approach! I'm sending canvas commands instead of pixels, which makes things faster

That said.. I don't think Stadia could do that since it's not opinionated about the game engine. Unless they go really deep on the graphics card instructions instead, but then it becomes comparable to pixel rendering, I guess

fionic•1mo ago
Cool! Besides the productizing or making a framework, I’m trying to understand if this is different than the elementary idea (which probably every game dev who worked on game networking has tinkered with) of sending inputs to the server and then sending player positions back to all the clients…? I think even smaller footprint would be position: two or three floats x,y(,z) instead of shapes too? Anyway this is always fine for very low latency environments where client side prediction, lag comp etc would not be required. Thanks for sharing, I might give it a try! sorry if I’m missing something.
brunovcosta•1mo ago
You're correct

My approach lives somewhere between video streaming and data streaming in terms of performance

It's not intended to be faster than a proper client that brings a lot of logic and information, which diminishes the amount of information required to be transferred

My proof of concept is more about: can my dev exp be much better without relying on the video streaming approach? (which is heavier)

kibbi•1mo ago
Interesting approach! I've thought about a similar method after reading about the PLATO platform.

When playing astro‑maze, the delay is noticeable, and in a 2D action game such delays are especially apparent. Games that don’t rely on tight real‑time input might perform better. (I'm connecting from Europe, though.)

If you add support for drawing from images (such as spritesheets or tilesheets) in the future, and the client stores those images and sounds locally, the entire screen could be drawn from these assets, so no pixel data would need to be transferred, only commands like "draw tile 56 at position (x, y)."

(By the way, opening abstra.io in a German-language browser leads to https://www.abstra.io/deundefined which shows a 404 error.)

brunovcosta•1mo ago
Yeah.. as people are playing and I'm watching their feedback, it is becoming clear to me that the main source of input delay comes from the distance to the server.. the whole game is running on a single machine in SFO, so the bad experience from Europe makes total sense

I think this is inevitable unless I add some optimism/interpolation in the client

Also, thanks for the feedback! I will fix the Abstra landing page

try https://www.abstra.io/en instead

ModernMech•1mo ago
You're running this at the airport?
fragmede•1mo ago
It's a Googleism. The datacenter is referred to by the nearest airport because airport codes make it straightforward to know roughly where the DC is.
Bender•1mo ago
For completeness' sake, this is not strictly a Google thing. Many companies use airport codes for their datacenters.
allthatineed•1mo ago
BYOND/Space Station 13 is built upon this model.

Sprite sheets are PNGs with zTXt blocks carrying meta/frame info and a list of drawing operations to be done to construct vsprites based on any runtime server-side operations done on the sprites.

There is limited client coding via popup Web view windows and a few js apis back to the client but nothing you can build too much off of.

(SS14 brings this model to an open source c# framework called The Robust Engine but has some limitations related to maintainer power tripping over who should be allowed to use their open source project.)

brunovcosta•1mo ago
Amazing! Never heard of this byond/ss13/14

Thank you for the reference!

Thaxll•1mo ago
"Impossible to Cheat"

Let me tell you that there is cheating in cloud rendering solutions (Stadia, Amazon Luna, etc.)

So 100% there is cheating in your solution.

It's trivial to read the screen.

brunovcosta•1mo ago
You're right

Especially with today's computer vision

The cheating I'm more protected against (just as Stadia, etc.) is client/code exploitation

which we don't have to worry about in this approach

6r17•1mo ago
Cheating with AI will be possible even with server side rendering ; nvidia has released models able to learn to play - it's going to be very difficult to detect whether it's an AI or a human ; very impressive however
nkrisc•1mo ago
That’s a very different kind of cheating though. The kind of cheating this effectively makes impossible is cheating where a player has more information than they’re intended to have.

If someone makes an AI that plays the game as a good player, then it’s effectively indistinguishable from a real player who is good. If they make it super-humanly good, then it would probably be detectable anyway.

It’s still fair in the sense that all players have the same (intended) information per the game rules.

throwaway894345•1mo ago
I’m also curious if an AI could process the screen feed quickly enough to compete in first-person shooter games. Seems like it would be difficult without extremely high end hardware for the foreseeable future?
Thaxll•1mo ago
It already exists.
ModernMech•1mo ago
I had students build this kind of thing in 2020 by screenshotting the game and processing it with a standard OpenCV pipeline. No GenAI needed.
throwaway894345•1mo ago
Thank you for educating me. How does OpenCV work from the perspective of recognizing things in an image? Is there some kind of underlying model there that learns what a target looks like or not?
ModernMech•1mo ago
The way they did it, they were writing an aimbot. So the pipeline was:

- take a screenshot

- run massive skeletal detection on it to get the skeletons of any humanoids

- of those skeletons, pick a target closest to the player

- for that target, get coordinates of head node

- run a PID to control the cursor to where the head node is located

- move the camera one frame, repeat the whole process. If you can fit that pipeline to 16ms it can run in real time.
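The PID step of that pipeline can be sketched as a toy per-frame loop (gains and coordinates here are made up for illustration, not the students' actual code):

```python
class PID:
    """Minimal per-frame PID controller. Gains are illustrative, not tuned."""
    def __init__(self, kp=0.4, ki=0.0, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        # Accumulate integral and finite-difference derivative per frame.
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive the cursor's x coordinate toward a detected head node at x=400,
# one controller update per 16 ms frame.
pid = PID()
cursor_x, target_x = 100.0, 400.0
for _ in range(200):
    cursor_x += pid.step(target_x - cursor_x)

print(f"{cursor_x:.1f}")  # converges on 400.0
```

The same loop runs once per screenshot; as long as detection plus control fits in the 16 ms budget, it tracks in real time.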

throwaway894345•1mo ago
Wow, that's fascinating. Were they able to fit the whole thing inside the 16ms frame?
ModernMech•1mo ago
oh yeah, with little problem, especially with a GPU it's not hard at all.
Scion9066•1mo ago
There's already models specifically for things like identifying players in Counter-Strike 2, including which team they're on.

Someone has even rigged up a system like that to a TENS system to stimulate the nerves in their arm and hand to move the mouse in the correct direction and fire when the crosshair is over the enemy.

We are definitely already there.

fragmede•1mo ago
They documented it on YouTube for us to see:

https://youtu.be/9alJwQG-Wbk

holy shit that's amazing.

ASalazarMX•1mo ago
If there's enough players, the server could segregate them in tiers. All superhuman players would eventually end up competing between themselves.
tnelsond4•1mo ago
Since you're doing this in Rust, try experimenting to see what would happen if you did zstd compression using a dictionary on the data you're sending back and forth; it might give you a performance benefit.
brunovcosta•1mo ago
I will definitely try it!

I'm using gzip since it comes with all browsers, hence an easy approach

That said, I will find some zstd decompressor for JS/WASM and try!

edit:

I just added it and the difference was huge! Thank you!
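The dictionary idea can be sketched with the Python stdlib — zstd isn't bundled with Python, so this uses zlib's preset-dictionary support to show the same effect on repetitive per-frame data (the command format is made up for illustration):

```python
import json
import zlib

def frame(x):
    """A hypothetical frame of canvas-style draw commands; consecutive
    frames in a game stream differ only slightly."""
    return json.dumps([
        {"op": "clearRect", "args": [0, 0, 640, 480]},
        {"op": "fillRect", "args": [x, 80, 16, 16]},
    ]).encode("utf-8")

dictionary = frame(0)  # seed the dictionary with one representative frame

def deflate(data, zdict=b""):
    c = zlib.compressobj(level=9, zdict=zdict)
    return c.compress(data) + c.flush()

plain = deflate(frame(120))
primed = deflate(frame(120), zdict=dictionary)
print(len(plain), len(primed))  # the dictionary-primed stream is smaller

# The receiver needs the same dictionary to decompress.
d = zlib.decompressobj(zdict=dictionary)
assert d.decompress(primed) == frame(120)
```

zstd's trained dictionaries generalize this: instead of one seed frame, the dictionary is built from a sample of real traffic.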

RodgerTheGreat•1mo ago
I had a similar idea about 13 years ago, but I didn't extend it into a generalized game engine: https://github.com/JohnEarnest/VectorLand
aetherspawn•1mo ago
The latency is a little intense from Australia … but surprisingly not as bad as I thought it would be.

It was playable.

I wonder if you can use speculative execution to play the game a few frames ahead and then the client picks what to display based on user input, or something like that.

Each frame is 16ms, so you’d have to work ahead 6 frames to cover the nominal latency of around 100ms, which may actually be 200ms round trip.

(In that case, something like Haskell would be a good candidate to build a DSL to build the decision tree to send to the JS client…)

Neywiny•1mo ago
It could help visually but you'll still have 200ms between you and your next door neighbor's actions
lurkshark•1mo ago
What you’re describing is called “rollback netcode”. It’s a pretty cool chunk of theory, usually used for fighting games which are extremely sensitive to latency. This explainer has some nice graphic demos

https://bymuno.com/post/rollback

dustbunny•1mo ago
It's a common misconception that this is only used in fighting games. This technique was developed first in Duke Nukem, and then exploited heavily by Carmack in Quake, and subsequently refined and built upon in other AAA FPS games, specifically for the local player movement and shooting.
ThatPlayer•1mo ago
I don't think it's quite the same. Rollback netcode is like lockstep netcode, where the entire game is simulated locally and only inputs are networked. Since it's still only input being networked, network drops (or slow computers) affect everyone, requiring the simulation to slow down. Not just fighting games, but RTS games would do this. If you've ever played Starcraft/Warcraft 3 where it would freeze when a player disconnected.

With rollback/lockstep, there's no need for a server simulation at all. Most games are not doing that: the client's local simulations are less important than the server's simulation, even missing information (good to prevent wallhacks). Any dropped packets are handled with the server telling the client the exact positions of everything, leading to warping. Dropped packets and latency also only affect the problem player, rather than pausing everyone's simulations.
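The rollback half of that distinction can be sketched in a few lines (a toy deterministic simulation with made-up inputs, not any engine's real netcode):

```python
from copy import deepcopy

def tick(state, inputs):
    # Deterministic per-tick update: each player moves by its input.
    for player, move in inputs.items():
        state[player] += move
    return state

state = {"p1": 0, "p2": 0}
snapshots = {}

# Ticks 0-2: p2's input hasn't arrived yet, so predict it (here: 0).
for t in range(3):
    snapshots[t] = deepcopy(state)
    state = tick(state, {"p1": 1, "p2": 0})  # predicted p2 input

# p2's real inputs for ticks 0-2 arrive late: it was moving +1 each tick.
state = deepcopy(snapshots[0])               # roll back to the saved state
for t in range(3):
    state = tick(state, {"p1": 1, "p2": 1})  # resimulate with real inputs

print(state)  # {'p1': 3, 'p2': 3}
```

Because the simulation is deterministic, resimulating from the snapshot with corrected inputs reproduces exactly the state every peer agrees on.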

aetherspawn•1mo ago
This is awesome and exactly what it needs, but good luck creating a language that’s “signal driven” enough to encode it and then send all the possible states to the client.

If you were able to make it, it would be kind of a Hail Mary moment for making easy server games without the latency.

ingen0s•1mo ago
Might be the greatest thing I have seen made in like 10 years
ftgffsdddr•1mo ago
Is the source code available?
efilife•1mo ago
Doesn't seem like it
emmelaich•1mo ago
Reminds me of the cave X11 games. For game play I'd suggest slowing it way down.
brunovcosta•1mo ago
good feedback! I'm seeing people really struggling with the control lag + speed.

I'm always biased since I test locally with no delay when developing :)

LoganDark•1mo ago
are there any affordances for prediction or replay? you could try to help network latency by having the server resimulate a small period of time roughly equivalent to the client's network delay - it's not perfect without client-side prediction but it could help
brunovcosta•1mo ago
It's possible, but harder than the traditional client/server paradigm, since the client here is generic, so the predictability should be based on something other than heuristics

I'm thinking about simple ML to predict inputs and feedbacks. Since the amount of data generated in the streaming is massive and well structured, it looks like a possible approach

Matheus28•1mo ago
Client-server multiplayer games are already kind of a very specialized type of video playback if you squint a bit (you're transmitting entities rather than pixels).

This method of multiplayer you propose is inferior in basically every way: you can't do client-side prediction to make inputs feel smoother, and non-trivial scenes will surely take up more bandwidth than just transmitting entity deltas.

ycombinatrix•1mo ago
Is this how Stadia was supposed to work?
brunovcosta•1mo ago
Not exactly, since they aren't opinionated in the game engine, they can't control what "primitives" are being used to render.. they probably just encode video
ycombinatrix•1mo ago
I guess you're right, all their talk about working with game developers was probably just to get the games working on their hardware.
modinfo•1mo ago
I just vibe-coded a multiplayer game with deterministic terrain world generation with Cleoselene in 5 min.

https://github.com/skorotkiewicz/proximity-explorer

brunovcosta•1mo ago
That's AMAZING!

I will definitely follow that!

efilife•1mo ago
Was it really 5 min or more like 30?
2001zhaozhao•1mo ago
> - Streaming game primitives instead of pixels, which should be much lighter
> - Server-side rendering makes cheating basically impossible

This doesn't work in 3D. Unless you have the server do the work of the GPU and compute occlusion, you'll end up sending data to the client that they shouldn't be able to have (e.g. location of players and other objects behind walls)

nkrisc•1mo ago
Theoretically couldn’t you send each client its data after performing occlusion culling based on each one’s camera?

Don’t some competitive games more or less already do this? Not sending player A’s position to player B if the server determines player A is occluded from player B?

I seem to recall a blog post about Apex Legends dealing with the issue of “leaker’s advantage” due to this type of culling, unless I’m just totally misremembering the subject.

Regardless, seems like it would work just fine even in 3D for the types of games where everyone has the same view more or less.
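A server-side visibility check like that can be sketched in 2D (a toy segment-intersection test with hypothetical coordinates; real engines use PVS/BSP data or GPU occlusion queries):

```python
# Only send player B the position of player A if no wall segment blocks
# the straight line between them.
def segments_intersect(p1, p2, q1, q2):
    """Proper-crossing test via orientation signs (collinear cases ignored)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

walls = [((5, 0), (5, 10))]  # one vertical wall in a toy map

def visible(a, b):
    return not any(segments_intersect(a, b, w0, w1) for w0, w1 in walls)

print(visible((1, 5), (9, 5)))   # False: the wall is between them
print(visible((1, 5), (4, 5)))   # True: both on the same side
```

Running this per player pair each tick is the interest-management filter: occluded positions simply never leave the server, so a wallhack has nothing to read.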

chasebrignac•4w ago
Sick! And the game is fun too! I’m sending to my brother who is a junior game dev.