frontpage.

1979: The Model World of Robert Symes [video]

https://www.youtube.com/watch?v=HmDxmxhrGDc
1•xqcgrek2•2m ago•0 comments

Satellites Have a Lot of Room

https://www.johndcook.com/blog/2026/02/02/satellites-have-a-lot-of-room/
1•y1n0•2m ago•0 comments

1980s Farm Crisis

https://en.wikipedia.org/wiki/1980s_farm_crisis
1•calebhwin•3m ago•1 comments

Show HN: FSID - Identifier for files and directories (like ISBN for Books)

https://github.com/skorotkiewicz/fsid
1•modinfo•8m ago•0 comments

Show HN: Holy Grail: Open-Source Autonomous Development Agent

https://github.com/dakotalock/holygrailopensource
1•Moriarty2026•15m ago•1 comments

Show HN: Minecraft Creeper meets 90s Tamagotchi

https://github.com/danielbrendel/krepagotchi-game
1•foxiel•22m ago•1 comments

Show HN: Termiteam – Control center for multiple AI agent terminals

https://github.com/NetanelBaruch/termiteam
1•Netanelbaruch•22m ago•0 comments

The only U.S. particle collider shuts down

https://www.sciencenews.org/article/particle-collider-shuts-down-brookhaven
1•rolph•25m ago•1 comments

Ask HN: Why do purchased B2B email lists still have such poor deliverability?

1•solarisos•26m ago•2 comments

Show HN: Remotion directory (videos and prompts)

https://www.remotion.directory/
1•rokbenko•27m ago•0 comments

Portable C Compiler

https://en.wikipedia.org/wiki/Portable_C_Compiler
2•guerrilla•30m ago•0 comments

Show HN: Kokki – A "Dual-Core" System Prompt to Reduce LLM Hallucinations

1•Ginsabo•30m ago•0 comments

Software Engineering Transformation 2026

https://mfranc.com/blog/ai-2026/
1•michal-franc•31m ago•0 comments

Microsoft purges Win11 printer drivers, devices on borrowed time

https://www.tomshardware.com/peripherals/printers/microsoft-stops-distrubitng-legacy-v3-and-v4-pr...
3•rolph•32m ago•1 comments

Lunch with the FT: Tarek Mansour

https://www.ft.com/content/a4cebf4c-c26c-48bb-82c8-5701d8256282
2•hhs•35m ago•0 comments

Old Mexico and her lost provinces (1883)

https://www.gutenberg.org/cache/epub/77881/pg77881-images.html
1•petethomas•38m ago•0 comments

'AI' is a dick move, redux

https://www.baldurbjarnason.com/notes/2026/note-on-debating-llm-fans/
4•cratermoon•40m ago•0 comments

The source code was the moat. But not anymore

https://philipotoole.com/the-source-code-was-the-moat-no-longer/
1•otoolep•40m ago•0 comments

Does anyone else feel like their inbox has become their job?

1•cfata•40m ago•1 comments

An AI model that can read and diagnose a brain MRI in seconds

https://www.michiganmedicine.org/health-lab/ai-model-can-read-and-diagnose-brain-mri-seconds
2•hhs•43m ago•0 comments

Dev with 5 years of experience switched to Rails, what should I be careful about?

2•vampiregrey•45m ago•0 comments

AlphaFace: High Fidelity and Real-Time Face Swapper Robust to Facial Pose

https://arxiv.org/abs/2601.16429
1•PaulHoule•46m ago•0 comments

Scientists discover “levitating” time crystals that you can hold in your hand

https://www.nyu.edu/about/news-publications/news/2026/february/scientists-discover--levitating--t...
2•hhs•48m ago•0 comments

Rammstein – Deutschland (C64 Cover, Real SID, 8-bit – 2019) [video]

https://www.youtube.com/watch?v=3VReIuv1GFo
1•erickhill•49m ago•0 comments

Tell HN: Yet Another Round of Zendesk Spam

5•Philpax•49m ago•1 comments

Postgres Message Queue (PGMQ)

https://github.com/pgmq/pgmq
1•Lwrless•53m ago•0 comments

Show HN: Django-rclone: Database and media backups for Django, powered by rclone

https://github.com/kjnez/django-rclone
2•cui•55m ago•1 comments

NY lawmakers proposed statewide data center moratorium

https://www.niagara-gazette.com/news/local_news/ny-lawmakers-proposed-statewide-data-center-morat...
2•geox•57m ago•0 comments

OpenClaw AI chatbots are running amok – these scientists are listening in

https://www.nature.com/articles/d41586-026-00370-w
3•EA-3167•57m ago•0 comments

Show HN: AI agent forgets user preferences every session. This fixes it

https://www.pref0.com/
6•fliellerjulian•59m ago•0 comments

Show HN: Train and deploy your own open-source humanoid in Python

https://github.com/kscalelabs/ksim-gym
30•codekansas•8mo ago
Hi HN, I’m Ben, founder of K-Scale Labs (YC W24).

Last year, I wanted to buy a humanoid robot that I could hack on, but the few options for sale were either too expensive, proprietary, or saddled with a limited SDK. We set out to build an affordable humanoid robot from off-the-shelf components, one that can be built and shipped today and can run modern machine learning models, and to make it completely open-source for developers like me.

Today, we’re releasing our reinforcement learning library and sim2real pipeline for people who want to train policies for humanoid robots.

If you have a computer, you can try out this pipeline in less than 5 minutes: https://github.com/kscalelabs/ksim-gym.

Or try on Colab: https://colab.research.google.com/github/kscalelabs/ksim-gym...

Getting started is as easy as:

  git clone https://github.com/kscalelabs/ksim-gym.git
  cd ksim-gym
  pip install -r requirements.txt
  python -m train

After training a model, you can send it to us and we will run it on one of our robots. We are building a benchmark for humanoid RL policies here: https://kscale.dev/benchmarks
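The `python -m train` step kicks off a reinforcement-learning loop. As a rough, self-contained illustration of the kind of policy-gradient training such a pipeline runs (this is a toy REINFORCE loop on a 1-D stabilization task with made-up names and dynamics, not ksim's actual code):

```python
import math
import random

random.seed(0)

def sigmoid(z):
    """Numerically stable logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def rollout(theta, steps=20):
    """One episode of a toy task: nudge a point left/right to keep it
    near the origin, under the Bernoulli policy P(right|x) = sigmoid(theta*x).
    Returns per-step score-function gradients d(log pi)/d(theta) and rewards."""
    x, grads, rewards = 0.0, [], []
    for _ in range(steps):
        p_right = sigmoid(theta * x)
        if random.random() < p_right:
            action, grad_logp = 1.0, x * (1.0 - p_right)
        else:
            action, grad_logp = -1.0, -x * p_right
        x += 0.1 * action + random.gauss(0.0, 0.02)  # simple noisy dynamics
        grads.append(grad_logp)
        rewards.append(-x * x)  # quadratic penalty for drifting from 0
    return grads, rewards

def train(iterations=150, lr=0.05):
    """Vanilla REINFORCE: theta += lr * sum_t(return_t * grad_logp_t)."""
    theta, history = 0.0, []
    for _ in range(iterations):
        grads, rewards = rollout(theta)
        # Returns-to-go: G_t = sum of rewards from step t onward.
        returns, g = [], 0.0
        for r in reversed(rewards):
            g += r
            returns.append(g)
        returns.reverse()
        theta += lr * sum(gl * ret for gl, ret in zip(grads, returns))
        history.append(sum(rewards))
    return theta, history

theta, history = train()
```

A real humanoid pipeline swaps the toy dynamics for a physics simulator and the scalar theta for a neural network, and adds the sim2real machinery, but the shape of the loop (roll out, score, update the policy) is the same.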

—

Why does the world need another humanoid robot company?

In the last year, humanoids have gone from science fiction to a seeming inevitability, bringing huge investment in the hardware supply chain and machine learning methods for robotics. But watching the ecosystem unfold, I felt pretty pessimistic about where things were headed. Seeing lots of cool demos without something that I can actually buy, from companies that have raised huge sums of money, reminded me of the early days of self-driving cars. On top of that, I find the idea of a small handful of companies building humanoid robots to be pretty dystopian. Even today, we’re seeing consumer robots being sold with government-mandated backdoors. That is not the future that I want to live in.

To that end, our company has three long-term goals:

1. Ensure that the world’s best humanoid robots are white-box systems that anyone can program and audit
2. Create the infrastructure to radically simplify developer adoption of humanoid robots, paralleling CUDA for GPU programming or PyTorch for machine learning
3. Build an ecosystem to accelerate humanity’s transition to a post-scarcity Type 1 Kardashev civilization whose gains are maximally distributed

If you would like to support us, you can pre-order one of our robots. We plan to launch the first robots this summer and are heavily discounting the price for early customers who can help us safely iterate on deploying robots in the wild: https://shop.kscale.dev

Since we are focusing on creating a developer ecosystem, we would love to hear your thoughts and feedback about our current software and hardware stack:

- Is this exciting to you? What would make you want to start developing on a humanoid robot?
- What form factor is the most interesting for you (in terms of height, reach, end effector, or other hardware considerations)?
- As a customer, what software capabilities would you expect from a humanoid robot in order to buy it?

We would love your feedback!

Comments

Razied•8mo ago
Super cool library guys, I think this is by far the quickest and most painless way to train a humanoid policy. I tried messing around with MuJoCo and Isaac Lab stuff a while ago, and it was truly horrible.

A bit more of a hardware question: how modular are the robots? I'm looking at the feet and hands in particular and thinking that there are a bunch of applications where purpose-built parts would work much better than the current ones you have there. I understand the arguments for a humanoid form factor, but I think flat feet are making it much harder than it should be to get a robust locomotion policy, and training hand dexterity is also unbelievably hard. Seems like the path of least resistance to usefulness is to have an array of different hand attachments.

modeless•8mo ago
On the K-Bot prototype I saw at their open house recently, the hands were easily swappable on a camera lens style mount.
alihkw_•8mo ago
Kscale engineer here: The hands are easily swappable using a camera lens style mount. We've tried a couple of different hands; the main challenge is that they have to fit many actuators in a small footprint.

The feet can also be changed, but not as easily. The bottom of the foot is screwed in.

Flat feet are an interesting point: the feet are actually a bit curved. They are designed that way to match their counterpart in sim, which uses MuJoCo's capsules.
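For readers unfamiliar with MuJoCo's model format, a capsule-soled foot like the one described here can be sketched in MJCF; the fragment below is purely illustrative (made-up names and dimensions, not K-Scale's actual model), and the snippet only checks it is well-formed XML with the standard library (loading it for real would use `mujoco.MjModel.from_xml_string`, which requires the mujoco package):

```python
import xml.etree.ElementTree as ET

# Hypothetical MJCF fragment: a foot whose sole is two parallel capsule
# geoms, giving the slightly curved contact profile described above.
# Capsules are cheap, stable contact primitives in MuJoCo.
FOOT_MJCF = """
<mujoco model="capsule_foot_demo">
  <worldbody>
    <body name="foot" pos="0 0 0.05">
      <geom name="sole_left"  type="capsule" size="0.015"
            fromto="-0.06 -0.03 0  0.09 -0.03 0"/>
      <geom name="sole_right" type="capsule" size="0.015"
            fromto="-0.06  0.03 0  0.09  0.03 0"/>
    </body>
  </worldbody>
</mujoco>
"""

# Sanity-check the fragment parses and contains the two sole capsules.
root = ET.fromstring(FOOT_MJCF)
geoms = root.findall(".//geom")
print(len(geoms))  # 2
```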

I am personally very excited about hot swappable hands!

IanD914•8mo ago
Bayonet mount (Micro Four Thirds camera style) is built onto the hands right now -- you can see the stubby ends in the simulated model, and you'll be able to print or machine the other end pretty easily with the female-receiver u43 piece and the drafted profile.

You can see even with peers like Unitree that there really isn't a "one size fits all" for manipulation, and with that comes a real need for rapid prototyping and versatility. Good to point out, and it's definitely been a focus for us.

And it's been great just as a jumping-off point -- now that the dexterity of the robot is open for interpretation, everyone who has come in to see the robot has come up with their own ideas about what they could fit on it for their application. Everything from spatulas to pallet forks. Really encouraging to have so much interest around it.

As far as feet are concerned, also definitely a focus for us, and we've been looking at different profiles and different materials to improve the stability and the grip on different terrains. Also 3D printed for now, rapid iteration is key.

spinning_fan•8mo ago
I'm a little confused about what, exactly, is being offered.

The robot is up for purchase - this is straightforward.

> After training a model, you can send it to us and we will run it on one of our robots.

Why do I need to send it to you guys to run it on the robot? Presumably, I should be able to run it on the robot directly, no?

codekansas•8mo ago
Oh, because it will probably take a bit of time for us to get you a robot. If you want to try the software SDK, you can try it now.