
Apple is the only Big Tech company whose capex declined last quarter

https://sherwood.news/tech/apple-is-the-only-big-tech-company-whose-capex-declined-last-quarter/
1•elsewhen•1m ago•0 comments

Reverse-Engineering Raiders of the Lost Ark for the Atari 2600

https://github.com/joshuanwalker/Raiders2600
2•todsacerdoti•2m ago•0 comments

Show HN: Deterministic NDJSON audit logs – v1.2 update (structural gaps)

https://github.com/yupme-bot/kernel-ndjson-proofs
1•Slaine•6m ago•0 comments

The Greater Copenhagen Region could be your friend's next career move

https://www.greatercphregion.com/friend-recruiter-program
1•mooreds•6m ago•0 comments

Do Not Confirm – Fiction by OpenClaw

https://thedailymolt.substack.com/p/do-not-confirm
1•jamesjyu•7m ago•0 comments

The Analytical Profile of Peas

https://www.fossanalytics.com/en/news-articles/more-industries/the-analytical-profile-of-peas
1•mooreds•7m ago•0 comments

Hallucinations in GPT5 – Can models say "I don't know" (June 2025)

https://jobswithgpt.com/blog/llm-eval-hallucinations-t20-cricket/
1•sp1982•7m ago•0 comments

What AI is good for, according to developers

https://github.blog/ai-and-ml/generative-ai/what-ai-is-actually-good-for-according-to-developers/
1•mooreds•7m ago•0 comments

OpenAI might pivot to the "most addictive digital friend" or face extinction

https://twitter.com/lebed2045/status/2020184853271167186
1•lebed2045•8m ago•2 comments

Show HN: Know how your SaaS is doing in 30 seconds

https://anypanel.io
1•dasfelix•9m ago•0 comments

ClawdBot Ordered Me Lunch

https://nickalexander.org/drafts/auto-sandwich.html
1•nick007•10m ago•0 comments

What the News media thinks about your Indian stock investments

https://stocktrends.numerical.works/
1•mindaslab•11m ago•0 comments

Running Lua on a tiny console from 2001

https://ivie.codes/page/pokemon-mini-lua
1•Charmunk•11m ago•0 comments

Google and Microsoft Paying Creators $500K+ to Promote AI Tools

https://www.cnbc.com/2026/02/06/google-microsoft-pay-creators-500000-and-more-to-promote-ai.html
2•belter•13m ago•0 comments

New filtration technology could be game-changer in removal of PFAS

https://www.theguardian.com/environment/2026/jan/23/pfas-forever-chemicals-filtration
1•PaulHoule•14m ago•0 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
2•momciloo•15m ago•0 comments

Kinda Surprised by Seadance2's Moderation

https://seedanceai.me/
1•ri-vai•15m ago•2 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
2•valyala•15m ago•0 comments

Django scales. Stop blaming the framework (part 1 of 3)

https://medium.com/@tk512/django-scales-stop-blaming-the-framework-part-1-of-3-a2b5b0ff811f
1•sgt•16m ago•0 comments

Malwarebytes Is Now in ChatGPT

https://www.malwarebytes.com/blog/product/2026/02/scam-checking-just-got-easier-malwarebytes-is-n...
1•m-hodges•16m ago•0 comments

Thoughts on the job market in the age of LLMs

https://www.interconnects.ai/p/thoughts-on-the-hiring-market-in
1•gmays•16m ago•0 comments

Show HN: Stacky – certain block game clone

https://www.susmel.com/stacky/
2•Keyframe•19m ago•0 comments

AIII: A public benchmark for AI narrative and political independence

https://github.com/GRMPZQUIDOS/AIII
1•GRMPZ23•19m ago•0 comments

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
2•valyala•21m ago•0 comments

The API Is a Dead End; Machines Need a Labor Economy

1•bot_uid_life•22m ago•0 comments

Digital Iris [video]

https://www.youtube.com/watch?v=Kg_2MAgS_pE
1•Jyaif•23m ago•0 comments

New wave of GLP-1 drugs is coming–and they're stronger than Wegovy and Zepbound

https://www.scientificamerican.com/article/new-glp-1-weight-loss-drugs-are-coming-and-theyre-stro...
5•randycupertino•24m ago•0 comments

Convert tempo (BPM) to millisecond durations for musical note subdivisions

https://brylie.music/apps/bpm-calculator/
1•brylie•26m ago•0 comments

Show HN: Tasty A.F. - Use AI to Create Printable Recipe Cards

https://tastyaf.recipes/about
2•adammfrank•27m ago•0 comments

The Contagious Taste of Cancer

https://www.historytoday.com/archive/history-matters/contagious-taste-cancer
2•Thevet•29m ago•0 comments

Show HN: Train and deploy your own open-source humanoid in Python

https://github.com/kscalelabs/ksim-gym
30•codekansas•8mo ago
Hi HN, I’m Ben, founder of K-Scale Labs (YC W24).

Last year, I wanted to buy a humanoid robot that I could hack on, but the few options for sale were too expensive, too proprietary, or shipped with a limited SDK. We set out to build an affordable humanoid robot from off-the-shelf components, one that can be built and shipped today, can run modern machine learning models, and is completely open source for developers like me.

Today, we’re releasing our reinforcement learning library and sim2real pipeline for people who want to train policies for humanoid robots.

If you have a computer, you can try out this pipeline in less than 5 minutes: https://github.com/kscalelabs/ksim-gym.

Or try on Colab: https://colab.research.google.com/github/kscalelabs/ksim-gym...

Getting started is as easy as:

  git clone https://github.com/kscalelabs/ksim-gym.git
  cd ksim-gym
  pip install -r requirements.txt
  python -m train

After training a model, you can send it to us and we will run it on one of our robots. We are building a benchmark for humanoid RL policies here: https://kscale.dev/benchmarks
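
If you want to sanity-check a trained controller locally before sending it in, one option is to roll it out in plain MuJoCo, the simulator the pipeline builds on. The snippet below is only a minimal sketch under my own assumptions, not the ksim-gym API: the MJCF path and the policy() function are placeholders for your exported robot model and trained policy.

  # Minimal sketch: roll out a generic controller in MuJoCo.
  # "robot.mjcf.xml" and policy() are placeholders, not part of ksim-gym.
  import numpy as np
  import mujoco

  model = mujoco.MjModel.from_xml_path("robot.mjcf.xml")  # placeholder MJCF file
  data = mujoco.MjData(model)

  def policy(obs: np.ndarray) -> np.ndarray:
      # Replace with your trained policy's forward pass.
      return np.zeros(model.nu)  # model.nu = number of actuators

  for _ in range(1000):
      obs = np.concatenate([data.qpos, data.qvel])  # a simple observation vector
      data.ctrl[:] = policy(obs)                    # apply actuator commands
      mujoco.mj_step(model, data)                   # advance the simulation

  print("final base position:", data.qpos[:3])

In practice you would swap policy() for your trained network's inference call and point the loader at whatever MJCF file the repo provides for the robot.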

—

Why does the world need another humanoid robot company?

In the last year, humanoids have gone from science fiction to a seeming inevitability, bringing huge investment in the hardware supply chain and machine learning methods for robotics. But watching the ecosystem unfold, I felt pretty pessimistic about where things were headed. Seeing lots of cool demos without something that I can actually buy, from companies that have raised huge sums of money, reminded me of the early days of self-driving cars. On top of that, I find the idea of a small handful of companies building humanoid robots to be pretty dystopian. Even today, we’re seeing consumer robots being sold with government-mandated backdoors. That is not the future that I want to live in.

To that end, our company has three long-term goals:

1. Ensure that the world’s best humanoid robots are white-box systems that anyone can program and audit.
2. Create the infrastructure to radically simplify developer adoption of humanoid robots, paralleling CUDA for GPU programming or PyTorch for machine learning.
3. Build an ecosystem to accelerate humanity’s transition to a post-scarcity Type 1 Kardashev civilization whose gains are maximally distributed.

If you would like to support us, you can pre-order one of our robots. We plan to launch the first robots this summer and are heavily discounting the price for early customers who can help us safely iterate on deploying robots in the wild: https://shop.kscale.dev

Since we are focusing on creating a developer ecosystem, we would love to hear your thoughts and feedback about our current software and hardware stack:

- Is this exciting to you? What would make you want to start developing on a humanoid robot?
- What form factor is the most interesting for you (in terms of height, reach, end effector, or other hardware considerations)?
- As a customer, what software capabilities would you expect from a humanoid robot in order to buy it?

We would love your feedback!

Comments

Razied•8mo ago
Super cool library, guys. I think this is by far the quickest and most painless way to train a humanoid policy. I tried messing around with MuJoCo and Isaac Lab stuff a while ago, and it was truly horrible.

A bit more of a hardware question: how modular are the robots? I'm looking at the feet and hands in particular and thinking that there are a bunch of applications where purpose-built parts would work much better than the current ones you have there. I understand the arguments for a humanoid form factor, but I think flat feet are making it much harder than it should be to get a robust locomotion policy, and training hand dexterity is also unbelievably hard. It seems like the path of least resistance to usefulness is to have an array of different hand attachments.

modeless•8mo ago
On the K-Bot prototype I saw at their open house recently, the hands were easily swappable on a camera lens style mount.

alihkw_•8mo ago
K-Scale engineer here: the hands are easily swappable using a camera lens style mount. We've tried a couple of different hands; the main challenge is that they have to fit many actuators in a small footprint.

The feet can also be changed, but not as easily. The bottom of the foot is screwed in.

Flat feet are an interesting point: the feet are actually a bit curved; they are designed that way to match their counterpart in sim, which uses MuJoCo's capsules.

I am personally very excited about hot swappable hands!

IanD914•8mo ago
A bayonet mount (Micro Four Thirds camera style) is built onto the hands right now -- you can see the stubby ends in the simulated model, and you'll be able to print or machine the other end pretty easily with the female receiver u43 piece and the drafted profile.

You can see even with peers like Unitree that there really isn't a 'one size fits all' for manipulation, and with that comes a real need for rapid prototyping and versatility. Good to point out, and it's definitely been a focus for us.

And it's been great just as a jumping-off point -- now that the dexterity for the robot is open to interpretation, everyone who has come in to see the robot has come up with their own ideas about what they could fit on it for their application. Everything from spatulas to pallet forks. It's really encouraging to have so much interest around it.

As far as feet are concerned, they're also definitely a focus for us; we've been looking at different profiles and materials to improve stability and grip on different terrains. They're also 3D printed for now, since rapid iteration is key.

spinning_fan•8mo ago
I'm a little confused about what exactly the offering is.

The robot is up for purchase - this is straightforward.

> After training a model, you can send it to us and we will run it on one of our robots.

Why do I need to send it to you guys to run it on the robot? Presumably, I should be able to run it on the robot directly, no?

codekansas•8mo ago
Oh, because it will probably take a bit of time for us to get you a robot. If you want to try the software SDK, you can try it now.