Show HN: Semantic Calculator (king-man+woman=?)

https://calc.datova.ai
75•nxa•4h ago
I've been playing with embeddings and wanted to see what results the embedding layer would produce from word-by-word input with addition / subtraction, beyond the examples many videos / papers mention (like the obvious king-man+woman=queen). So I built something that doesn't just give the first answer, but ranks the matches by distance / cosine similarity. I polished it a bit so that others can try it out, too.

For now, I only have nouns (and some proper nouns) in the dataset, and pick the most common interpretation among the homographs. Also, it's case sensitive.
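For anyone who wants to try the same kind of arithmetic locally, here is a minimal sketch, assuming gensim's pretrained GloVe vectors as a stand-in (the demo uses a different embedding model, so results will differ):

    # Word-arithmetic sketch using gensim's pretrained GloVe vectors.
    # This is a stand-in for the demo's own embedding model, so results will differ.
    import gensim.downloader as api

    model = api.load("glove-wiki-gigaword-100")  # downloads the vectors on first use

    def semantic_calc(positive, negative, topn=5):
        # most_similar ranks candidates by cosine similarity and skips the query words
        return model.most_similar(positive=positive, negative=negative, topn=topn)

    print(semantic_calc(["king", "woman"], ["man"]))        # 'queen' is usually near the top
    print(semantic_calc(["paris", "germany"], ["france"]))  # typically 'berlin'

Returning the ranked list (topn > 1) rather than just the top hit is what the demo does as well.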

Comments

antidnan•4h ago
Neat! Reminds me of infinite craft

https://neal.fun/infinite-craft/

thaumasiotes•2h ago
I went to look at infinite craft.

It provides a panel filled with slowly moving dots. To the right of the panel, there are objects labeled "water", "fire", "wind", and "earth" that you can instantiate on the panel and drag around. As you drag them, the background dots, if nearby, will grow lines connecting to them. These lines are not persistent.

And that's it. Nothing ever happens, there are no interactions except for the lines that appear while you're holding the mouse down, and while there is notionally a help window listing the controls, the only controls are "select item", "delete item", and "duplicate item". There is also an "about" panel, which contains no information.

n2d4•2h ago
In the panel, you can drag one of the items (e.g. Water) onto another one (e.g. Earth), and it will create a new word (e.g. Plant). It uses AI, so it goes very deep.
thaumasiotes•2h ago
No, that was the first thing I tried. The only thing that happens is that the two objects will now share their location. There are no interactions.
n2d4•2h ago
Probably a bug then, you can check YouTube to find videos of people playing it (eg. [0])

[0] https://youtu.be/8-ytx84lUK8

firejake308•4h ago
King-man+woman=Navratilova, who is apparently a Czech tennis player. It seems to be very case-sensitive. Cool idea!
fph•4h ago
"King" (capital) probably was interpreted as https://en.wikipedia.org/wiki/Billie_Jean_King , that's why a tennis player showed up.
nxa•4h ago
when I first tried it, king was referring to the instrument and I was getting a result king-man+woman=flute ... :-D
BeetleB•3h ago
Heh. This is fun:

Navratilova - woman + man = Lendl

nikolay•4h ago
Really?!

  man - brain = woman
  woman - brain = businesswoman
2muchcoffeeman•4h ago
Man - brain = Irish sea
nikolay•4h ago
Case matters, obviously! Try "man" with a lower-case "m"!
Alifatisk•4h ago
Why does case matter? How does it affect the meaning?
bfLives•4h ago
“Man” is probably being interpreted as the Isle of Man.

https://en.m.wikipedia.org/wiki/Isle_of_Man

G1N•4h ago
Man (capital M) is probably being interpreted as some proper noun, maybe Isle of Man in this case?
karel-3d•4h ago
woman+penis=newswoman (businesswoman is second)

man+vagina=woman (ok that is boring)

sapphicsnail•4h ago
Telling that Jewess, feminist, and spinster were near matches as well.
nxa•4h ago
I probably should have prefaced this with "try at your own risk, results don't reflect the author's opinions"
dmonitor•2h ago
I'm sure it would be trivial to get it to say something incredibly racist, so that's probably a worthwhile disclaimer to put on the website
dalmo3•4h ago
I think subtraction is broken. None of what I tried made any sense. Water - oxygen = gin and tonic.
adzm•4h ago
noodle+tomato=pasta

this is pretty fun

growlNark•4h ago
Surely the correct answer would be `pasta-in-tomato-sauce`? Pasta exists outside of tomato sauce.
cabalamat•4h ago
What does it mean when it surrounds a word in red? Is this signalling an error?
nxa•4h ago
Yes, a word in red = word not found; that's mostly the case when you try plurals or non-nouns (for now)
rpastuszak•4h ago
This is neat!

I think you need to disable auto-capitalisation because on mobile the first word becomes uppercase and triggers a validation error.

iambateman•4h ago
Try lower-casing; my phone tried to capitalize and it was a problem.
fallinghawks•4h ago
Seems to be a word not in its dictionary. It also doesn't seem to have any country or language names.

Edit: these must be capitalized to be recognized.

zerof1l•4h ago
male + age = female

female + age = male

G1N•4h ago
twelve-ten+five=

six (84%)

Close enough I suppose

lightyrs•4h ago
I don't get it but I'm not sure I'm supposed to.

    life + death = mortality
    life - death = lifestyle

    drug + time = occasion
    drug - time = narcotic

    art + artist + money = creativity
    art + artist - money = muse

    happiness + politics = contentment
    happiness + art      = gladness
    happiness + money    = joy
    happiness + love     = joy
grey-area•3h ago
Does the system you’re querying ‘get it’? From the answers it doesn’t seem to understand these words or their relations. Once in a while it’ll hit on something that seems to make sense.
bee_rider•2h ago

    Life + death = mortality  
is pretty good IMO; it is a nice blend of the concepts in an intuitive manner. I don't really get

   drug + time = occasion
But

   drug - time = narcotic
Is kind of interesting; one definition of narcotic is

> a drug (such as opium or morphine) that in moderate doses dulls the senses, relieves pain, and induces profound sleep but in excessive doses causes stupor, coma, or convulsions

https://www.merriam-webster.com/dictionary/narcotic

So we can see some element of losing time in that type of drug. I guess? Maybe I’m anthropomorphizing a bit.

woodruffw•4h ago
colorless+green+ideas doesn't produce anything of interest, which is disappointing.
dmonitor•2h ago
well green is not a creative color, so that's to be expected
skeptrune•4h ago
This is super fun. Offering the ranked matches makes it significantly more engaging than just showing the final result.
spindump8930•4h ago
First off, this interface is very nice and a pleasure to use, congrats!

Are you using word2vec for these, or embeddings from another model?

I also wanted to add some flavor, since it looks like many folks in this thread haven't seen something like this - it's been known since 2013 that we can do this (but it's great to remind folks, especially with all the "modern" interest in NLP).

It's also known (in some circles!) that a lot of these vector arithmetic things need some tricks to really shine. For example, excluding the words already present in the query[1]. Others in this thread seem surprised at some of the biases present - there's also a long history of work on that [2,3].

[1] https://blog.esciencecenter.nl/king-man-woman-king-9a7fd2935...

[2] https://arxiv.org/abs/1905.09866

[3] https://arxiv.org/abs/1903.03862
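To make the query-word-exclusion trick from [1] concrete, here is a rough sketch of ranking neighbours by hand; `vocab` (a word list) and `vecs` (a matching row-normalized matrix) are assumed inputs, not anything from the demo:

    # Sketch: rank nearest neighbours of e.g. (king - man + woman), excluding query words.
    # Assumes `vocab` is a list of words and `vecs` a row-normalized numpy matrix.
    import numpy as np

    def analogy(vocab, vecs, positive, negative, topn=5):
        idx = {w: i for i, w in enumerate(vocab)}
        target = sum(vecs[idx[w]] for w in positive) - sum(vecs[idx[w]] for w in negative)
        target = target / np.linalg.norm(target)
        sims = vecs @ target                    # cosine similarity, since rows are unit length
        banned = set(positive) | set(negative)  # the trick: drop words already in the query
        ranked = np.argsort(-sims)
        return [(vocab[i], float(sims[i])) for i in ranked if vocab[i] not in banned][:topn]

Without the `banned` filter, the top hit for king - man + woman is very often just "king" again.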

nxa•3h ago
Thank you! I actually had a hard time finding prior work on this, so I appreciate the references.

The dictionary is based on https://wordnet.princeton.edu/, no word2vec. It's just a plain lookup among precomputed embeddings (with mxbai-embed-large). And yes, I'm excluding words that are present in the query.

It would be interesting to see how other models perform. I tried one (forgot the name) that was focused on coding, and it didn't perform nearly as well (in terms of human joy from the results).
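A rough sketch of that precompute step, assuming the sentence-transformers release of mxbai-embed-large and WordNet via NLTK (the demo's exact pipeline isn't published, so treat this as an approximation):

    # Sketch: one embedding per WordNet noun lemma, stored for later vector arithmetic.
    # Assumes the sentence-transformers checkpoint of mxbai-embed-large and NLTK's WordNet;
    # the demo's actual pipeline may differ.
    import numpy as np
    from nltk.corpus import wordnet as wn                 # requires nltk.download("wordnet")
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("mixedbread-ai/mxbai-embed-large-v1")

    nouns = sorted({lemma.name() for synset in wn.all_synsets(pos="n")
                    for lemma in synset.lemmas() if "_" not in lemma.name()})

    vecs = model.encode(nouns, normalize_embeddings=True, batch_size=256)
    np.save("noun_vectors.npy", vecs)                     # keep the word list alongside it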

kaycebasques•3h ago
(Question for anyone) how could I go about replicating this with Gemini Embedding? Generate and store an embedding for every word in the dictionary?
nxa•3h ago
Yes, that's pretty much what it is. Watch out for homographs.
7373737373•4h ago
it doesn't know the word human
grey-area•3h ago
As you might expect from a system with knowledge of word relations but without understanding or a model of the world, this generates gibberish which occasionally sounds interesting.
fallinghawks•3h ago
goshawk-cocaine = gyrfalcon , which is funny if you know anything about goshawks and gyrfalcons

(Goshawks are very intense, gyrs tend to be leisurely in flight.)

kataqatsi•3h ago
garden + sin = gardening

hmm...

MYEUHD•3h ago
king - man + woman = queen

queen - woman + man = drone

bee_rider•2h ago
The second makes sense, I think, if you are a bee.
blobbers•3h ago
rice + fish = fish meat

rice + fish + raw = meat

hahaha... I JUST WANT SUSHI!

godelski•3h ago

  data + plural = number
  data - plural = research
  king - crown = (didn't work... crown gets circled in red)
  king - princess = emperor
  king - queen = kingdom
  queen - king = worker
  king + queen = queen + king = kingdom
  boy + age = (didn't work... boy gets circled in red)
  man - age = woman
  woman - age = newswoman
  woman + age = adult female body (tied with man)
  girl + age = female child
  girl + old = female child
The other suggestions are pretty similar to the results I got in most cases. But I think this helps illustrate the curse of dimensionality (i.e. distances are ill-defined in high dimensional spaces). This is still quite an unsolved problem, and a pretty critical one that doesn't get enough attention.
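A quick illustration of the distance-concentration effect being alluded to: as dimensionality grows, the nearest and farthest of a batch of random points end up almost equally far from a query, which is part of why nearest-neighbour rankings get noisy.

    # Distance concentration: in high dimensions, nearest and farthest random points
    # become almost equally far from a query.
    import numpy as np

    rng = np.random.default_rng(0)
    for dim in (2, 10, 100, 1000):
        points = rng.normal(size=(10_000, dim))
        query = rng.normal(size=dim)
        d = np.linalg.norm(points - query, axis=1)
        print(f"dim={dim:5d}  relative spread (max-min)/min = {(d.max() - d.min()) / d.min():.3f}")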
Affric•3h ago
Yeah I did similar tests and got similar results.

Curious tool but not what I would call accurate.

n2d4•2h ago
For fun, I pasted these into ChatGPT o4-mini-high and asked it for an opinion:

   data + plural    = datasets
   data - plural    = datum
   king - crown     = ruler
   king - princess  = man
   king - queen     = prince
   queen - king     = woman
   king + queen     = royalty
   boy + age        = man
   man - age        = boy
   woman - age      = girl
   woman + age      = elderly woman
   girl + age       = woman
   girl + old       = grandmother

The results are surprisingly good; I don't think I could've done better as a human. But keep in mind that this doesn't do embedding math like OP! Although it does show how generic LLMs can solve some tasks better than traditional NLP.

The prompt I used:

> Remember those "semantic calculators" with AI embeddings? Like "king - man + woman = queen"? Pretend you're a semantic calculator, and give me the results for the following:
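Scripting the same experiment against a chat API is straightforward; a hedged sketch with the openai Python client (the model name below is a placeholder, substitute whatever you have access to):

    # Sketch: ask a chat model to act as a "semantic calculator" instead of doing vector math.
    # The model name is a placeholder; use whichever model you have access to.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    prompt = (
        'Remember those "semantic calculators" with AI embeddings? '
        'Like "king - man + woman = queen"? Pretend you\'re a semantic calculator, '
        "and give me the results for the following:\n"
        "data + plural\ndata - plural\nking - crown"
    )

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[{"role": "user", "content": prompt}],
    )
    print(resp.choices[0].message.content)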

nbardy•2h ago
I hate to be pedantic, but the LLM is definitely doing embedding math. In fact, that's all it does.
franga2000•1h ago
This is an LLM approximating a semantic calculator, based solely on trained-in knowledge of what that is and probably a good amount of sample output, yet somehow beating the results of a "real" semantic calculator. That's crazy!

The more I think about it the less surprised I am, but my initial thoughts were quite simply "no way" - surely an approximation of an NLP model made by another NLP model can't beat the original, but the LLM training process (and data volume) is just so much more powerful I guess...

CamperBob2•1h ago
This is basically the whole idea behind the transformer. Attention is much more powerful than embedding alone.
gweinberg•2h ago
I got a bunch of red stuff also. I imagine the author cached embeddings for some words but not really all that many to save on credits. I gave it mermaid - woman and got merman, but when I tried to give it boar + woman - man or ram + woman - man, it turns out it has never heard of rams or boars.
thatguysaguy•1h ago
Can you elaborate on what the unsolved problem you're referring to is?
mathgradthrow•1h ago
Distance is extremely well defined in high dimensional spaces. That isn't the problem.
ericdiao•3h ago
Interesting: parent + male = female (83%)

Cannot personally find the connection here; I was expecting father or something.

ericdiao•3h ago
Though dad is in the list with lower confidence (77%).

High-dimensional vectors are always hard to explain. This is an example.

TZubiri•3h ago
I'm getting Navratilova instead of queen. And I can't get other words to work; I get red circles or no answer at all.
gus_massa•3h ago
From another comment (https://news.ycombinator.com/item?id=43988861): King (with a capital K) was interpreted as a former world No. 1 tennis player.
nxa•3h ago
This might be helpful: I haven't implemented it in the UI, but from the API response you can see what the word definitions are, both for the input and the output. If the output has homographs, the likelihood is split per definition, but the UI only shows the best one.

Also, in case it gets buried in the comments: proper nouns need to be capitalized (Paris-France+Germany).

I am planning on patching up the UI based on your feedback.

ericdiao•3h ago
wine - alcohol = grape juice (32%)

Accurate.

afandian•3h ago
There was a site like this a few years ago (before all the LLM stuff kicked off) that had this and other NLP functionality. Styling was grey and basic. That’s all I remember.

I’ve been unable to find it since. Does anyone know which site I’m thinking of?

halter73•2h ago
I'm not sure this is old enough, but could you be referencing https://neal.fun/infinite-craft/ from https://news.ycombinator.com/item?id=39205020?
montebicyclelo•3h ago
> king-man+woman=queen

Is the famous example everyone uses when talking about word vectors, but is it actually just very cherry picked?

I.e., are there a great number of other "meaningful" examples like this, or do you actually end up, the majority of the time, with some kind of vaguely tangentially related word when adding and subtracting word vectors?

(Which seems to be what this tool is helping to illustrate, having briefly played with it, and looked at the other comments here.)

(Btw, not saying wordvecs / embeddings aren't extremely useful, just talking about this simplistic arithmetic)
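One way to answer the cherry-picking question empirically is to score a whole analogy benchmark rather than hand-picked examples; a sketch using gensim's bundled Google analogy set (accuracy comes out well short of 100%, which supports the "partly cherry-picked" reading):

    # Sketch: score the classic Google analogy benchmark instead of single examples.
    import gensim.downloader as api
    from gensim.test.utils import datapath

    model = api.load("glove-wiki-gigaword-100")
    score, sections = model.evaluate_word_analogies(datapath("questions-words.txt"))
    print(f"overall analogy accuracy: {score:.1%}")  # well short of 100%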

raddan•2h ago
> is it actually just very cherry picked?

100%

gregschlom•2h ago
Also, as I just learned the other day, the result vector was never exactly equal to "queen"; it was just the closest word in the vector space.
Retr0id•2h ago
I think it's slightly uncommon for the vectors to "line up" just right, but here are a few I tried:

actor - man + woman = actress

garden + person = gardener

rat - sewer + tree = squirrel

toe - leg + arm = digit

groby_b•2h ago
I think it's worth keeping in mind that word2vec was specifically trained on semantic similarity. Most embedding APIs don't really give a lick about the semantic space

And, worse, most latent spaces are decidedly non-linear. And so arithmetic loses a lot of its meaning. (IIRC word2vec mostly avoided nonlinearity except for the loss function). Yes, the distance metric sort-of survives, but addition/multiplication are meaningless.

(This is also the reason choosing your embedding model is a hard-to-reverse technical decision - you can't just transform existing embeddings into a different latent space. A change means "reembed all")

jbjbjbjb•58m ago
Well when it works out it is quite satisfying

India - Asia + Europe = Italy

Japan - Asia + Europe = Netherlands

China - Asia + Europe = Soviet-Union

Russia - Asia + Europe = European Russia

calculation + machine = computer

bee_rider•13m ago
Hmm, well I got

    cherry - picker = blackwood
if that helps.
jumploops•3h ago
This is super neat.

I built a game[0] along similar lines, inspired by infinite craft[1].

The idea is that you combine (or subtract) “elements” until you find the goal element.

I’ve had a lot of fun with it, but it often hits the same generated element. Maybe I should update it to use the second (third, etc.) choice, similar to your tool.

[0] https://alchemy.magicloops.app/

[1] https://neal.fun/infinite-craft/

ezbie•3h ago
Can someone explain to me what the fuck this is supposed to be!?
mhitza•2h ago
Semantic subtraction within an embedding representation of text ("meaning")
matallo•2h ago
uncle + aunt = great-uncle (91%)

great idea, but I find the results unamusing

HWR_14•2h ago
Your aunt's uncle is your great-uncle. It's more correct than your intuition.
matallo•2h ago
I asked ChatGPT (after posting my comment) and this is the response. "Uncle + Aunt = Great-Uncle is incorrect. A great-uncle is the brother of your grandparent."
lcnPylGDnU4H9OF•2h ago
Some of these make more sense than others (and bookshop is hilarious even if it's only the best answer by a small margin; no shade to bookshop owners).

  map - legend = Mercator projection
  noodle - wheat = egg noodle
  noodle - gluten = tagliatelle
  architecture - calculus = architectural style
  answer - question = comment
  shop - income = bookshop
  curry - curry powder = cuisine
  rice - grain = chicken and rice
  rice + chicken = poultry
  milk + cereal = grain
  blue - yellow = Fiji
  blue - Fiji = orange
  blue - Arkansas + Bahamas + Florida - Pluto = Grenada
kylecazar•2h ago
Woman + president = man
tlhunter•2h ago
man + woman = adult female body
__MatrixMan__•2h ago
Here's a challenge: find something to subtract from "hammer" which does not result in a word that has "gun" as a substring. I've been unsuccessful so far.
neom•1h ago
If I'm allowed only one something, I can't find anything either; if I'm allowed a few somethings, "hammer - wine - beer - red - child" will get you there. Guessing that since a gun has a hammer and is also a tool, it's too heavily linked in the small dataset.
tough•1h ago
hammer + man = adult male body (75%)
rdlw•58m ago
Close, that's addition
Retr0id•1h ago
Well that's easy, subtract "gun" :P
mrastro•1h ago
The word "gun" itself seems to work. Package this as a game and you've got a pretty fun game on your hands :)
downboots•1h ago
Bullet
aniviacat•1h ago
Gun related stuff works: bullet, holster, barrel

Other stuff that works: key, door, lock, smooth

Some words that result in "flintlock": violence, anger, swing, hit, impact

soxfox42•44m ago
hammer - red = lock
neom•2h ago
Cool, but not enough data to be useful yet, I guess. Most of mine either didn't have the words or were a few % off the answer: vehicle - road + ocean gave me hydrosphere, but the other options below were boat, ship, etc. Klimt almost made it from Mozart - music + painting. doctor - hospital + school = teacher, nailed it.

Getting to cornbread elegantly has been challenging.

downboots•2h ago
three + two = four (90%)
LadyCailin•1h ago
Haha, yes, this was my first thought too. It seems it’s quite bad at actual math!
yigitkonur35•1h ago
shows how bad embeddings are in a practical way
rdlw•1h ago
I've always wondered if there's a way to find which vectors are most important in a model like this. The gender vector man-woman or woman-man is the one always used in examples, since English has many gendered terms, but I wonder if it's possible to generate these pairs given the data. Maybe list all differences of pairs of vectors and see if there are any clusters. I imagine some grammatical features would show up, like the plurality vector people-person, or the past tense vector walked-walk, but maybe there would be some that are surprisingly common but don't seem to map cleanly to an obvious concept.

Or maybe they would all be completely inscrutable and man-woman would be like the 50th strongest result.
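This is cheap to try: sample word pairs, take their difference vectors, and cluster them; directions like gender, plurality, or tense should show up as coherent clusters. A rough sketch, assuming `model` is a loaded gensim KeyedVectors (e.g. glove-wiki-gigaword-100):

    # Sketch of the idea: cluster difference vectors of random word pairs and inspect
    # which pairs land together. Assumes `model` is a loaded gensim KeyedVectors.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    words = model.index_to_key[:20_000]               # restrict to frequent words
    pairs = [(words[i], words[j])
             for i, j in rng.integers(0, len(words), size=(30_000, 2)) if i != j]
    diffs = np.array([model[a] - model[b] for a, b in pairs])

    labels = KMeans(n_clusters=50, n_init=10, random_state=0).fit_predict(diffs)
    for c in range(5):                                # peek at a few clusters
        print(c, [p for p, l in zip(pairs, labels) if l == c][:8])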

Jimmc414•1h ago
dog - cat = paleolith

paleolith + cat = Paleolithic Age

paleolith + dog = Paleolithic Age

paleolith - cat = neolith

paleolith - dog = hand ax

cat - dog = meow

Wonder if some of the math is off or I am not using this properly

downboots•1h ago
mathematics - Santa Claus = applied mathematics

hacker - code = professional golf

quantum_state•1h ago
The app produces nonsense ... such as quantum - superposition = quantum theory !!!
nxa•39m ago
artificial intelligence - bullsh*t = computer science (34%)
behnamoh•33m ago
This. I'm tired of so many "it's over, shocking, game changer, it's so over, we're so back" announcements that turn out to be just gpt-wrappers or resume-builder projects.

Papers that actually say something meaningful are left unnoticed, but as soon as you say something generic like "language models can do this", it gets featured in "AI influencer" posts.

galaxyLogic•30m ago
What about starting with the result and finding a set of words that, when summed together, give that result?

That could be seen as trying to find the true "meaning" of a word.
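That inverse problem can be approximated greedily, matching-pursuit style: keep picking the vocabulary word that best explains what's left of the target vector. A sketch, again assuming a loaded gensim KeyedVectors as `model` (not the demo's own data):

    # Greedy sketch of the inverse problem: which words, summed, approximate a target word?
    # Assumes `model` is a loaded gensim KeyedVectors (e.g. glove-wiki-gigaword-100).
    import numpy as np

    def decompose(target_word, k=3, vocab_size=20_000):
        residual = model[target_word].astype(np.float64).copy()
        banned, picks = {target_word}, []
        for _ in range(k):
            r = residual / np.linalg.norm(residual)
            best = max((w for w in model.index_to_key[:vocab_size] if w not in banned),
                       key=lambda w: float(np.dot(model[w], r) / np.linalg.norm(model[w])))
            picks.append(best)
            banned.add(best)
            residual -= model[best]
        return picks

    print(decompose("queen"))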

GrantMoyer•18m ago
These are pretty good results. I messed around with a dumber and more naive version of this a few years ago[1], and it wasn't easy to get sensible output most of the time.

[1]: https://github.com/GrantMoyer/word_alignment
