I still can't understand the name of the "Attention Is All You Need" paper.
yorwba•1h ago
The "attention is all you need" paper did not invent attention mechanisms. It showed that existing models that were already using attention could have their non-attention parts removed and still worked. So those other parts were unnecessary and only attention was needed.
JSR_FDED•1h ago
Cute riff on the title of the Google paper, kind of like the endless plays on "GOTO considered harmful". Ironic, though, because it's a way of drawing attention, which he denounces in his article:
> Real technology isn’t being built by this new cohort of founders, and we’ve entered into an era of hyper competitive degeneracy, fighting for attention.