They are also quite good at translating poorly written, only semi-coherent writing, which can be incredibly useful if the person you're communicating with is sloppy.
The whole LLM scene today grew out of how important context is to translation. The "Attention Is All You Need" paper came from Google's translation work, as the team looked for better ways to capture the context of words and carry it across in a translation.
At some point people started asking the translation model to "translate from English to English as if you're an AI assistant".
Anyway, it shouldn't surprise anyone that LLMs are good at translation. The real surprise was how powerful a translation engine that understands context turned out to be!
The translation transformer could also peek ahead in the context window (its encoder attends to the whole input sentence), while (most?) LLMs today are decoder-only and attend only to previous tokens.
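A minimal sketch (assuming NumPy and toy dimensions, not any particular model's code) of what that difference looks like as attention masks: the encoder of the original translation transformer lets every token attend to every other token, while a decoder-only LLM applies a causal mask so each token only sees what came before it.

    import numpy as np

    def attention(q, k, v, mask):
        """Scaled dot-product attention; positions where mask is False are ignored."""
        scores = q @ k.T / np.sqrt(q.shape[-1])
        scores = np.where(mask, scores, -np.inf)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ v

    seq_len, d = 4, 8
    rng = np.random.default_rng(0)
    x = rng.standard_normal((seq_len, d))

    # Encoder-style (translation transformer): every token can attend to
    # every other token, so later words can inform earlier ones.
    full_mask = np.ones((seq_len, seq_len), dtype=bool)

    # Decoder-only LLM style: causal mask, token i only sees tokens 0..i.
    causal_mask = np.tril(np.ones((seq_len, seq_len), dtype=bool))

    print(attention(x, x, x, full_mask))    # uses future context
    print(attention(x, x, x, causal_mask))  # past-only context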
You see this with recent automated translation on YouTube. If the creator of (say) an English-language video doesn't upload subtitles, YouTube automatically creates them based on the audio, but they lack punctuation and have nonsense phrases. The AI-driven translation of those subtitles to other languages cleans up the text along the way, so the end result is that non-English speakers get better subtitles than English speakers.
I wonder if one could put this larger ROM and the other files into a custom-built image so no swaps are required.
They had enough room left in the 512KB ROM to fit a 357KB boot disk: a stripped-down System 6.0.3 and a few useful tools (MacsBug and AppleShare Prep).
hoherd•4h ago
https://infinitemac.org/1996/KanjiTalk%207.5.3