https://www.fujitsu.com/global/about/resources/news/press-re...
ARM is supposed to be releasing its own AI chip soon, so I'm not really sure what's going on here, other than that all these major companies just see incredible demand for chips:
https://www.ft.com/content/95367b2b-2aa7-4a06-bdd3-0463c9bad...
Would it be wrong to say these will probably be our inferencing chips instead of GPUs going forward?
My only worry would be that the software model of compute ROCm has is going to be drastically different from the hardware model of compute the Fujitsu chip has. This would be like using one of those CUDA compilers that target AMD hardware. Sure, you can produce something that will run, but depending on what you are compiling, it will run dog slow.
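To make that concrete, here's a minimal CUDA sketch of the kind of GPU-model code I mean (the kernel is hypothetical, nothing Fujitsu- or ROCm-specific): it hard-codes 32-lane warps and assumes thousands of threads are basically free. A compiler can retarget something like this to hardware with a very different compute model, but it has to emulate those baked-in assumptions, and that's where the "runs, but dog slow" comes from.

    // Warp-level sum reduction. Idiomatic on NVIDIA GPUs, but it bakes in
    // the SIMT model: 32-lane warps, __shfl_down_sync, and the assumption
    // that launching ~a million threads is cheap. On hardware with a
    // different compute model (e.g. wide CPU vectors), the warp shuffle
    // has to be emulated, which is where the slowdown comes from.
    #include <cstdio>

    __global__ void warp_sum(const float* in, float* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        float v = (i < n) ? in[i] : 0.0f;

        // Reduce within a 32-lane warp; lane 0 ends up with the warp's sum.
        for (int offset = 16; offset > 0; offset >>= 1)
            v += __shfl_down_sync(0xffffffff, v, offset);

        // Lane 0 of each warp adds its partial sum to the global total.
        if ((threadIdx.x & 31) == 0)
            atomicAdd(out, v);
    }

    int main() {
        const int n = 1 << 20;
        float *in, *out;
        cudaMallocManaged(&in, n * sizeof(float));
        cudaMallocManaged(&out, sizeof(float));
        for (int i = 0; i < n; ++i) in[i] = 1.0f;
        *out = 0.0f;

        warp_sum<<<(n + 255) / 256, 256>>>(in, out, n);
        cudaDeviceSynchronize();
        printf("sum = %f\n", *out);  // expect 1048576.0

        cudaFree(in);
        cudaFree(out);
        return 0;
    }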
LargoLasskhyfv•9mo ago
https://morethanmoore.substack.com/p/a-quiet-giant-steps-for...
ahartmetz•9mo ago