> If cosmic rays flip bits in storage or on the network, that can be detected through error coding. But there's no analogy for a CPU that allows cheap online verification of its correctness.
Note that a CPU can be modeled largely as a set of memory cells, so some memory-correction techniques can probably be applied to CPUs as well. https://bailleux.net/pub/ob-project-gray1.pdf
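To make the "memory correction" idea concrete, here's a minimal sketch of the classic Hamming(7,4) code, the same single-error-correcting scheme ECC memory is built on. This is just an illustration of the coding idea, not anything from the linked paper:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7)."""
    p1 = d[0] ^ d[1] ^ d[3]  # covers positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]  # covers positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]  # covers positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_correct(c):
    """Locate and fix any single flipped bit, then return the data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the flip, 0 if none
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[4] ^= 1                          # simulate a cosmic-ray bit flip
assert hamming74_correct(code) == data
```

The hard part, as the linked project suggests, is that a CPU's state is scattered across pipeline latches and logic rather than sitting in one addressable array, so wrapping it in a code like this is far less straightforward than for DRAM.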
Maybe not cheap, but back in the '80s the Tandem NonStop systems used duplicated or triplicated CPUs to check calculation results for correctness.
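The triplicated-CPU approach is triple modular redundancy: run the same computation on independent units and take a majority vote, so a single faulty unit gets outvoted. A toy sketch (the `bad_alu` "stuck" unit is hypothetical, standing in for one faulty CPU):

```python
from collections import Counter

def vote(results):
    """Majority vote over redundant results; raise if there is no majority."""
    winner, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: uncorrectable fault")
    return winner

# Three independent "execution units"; the second has a deterministic fault.
def good_alu(a, b): return a + b
def bad_alu(a, b):  return a + b + 1   # hypothetical faulty unit

units = [good_alu, bad_alu, good_alu]
results = [alu(2, 2) for alu in units]  # [4, 5, 4]
assert vote(results) == 4               # the faulty unit is outvoted
```

The cost is the obvious one: roughly 3x the hardware (or 2x for detect-only dual lockstep, which can flag a mismatch but not say which unit is right).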
Are these errors consistent for the same instructions? For example, will the same ALU always compute 2+2 as 5, or will it spontaneously produce 5 once and then not again for a "long while"?
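The distinction matters for detection strategy: a transient flip can be caught by simply re-executing and comparing, while a deterministic "2+2=5" fault reproduces identically and sails right through that check. A hypothetical sketch:

```python
def checked(fn, *args, retries=1):
    """Re-execute and compare: catches transient faults, not deterministic ones."""
    first = fn(*args)
    for _ in range(retries):
        if fn(*args) != first:
            raise RuntimeError("transient fault detected; re-run the computation")
    return first

# A deterministic fault passes the check every time:
def stuck_alu(a, b): return a + b + 1   # hypothetical always-wrong unit
assert checked(stuck_alu, 2, 2) == 5    # silently wrong; re-execution can't see it
```

That's why deterministic faults need comparison against an independent unit (or a known-good reference), not just repetition on the same hardware.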
It's not just CPUs that can cause Silent Data Corruption errors (SDCs). Essentially any chip in the system can give bad results, and those bad results are often not detected.
Funny they should mention Google. Isn’t that the company whose chatbot regularly gets things wrong?
LLM limitations are different from silent data corruption errors.