Oh dang, branch hints. I always thought they were so obvious, and never implemented, so they must be obviously bad. But, Intel is giving them a shot. Neat!
Giving them another shot. Pentium 4 had them, but there were some skill issues on the part of programmers using them, and so code quality rose when CPUs started ignoring them.
Do compilers even generate these hints?
Usually not by default, but GCC and other compilers have intrinsics like __builtin_expect [1] which may generate branch hints.
[1]: https://gcc.gnu.org/onlinedocs/gcc/Other-Builtins.html#index...
IIRC, SPARC also used them, which... probably suggests that it's not totally terrible.
I believe PowerPC had them as well. No idea how effective they were.
I took computer architecture in the mid-to-late '90s, and branch hints were talked about as a thing that was being done.
"AI has been very popular among people wishing they had seven fingers on each hand instead of five"
But most unfortunate for Numbers. We have 4 fingers, making binary capabilities, now if we had 60 fingers... then it would be a different story. With 5 fingers on each hand, and two hands, that is the product of two primes, and with 7 fingers on 3 hands, also the product of two primes... You have something to look forward to in Genetic Engineering 2077. Eight fingers on 4 hands? Personally, my fingers count only to 1023, and have for decades.
The article points out specific techniques for probing cache behavior and other microarchitectural effects, which is valuable.
Think of how much more you could get done
I call it "the stranger".
People downvoting you probably don't realise this is a direct quote from the article. Maybe if you had written some actual comment, too...
Sorry about no additional comment, I couldn't think of something clever that included an Inigo Montoya reference...
It was a no-content throwaway comment in the article too; highlighting it on HN is pointless. I doubt the downvoters would care much whether it was in the article or not.