Do most GPUs made for AI even have a graphical output buffer and a video output any more?
This is cool! I love this kind of simulation GPU programming stuff. Reminds me of this awesome talk from Peter Whidden: https://youtu.be/Hju0H3NHxVI?si=V_UZugPSL9a8eHEM
Not as technical, but similarly cool.
The thought expressed in the title came to my mind when I saw Nvidia described as an "AI company" in the press recently...
To be fair, the percentage of their revenue derived from AI-related sales is much higher now than it used to be. Why is that description not accurate?
This is awesome. It also brought back some anxiety from >10 years ago in college that reminds me that computer graphics and my brain do not agree whatsoever.
Graphics is trivial until you get to shadows and lighting. Then all the simple tricks stop working.
Everything's just triangles and numbers, and my brain's no good with numbers. Linear algebra I can do though.
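The "lighting is where the simple tricks stop working" point above can be made concrete with the most basic lighting model there is. Here's a minimal sketch of Lambertian diffuse shading, which is really just the linear algebra mentioned: a clamped dot product between the surface normal and the light direction. All names here are illustrative, not from any particular engine.

```python
import math

def normalize(v):
    # Scale a vector to unit length.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lambert(normal, light_dir):
    """Diffuse intensity: clamped dot product of two unit vectors."""
    n = normalize(normal)
    l = normalize(light_dir)
    dot = sum(a * b for a, b in zip(n, l))
    return max(0.0, dot)  # surfaces facing away from the light get 0

# Light head-on gives full intensity; light from behind gives none.
print(lambert((0, 0, 1), (0, 0, 1)))   # 1.0
print(lambert((0, 0, 1), (0, 0, -1)))  # 0.0
```

This is the easy part; shadows are hard precisely because they need global information (what's between this point and the light?) that a per-surface dot product like this can't see.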
Not always. Disregarding CSG and parametric surfaces, Nvidia itself was almost buried for not adhering to that philosophy with its first product: https://en.wikipedia.org/wiki/NV1
Funny side note: SEGA invested $5M in Nvidia after that fiasco to keep them alive. They sold that equity around Nvidia's IPO for roughly $15M. Had they kept it, it would be worth about $3B today. SEGA's market cap is around $4B today.
Funny enough, Nvidia's first 3D accelerator rendered quadratic surfaces instead of triangles.
It's not the numbers that freak me out, it's what they do to each other...