Originally posted by: VirtualLarry
Originally posted by: Blastman
No. NV has been working very closely with id on optimizing Doom3. Is it any wonder it runs comparatively well on NV cards?
Is that quote from Beyond3D?
The quote is from me. Nvidia probably IS "cheating" to some degree, recognizing the Doom shaders and substituting optimized ones, because I have found that making some innocuous changes causes the performance to drop all the way back down to the levels it used to run at.
Carmack refers to that in the HardOCP article, too:
On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.
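To make that concrete, here is a minimal, purely hypothetical sketch in C of the kind of detect-and-substitute scheme being described: the driver fingerprints the incoming fragment program text and swaps in a hand-tuned replacement only on an exact match. Every name, the hash choice, and the shader text are invented for illustration; this is not actual NVIDIA driver code, just a plausible mechanism consistent with what Carmack observed.

```c
/* Hypothetical sketch of shader detection-and-substitution in a driver.
 * All names and values are invented for illustration. */
#include <stdio.h>

/* FNV-1a hash: a cheap way to fingerprint shader source text. */
static unsigned long fnv1a(const char *s)
{
    unsigned long h = 2166136261UL;
    while (*s)
        h = (h ^ (unsigned char)*s++) * 16777619UL;
    return h;
}

static unsigned long known_hash; /* fingerprint of the shipped shader */
static const char *tuned_replacement =
    "!!ARBfp1.0\n# hand-scheduled equivalent (placeholder)\nEND\n";

/* Called where the driver would normally compile the app's program. */
static const char *select_program(const char *src)
{
    if (fnv1a(src) == known_hash)
        return tuned_replacement;  /* exact match: take the fast path */
    return src;                    /* any edit misses the lookup */
}

int main(void)
{
    const char *shipped = "!!ARBfp1.0\n# interaction.vfp as shipped\nEND\n";
    const char *edited  = "!!ARBfp1.0\n# interaction.vfp, harmless tweak\nEND\n";

    known_hash = fnv1a(shipped);   /* what the driver recorded */

    printf("shipped shader: %s\n",
           select_program(shipped) == tuned_replacement
               ? "substituted (fast path)" : "compiled as-is");
    printf("edited shader:  %s\n",
           select_program(edited) == tuned_replacement
               ? "substituted (fast path)" : "compiled as-is (fell off)");
    return 0;
}
```

Note how this matches the symptom: the substitution is keyed to the exact shipped program, so even a change that shouldn't affect the compiled result defeats the lookup and performance "falls off the fast path".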
Reading between the lines, I would say that he is basically saying up-front, "Both ATI and NV cheat. Live with it. Both give good frame rates, so just enjoy the game and stop worrying about it."
For those that DO care, though, I suppose the other interpretation could be, "none of these benchmark numbers are accurate, because both ATI and NV cheat".
One other thing, independent of the "cheats": I saw it mentioned that the D3 "demo1" benchmark doesn't run the game AI/physics, only the 3D engine (and sounds?). So actual in-game framerates will necessarily be *slower* than those shown. Additionally, since the demo appears to be CPU-bound at 1024x768 with no AA/AF on the 6800/X800 cards, it appears that D3 may require quite a bit of CPU power, perhaps more so than graphics-card power. (Since the 3D engine alone is already eating 100% of the CPU on the systems those cards were tested in, frame rates are bound to drop lower once you start playing the actual game. How much lower remains to be seen.)
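A back-of-the-envelope model shows why the timedemo numbers are a ceiling. Assuming CPU and GPU work overlap fully, frame time is roughly max(CPU time, GPU time): a CPU-limited demo posts the same fps across resolutions, and adding the AI/physics work the timedemo skips pushes fps straight down. All the millisecond figures below are invented, just to illustrate the arithmetic:

```c
/* Toy frame-time model: fps = 1000 / max(cpu_ms, gpu_ms).
 * All numbers are made up for illustration. */
#include <stdio.h>

static double fps(double cpu_ms, double gpu_ms)
{
    double frame_ms = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
    return 1000.0 / frame_ms;
}

int main(void)
{
    double renderer_cpu = 16.0; /* CPU cost of driving the 3D engine */
    double game_cpu     = 6.0;  /* AI/physics work the timedemo skips */
    double gpu_1024     = 10.0; /* GPU time at 1024x768, no AA/AF */
    double gpu_1600     = 15.0; /* GPU time at 1600x1200 */

    /* CPU-bound: same fps at both resolutions, just like the demo. */
    printf("timedemo 1024x768 : %.1f fps\n", fps(renderer_cpu, gpu_1024));
    printf("timedemo 1600x1200: %.1f fps\n", fps(renderer_cpu, gpu_1600));

    /* Add the game logic back in and the framerate drops. */
    printf("in-game  1024x768 : %.1f fps\n",
           fps(renderer_cpu + game_cpu, gpu_1024));
    return 0;
}
```

With these made-up numbers the timedemo reads 62.5 fps at both resolutions (the telltale sign of a CPU limit), while adding 6 ms of game logic per frame drops it to about 45 fps, even though the graphics card is doing exactly the same work.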