Originally posted by: Genx87
You might want to refine your sense of irony. When nV substituted shaders in 3DM03, they did so without telling anyone until they were caught, they used their whole company to discredit 3DM03, and they never once apologized for cheating in a benchmark.
Not quite. I am sure if we go back a year or so we will see that the same people who praise this were bashing Nvidia for doing the same thing. So I ask: what is the difference? Both are hacks; neither gives the same exact pic quality.
^Telltale sign of a fanboy, hyperbole to the point of lying.
------------------------------------------
Why are you pandering to the lowest common denominator in this discussion? nV never "gave out" driver tweaks; it quietly slipped them into its drivers without telling anyone. nV certainly was cheating, though the devil comment is hyperbolic fanboy talk.
Hyperbolic fanboy talk also results in people claiming "ID is working against ATI." Why do you bother listening to people ignorant or biased enough to make such statements? Do you enjoy bringing this forum down to the level of soccer hooligans? Who or what exactly is the "it" that is proclaiming that "ID is working against ATI," BTW? Surely you can't mean ATI itself, as it hasn't commented on this--but then, who?
Did you even bother to read the original poster's post?
Do you ever bother to acknowledge it when someone makes a strong point, namely that what you are doing is helping to fuel the flames?
This frothy-mouthed trollship of the first order has to stop, and insecure nVidiots need to give this schtick up!
------------------------------------------
Now that that is out of the way, on to the real discussion (i.e., not a flame-fest!).
Alright, surely you guys have seen the benchmarks! Nvidia crushes ATI in Doom3! The 6800 series makes mincemeat out of everything, often by a twofold margin at lower resolutions, and by 35%+ gaps at high resolutions. The card of the day is the 6800GT, and nothing is going to change that. Heck, I'm dying to get a 6800GT, but the $400 price tag is just too much for me to justify to myself!
With that said, use some objectivity and reason here. Yes, Doom3 was custom-made for Nvidia hardware - there are some good articles on the web about how ATI's "fast Z clear" is not a good match for D3, for example, as well as other techniques that simply run better on Nvidia's hardware than on ATI's in Doom3. Nevertheless, is this not an (at least slightly) uncharacteristic throttling of ATI by Nvidia, particularly when comparing the 16-pipe X800XT with the two 16-pipe 6800GT and 6800 Ultra cards? They kick the X800XT's arse, and make the Pro look like last-gen tech, especially at lower resolutions (1280 and below).
If you guys took the time to even skim through the Beyond3D article, instead of jumping into this flame-fest feet first, you'd see that the essence of this optimization is a very simple idea: Nvidia is much faster at texture lookups (which is what the stock shader does), while ATI can sometimes do the equivalent calculation faster in math.
And if you read even a little deeper into the B3D article, you'd see that by 'doing the math,' so to speak, and using this "shader replacement," you get results similar to, if not more precise than, the stock way of doing it (hint: read past page 5, where Humus' results were still coming out inferior). Of course, it actually hurts performance on Nvidia cards because, as said above, Nvidia does texture lookups faster than working it out; Nvidia essentially custom-tailored their cards to be Doom3 beasts out of the box, and tweaks like this generally make the Nvidia cards perform worse.
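To make that lookup-versus-math tradeoff concrete, here's a rough C sketch. This is purely illustrative and not Doom3's actual shader code - the names (spec_table, SPEC_EXP, specular_lookup, specular_math) are made up for the example. The precomputed table stands in for the lookup texture the stock shader samples, and powf() stands in for computing the falloff directly, the way the Humus tweak does.

[code]
/* Illustrative C sketch of the lookup-vs-math tradeoff (NOT Doom3's
 * actual shader code; spec_table, SPEC_EXP, etc. are made-up names).
 * The "stock" path reads a precomputed specular falloff out of a table,
 * the way the game samples a lookup texture; the replacement path just
 * computes pow() directly. */
#include <math.h>
#include <stdio.h>

#define TABLE_SIZE 256
#define SPEC_EXP   16.0f          /* assumed specular exponent */

static float spec_table[TABLE_SIZE];

/* Build the table once, like baking the lookup texture ahead of time. */
static void build_spec_table(void)
{
    for (int i = 0; i < TABLE_SIZE; i++) {
        float n_dot_h = (float)i / (TABLE_SIZE - 1);
        spec_table[i] = powf(n_dot_h, SPEC_EXP);
    }
}

/* "Texture lookup" path: one memory fetch per pixel. */
static float specular_lookup(float n_dot_h)
{
    int idx = (int)(n_dot_h * (TABLE_SIZE - 1));
    return spec_table[idx];
}

/* "Do the math" path: no fetch, just ALU work per pixel. */
static float specular_math(float n_dot_h)
{
    return powf(n_dot_h, SPEC_EXP);
}

int main(void)
{
    build_spec_table();
    float n_dot_h = 0.85f;
    printf("lookup: %f  math: %f\n",
           specular_lookup(n_dot_h), specular_math(n_dot_h));
    return 0;
}
[/code]

Which path is faster is purely a hardware question - a card that fetches quickly wins with the table, a card with ALU muscle to spare wins with the math - which is exactly why the same tweak speeds up one vendor's boards and slows down the other's.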
John Carmack's counterpoint to these types of optimizations is that he wanted to standardize shaders across the board, instead of introducing tons of custom shaders and paths for everything, which in turn greatly simplifies troubleshooting across hardware configurations.
Both sides make excellent points - I agree 100% with JC's decision to standardize shaders for the game, and I agree 100% with Humus' decision to play around with the shaders and, being an ATI developer as well as a hardware enthusiast, use his ATI hardware know-how to tweak a setting in D3 to work better on the hardware he knows best.
If we can take this for what it is - a clever hack by a knowledgeable hobbyist who works for ATI, which can give ATI users a tangible, if (IMO) somewhat underwhelming, speed boost - then what is wrong with that?
Instead, the mere thought of ATI getting better performance from an unofficial hack triggers a tidal wave of name-calling and mud-slinging, accusations and biases. Never mind the fact that the ATI cards still get crushed by the 6800 series!
-------------------------------
This isn't one of the optimizations of yore from Nvidia, which artificially leveled the playing field with inferior IQ to make performance +/- 10% equal; this is a shot in the arm to ATI's rendering speed that just makes their cards less crippled, and it isn't being force-fed to anyone through either company's drivers - it's being shared on a hardcore 3D discussion forum!
Rant over - sometimes the internet can be such a drag!