Originally posted by: BFG10K
The Far Cry benching tool I used doesn't support it, neither does Doom3 (AFAIK).
Like I said, maybe next year you'll figure it out, along with the ability to move that AF slider of yours.
The problem with your "logic" is that it doesn't take into account that I game daily
How can you game daily and not finish games unless you're playing MP? Which then begs the question of whether you're using HDR and soft shadows at slideshow settings in MP?
Are you?
If not, then when exactly are you using these "must have" settings?
Have you ever finished Riddick or Far Cry?
16X AF is not "free" and never has been.
If you're going to argue with that simpleton logic, then the same thing could be said about 8x too.
Running at diminished-IQ "Performance" settings doesn't count, because upping the filtering while decreasing the detail sort of defeats the purpose.
So I run at 16x and you run at 8x and you've come to the conclusion that I'm reducing detail? Do you even know what AF is?
LOL- I'll "limp along" at my 19X14 4X TAA 8X AF,
You will? So which games will you be using soft shadows and HDR at those settings Rollo? You know, the features that make "OMG nVidia a must have" that you keep constantly pimping?
Tell me Rollo, which games?
Where does your X800XL fall in the hierarchy?
That depends on the game. Like I said, sometimes it's faster than a 6800U and sometimes it isn't. Of course you'd never know, because you spend your days parroting the same benchmarks from three games and pimping "must-have" features.
Yah, you're climbing the video card hierarchy there.
You mean like your pair of 6600 GTs compared to a single 6800GT?
Or your three 5800s compared to a 9700 Pro and 9800 Pro?
Seems to me you gave me a hard time for trying a 5800U for a couple of months before the 6800s were available.
Yes, but my games aren't "W00T, cowering at 1024x768". I run at equal or better settings on the X800 XL compared to the 6800U (though 8xAA compared to 6xAA is certainly up for debate; I'd personally give the nod to nVidia on that one).