Originally posted by: nemesismk2
Originally posted by: Deeko
Originally posted by: nemesismk2
I wonder how badly image quality will be affected when using ATI's new "3Dc compression". We could be having Quack-style image quality all over again!
Have you even read anything about 3Dc compression?
From the impression I've gotten, 3Dc does NOT, and CANNOT, affect current games. Software must be designed to use it, and if you failed to notice, 3Dc appears to have significantly BETTER IQ than DXTC. Kinda like how 3dfx had a compression algorithm in the T-Buffer.
Yes, I have read up about it, and if "3Dc compression" is any good then I am sure that a lot of current games will be patched to support it. As for your next comment, take a look here:-
3Dc compression doesn't lower IQ as much as DXTC but it does lower IQ imo!
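Worth noting on the "games must be patched" point: 3Dc isn't something the driver can silently switch on, because it's exposed as a vendor FOURCC texture format ('ATI2') that the application has to probe for and then deliberately load its normal maps in. A rough D3D9 sketch of that check (the function name and the DXT5 fallback policy are just my assumptions for illustration):

```cpp
// Minimal sketch (assumptions as noted above): probe the driver for 3Dc
// ("ATI2") support before choosing it over DXT5 for normal maps.
#include <windows.h>
#include <d3d9.h>

const D3DFORMAT FOURCC_ATI2 = (D3DFORMAT)MAKEFOURCC('A', 'T', 'I', '2');

bool Supports3Dc(IDirect3D9* pD3D, UINT adapter, D3DFORMAT adapterFormat)
{
    // 3Dc is a vendor FOURCC format rather than a core D3D format, so the app
    // has to ask the driver explicitly; if this fails, it falls back to DXT5
    // (or uncompressed) normal maps.
    HRESULT hr = pD3D->CheckDeviceFormat(adapter,
                                         D3DDEVTYPE_HAL,
                                         adapterFormat,
                                         0,                // no special usage
                                         D3DRTYPE_TEXTURE,
                                         FOURCC_ATI2);
    return SUCCEEDED(hr);
}
```

So a "Quack-style" silent swap isn't really possible here; the game has to opt in.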
Originally posted by: tk109
Games are never made perfect and can ALWAYS use optimizations. All nvidia did was optimize the game better, the way it could have been in the first place. Hey, have you ever read the release notes of EVERY ATI driver release? Every one has tons of game fixes. Shoot, some ONLY have game fixes and optimizations. It doesn't mean the drivers were bad; perhaps (and probably) the game itself was broken in a sense (in that it wasn't as good as it should have been in the first place). So sadly, instead of the game developer fixing their game, it's left up to the video card companies. Now let up already, sheesh!
Originally posted by: ChkSix
I agree. I think claiming that only one company optimizes their drivers, when both are constantly releasing new versions every month with fixes and corrections, is a moot point.
"Disabled 2x and 4x AA if selected thru the CP. Enabling it in the game however works fine." What a blatantly obvious way to cheat!
Originally posted by: Ackmed
I think they did it because if FSAA is selected in the CP with the 61.11 drivers, the game allocates oversized buffers, and that hurts performance. So it's just disabled instead.
I didn't see that in the readme.txt, and don't recall any reviewer making a note of it. It explains Anand's Far Cry numbers. It looks like they selected it through the CP, instead of in the game itself.
Take a look at the numbers from Anands review:
Far Cry 1280x1024, no AA/AF (fps):
X800XT: 113
X800 Pro: 87
6800U: 82
6800 GT: 74
Far Cry 1280x1024, 4xAA/AF (fps):
X800XT: 66
X800 Pro: 49
6800U: 58
6800 GT: 51
So the X800XT was 31 frames faster than the 6800U without AA, but only 8 frames faster with 4xAA/AF on.
And the X800 Pro was 13 frames faster than the 6800 GT without AA, but 2 frames slower with 4xAA/AF on? Then factor in the 60.72 drivers and their scores. Given that NV's AF takes a bigger hit than ATi's, it does indeed look like AA was off.
http://www.anandtech.com/video/showdoc.html?i=2044&p=11
Come to your own conclusions.
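To put those numbers in percentage terms, here's a throwaway standalone snippet (not from any review, just arithmetic on the figures quoted above) showing how big a hit each card takes from turning on 4xAA/AF:

```cpp
// Quick arithmetic on the Far Cry 1280x1024 numbers quoted above:
// performance drop when going from no AA/AF to 4xAA/AF.
#include <cstdio>

int main()
{
    struct Result { const char* card; double noAA; double withAA; };
    const Result results[] = {
        { "X800XT",   113, 66 },
        { "X800 Pro",  87, 49 },
        { "6800U",     82, 58 },
        { "6800 GT",   74, 51 },
    };

    for (const Result& r : results) {
        double dropPct = 100.0 * (r.noAA - r.withAA) / r.noAA;
        std::printf("%-9s loses %.1f%% with 4xAA/AF\n", r.card, dropPct);
    }
    // The ATi cards drop roughly 42-44% while the NV cards drop only about
    // 29-31%, which is the kind of gap you'd expect if AA never actually
    // kicked in on the NV cards.
    return 0;
}
```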
That one could be related to the game though. IIRC Far Cry is one of the games where AA works best if you enable it through the game rather than the control panel. nVidia might've gone the compatibility route with this one.
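For anyone wondering what "enabling it through the game" actually means: the game asks for a multisampled back buffer when it creates its own D3D device, instead of the driver forcing AA from the CP behind the app's back (which is where the oversized-buffer issue comes from). A rough D3D9 sketch, with the resolution and formats picked purely for illustration:

```cpp
// Rough sketch: how a game requests 4x in-game AA itself via its present
// parameters, instead of relying on control-panel forced AA.
#include <windows.h>
#include <d3d9.h>

void Fill4xAAPresentParams(D3DPRESENT_PARAMETERS& pp, HWND hWnd)
{
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed               = FALSE;
    pp.BackBufferWidth        = 1280;
    pp.BackBufferHeight       = 1024;
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;    // required for multisampling
    pp.MultiSampleType        = D3DMULTISAMPLE_4_SAMPLES; // in-game 4xAA
    pp.MultiSampleQuality     = 0;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;
    pp.hDeviceWindow          = hWnd;
}
```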
Originally posted by: Viper96720
Nvidia is using the same type of AF as ATI with the 6800. Read the 6800 Ultra preview/review.