It's not a flawed comparison
Of course it's flawed; we're talking about a comparison at the same resolution, where one card can run 4xAA but the other can't for performance reasons.
Your comparing 1920x1200 to 1280x1024 is a strawman argument.
I'm just pointing out there is less need to run ANY AA at higher resolutions, so how a card performs at high resolutions + AA isn't as significant as how it performs at higher resolutions w/out AA.
I disagree. If anything I would argue that AA is even more important at high resolutions because aliasing stands out even more when it's being compared to an otherwise sharp picture.
And of course res + AA is significant because one card can run AA while another can't. You don't buy $300 video cards to run without AA.
There's very little difference until 16x12/10 in a few games and 19x14/12 in everything else.
The differences look pretty big to me.
2560 with 4x AA in today's games is borderline unplayable.
Again with your resolution fixation. Nobody was talking about 2560x1600 per se; we're talking about resolutions where 4xAA is possible on the 640 but not on the 320. These might be 2560x1600, but they might not be.
You're basically saying "AA doesn't matter because I can't run [Insert Game Here] at 2560x1600 with 4xAA". That's fine but it's also irrelevant to the games where the 640
can get playable scores with 4xAA but the 320
can't.
I fail to see why you're so fixated with arbitrary resolutions when the issue is AA.
So you can either split hairs between 20 fps and 13 fps at 2560 with 4x AA or you can acknowledge they both run like pigs, turn off AA and then split hairs between 45 and 60 fps.
The difference in many cases is the difference between playable and not playable, like in AT's Fear score for example: 51 FPS vs 37 FPS.
Just curious what newer games you're running at 1920+ w/ 4x AA?
Fear, Doom 3, Quake 4, Condemned, Call of Juarez (SM 2.0), Serious Sam 2 and Far Cry to name a few.
The next generation of games and their offspring (UT3, Crysis) are all going to stress the GPU similarly, if not more so.
Right, and when that happens what card do you suppose will run out of steam first, the 320 or the 640?
And yes, the 640MB 8800GTS runs 1900 acceptably in today's games, but even the more taxing games give it problems with AA turned on.
That's when you have the option to turn off AA. You don't have the option on the 320 MB if you're already running AA disabled in current games so what are you going to do then?
It's only going to get worse, not better, so don't get used to those AA settings for too long.
As long as you have at least 4xAA, dropping the resolution has far less impact on image quality than it does when AA is disabled. Without AA, jagged edges become dramatically more visible as the resolution drops.
1600x1200 with 4xAA looks vastly better than 1920x1440 with no AA.
And again, if things get worse for the 640, then they'll get much worse for the 320 as games start overflowing its VRAM even without AA.
As for "huge impact," lol... I'm sure you'll be complaining about jaggies even when you're running 12 billion x 10 billion resolution.
At 1920x1440 the difference between 0xAA, 2xAA and 4xAA is as plain as day.