utahraptor
Golden Member
- Apr 26, 2004
- 1,066
- 232
- 116
I've been considering it personally.
I tried this analogy with my sugar momma, what do you guys think? Anything I could change in it to better my chances of getting a green light for upgrade?
"Look at it like this: one dragster does the quarter mile in 10 seconds, another does it in 7. The dragster that does it in 10 seconds uses 44 gallons of gas, while the one that does it in 7 uses only 18."
I'm trying to sell her on the efficiency, lulz.
Plug in actual numbers. Consider that your power likely costs at least $0.10 per kWh. Do you have a Kill A Watt meter? That makes the calculation pretty darn easy.
http://www.handymath.com/cgi-bin/electric.cgi
The more your power costs, the easier it will be to make your case. Especially if you consider you'll probably be able to get ~$150 each or so for your 470s... I don't remember if you are using full/custom blocks or not...
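The cost math above can be sketched out quickly — a minimal example with illustrative numbers (the wattages, hours, and rate below are assumptions; measure your actual draw with a Kill A Watt):

```python
# Rough annual electricity cost for a GPU under load.
# All numbers here are illustrative assumptions, not measured values.

def annual_cost(watts, hours_per_day, dollars_per_kwh):
    """Yearly cost of a device drawing `watts` for `hours_per_day`."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * dollars_per_kwh

# Hypothetical: two older cards in SLI vs. one newer, more efficient card.
old_rig = annual_cost(watts=500, hours_per_day=6, dollars_per_kwh=0.10)
new_rig = annual_cost(watts=200, hours_per_day=6, dollars_per_kwh=0.10)
print(f"old: ${old_rig:.2f}/yr, new: ${new_rig:.2f}/yr, saved: ${old_rig - new_rig:.2f}/yr")
```

At those assumed numbers the gap is real money per year, which is the whole pitch.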
Thanks for uploading, but what an absolutely useless bunch of graphs and testing. Tom's never fails.
Tom's fails.
I'm interested in seeing what the extra TMUs do for NVIDIA's high-resolution performance. In the past, NVIDIA's cards have lost their lead/AMD closed the gap at 2560x1600 and above, and it was speculated AMD's large texture fillrate advantage was part of this. If the GTX 680 does perform well there, I'd wonder if this points to how ROP-limited the 79xx series might be.
Finally, where are the overclocking numbers?
Or plays at 1080p and doesn't turn on at least 4xMSAA? Or how about playing Skyrim at 2560x1600 and then smudging it all up with FXAA? Looking forward to AT's and [H]'s reviews.
Yeah, that's off. AT got 142.3% indexed to the GTX 580 at 100%. But like I said, it seems like Tom's followed the review guide letter for letter, so I'd assume this shows the GTX 680 in the best light. You can tell how horribly skewed the chart is when you see the 7970 shown as only 20% more efficient than the GTX 580.
Great info :thumbsup:. The question then is whether this is with overvolting or not. If it's without any extra volts, that's amazing; with volts, it's pretty good. One particular reviewer who posted at OCN under NDA said he ran at 1300 core, and the VRzone leak had 7200 memory. So, about the same potential as a 7970 speed-wise; dunno if it will scale with clocks like the 7900s do, though.
7970 is about 20% faster than the 580 at the same TDP...
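The efficiency indexing being argued about is just division — a minimal sketch with made-up perf/power figures (not the actual review data):

```python
# Performance-per-watt index relative to a baseline card (baseline = 100).
# The perf and power values below are illustrative, not review measurements.

def efficiency_index(perf, power, base_perf, base_power):
    """How much performance per watt a card delivers vs. the baseline."""
    return (perf / power) / (base_perf / base_power) * 100

# Hypothetical: a card 20% faster at the same board power indexes at 120.
print(round(efficiency_index(perf=120, power=250, base_perf=100, base_power=250), 1))  # -> 120.0
```

So a chart showing such a card at barely over 100 would indeed look skewed.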
Hard to call that from one set of reviews. Every single one of those games has performed better on NVIDIA cards in the past, so it just seems they're playing to their strengths. For instance, why no 4xMSAA or 8xMSAA in Skyrim, especially since it runs so easily? Who cares about HAWX 2 results? BF3 with MSAA has been a strength for NVIDIA in the past, so why all of a sudden do they not use FXAA there? Is FXAA all of a sudden no good for BF3?
Well, Crysis 2 doesn't use MSAA in Ultra spec.
They tested Dirt 3, WoW, HAWX 2, Metro 2033 and Battlefield 3 with MSAA.
So they tested 5 out of 7 games with MSAA on.
GTX680 only loses in Metro 2033. Obviously it makes no sense for HD7970 users to upgrade, but for new buyers, the HD7970 makes no sense at $550 anymore.
Ya, that's why I always say people should buy a card for the games they play or w/e tasks they run. If someone plays BF3 95% of the time with 4x MSAA, then GTX680 isn't going to set the world on fire vs. the HD7970.
But if you play a bunch of games, HD7970 has a weakness in some DX9 game engines, and NVIDIA has had excellent 1080p performance for a while now. So as an all-around card, GTX680 looks better already. At worst it's about as fast as an HD7970, and sometimes it pulls wins of 15-25%. Not a game changer, but $550 for HD7970 is absurd at this point, especially since GTX680 has a quieter reference cooler and HD7970's reference cooler is way too loud for real-world 1200MHz+ overclocks.
Would you call pulling 100fps vs the GTX680's 120fps a weakness in DX9 games?
Again, you're extrapolating a lot from one set of heavily nvidia-weighted benchmarks and assuming your opinion is fact, which it isn't.
We aren't even sure what the GTX 680's price is, so it'd be tough to judge its impact on the competition.
Secondly, since when does a cooler's performance negate everything else a card has to offer, especially in something as idiosyncratic as perceived loudness? Also, how does the noise output of the cooler negatively impact real world overclocks? I'd wait for more reviews.