OFFICIAL KEPLER "GTX680" Reviews


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Tessellation power means nothing if they don't do anything with it.

I'm not supporting a company that pushes for a game to tessellate an entire ocean under the level or concrete barriers (like Crysis 2) or adds oodles of tessellation to a poor console port (like Batman: AC), just so they gain a few more FPS over the competition while making the game run poorly for everyone.

The two statements appear to contradict each other. You think tessellation is a marketing gimmick since it's not widely or effectively used in games, yet when one company actually pushes for this graphics-demanding feature to be present in games, you call it a "cheating tactic". Also, for better or worse, tessellation is a PC-specific feature PC gamers are enjoying vs. what are otherwise straight-up console ports with high-resolution texture packs. I would even go as far as to say DX11 will be remembered as the beginning of the tessellation era.

I agree with you that the way tessellation is utilized in Crysis 2 is not very efficient (e.g., the concrete barriers and the ocean). However, the fact that the tessellation coding is inefficient doesn't override the fact that it is because of NV that tessellation has been elevated into the spotlight over the last 2 years. After all, AMD's Cayman architecture uses a 7th-generation tessellation engine. AMD actually introduced a basic form of tessellation with TruForm on the Radeon 8500 back in 2001. Unfortunately, we didn't see much use of tessellation until NV put $$ behind it. If NV hadn't worked closely with developers, who knows how long it would have taken before we saw tessellation at all. For all we know, AMD might have been on their 12th-generation tessellation videocard and none of us would have known this feature even mattered.

It appears to me your two main issues are the inefficiency of the tessellation coding and that one company is willing to spend more $ to implement features in which its products offer a performance advantage. That's a valid point, but AMD can do the same. At the same time, the HD7970 beats the HD6970 in the very heavily tessellated games you are criticizing: Crysis 2, Batman: AC. If you aren't going to use next-generation features in new games, what's the point of spending $500 on a 7970 in the first place? An HD6970/GTX570 can play them perfectly fine if you lay off that particular feature.

If you are upset that NV pushes PC-centric features, then who else is going to do it? Nothing stops AMD from getting games made with Bullet physics, or from using its enormous GPGPU compute advantage for shaders, etc. I'd rather see NV spend marketing $ on tessellation than have us play straight console ports, even if my AMD card is much slower than the competitor's.
 
Last edited:

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
The two statements appear to contradict each other. You think tessellation is a marketing gimmick since it's not widely or effectively used in games, yet when one company actually pushes for this graphics-demanding feature to be present in games, you call it a "cheating tactic". Also, for better or worse, tessellation is a PC-specific feature PC gamers are enjoying vs. what are otherwise straight-up console ports with high-resolution texture packs. I would even go as far as to say DX11 will be remembered as the beginning of the tessellation era.

I agree with you that the way tessellation is utilized in Crysis 2 is not very efficient (e.g., the concrete barriers and the ocean). However, the fact that the tessellation coding is inefficient doesn't override the fact that it is because of NV that tessellation has been elevated into the spotlight over the last 2 years. After all, AMD's Cayman architecture uses a 7th-generation tessellation engine. AMD actually introduced a basic form of tessellation with TruForm on the Radeon 8500 back in 2001. Unfortunately, we didn't see much use of tessellation until NV put $$ behind it. If NV hadn't worked closely with developers, who knows how long it would have taken before we saw tessellation at all. For all we know, AMD might have been on their 12th-generation tessellation videocard and none of us would have known this feature even mattered.

It appears to me your two main issues are the inefficiency of the tessellation coding and that one company is willing to spend more $ to implement features in which its products offer a performance advantage. That's a valid point, but AMD can do the same. At the same time, the HD7970 beats the HD6970 in the very heavily tessellated games you are criticizing: Crysis 2, Batman: AC. If you aren't going to use next-generation features in new games, what's the point of spending $500 on a 7970 in the first place? An HD6970/GTX570 can play them perfectly fine if you lay off that particular feature.

If you are upset that NV pushes PC-centric features, then who else is going to do it? Nothing stops AMD from getting games made with Bullet physics, or from using its enormous GPGPU compute advantage for shaders, etc. I'd rather see NV spend marketing $ on tessellation than have us play straight console ports, even if my AMD card is much slower than the competitor's.
The bolded is simply incorrect: it wasn't nVidia that brought tessellation into gaming but Microsoft, when they made it part of DX11. Saying nVidia was the one putting money into games to include tessellation would imply that no game had tessellation before the GTX 480 released, which again is incorrect.

As for Crysis 2, I don't know whether nVidia paid Crytek to put tessellation everywhere or not, and quite frankly I don't care, because
a.) Crysis 2 sucked
b.) I believe Crysis 1 looked better
c.) The "generous" use of tessellation hampered performance on nVidia's cards too, specifically the midrange offerings, so it wasn't an AMD-specific problem.

I am glad that nVidia is pushing for more gaming-centric features, but that doesn't mean they are all good features. GPU PhysX is a good example: I think it's a fantastic feature, but locking 40% of the market out of using it will not help anyone; it just gives nVidia a bullet point to put on a box. I know nVidia spends money on PhysX and they shouldn't just give it to AMD for free, but there should be alternative ways of pushing PhysX where everyone benefits from the technology and more than 3 games a year implement it.

As for tessellation specifically, I think it should be used more efficiently by game developers, because its ultimate goal is to improve image quality. If you implement tessellation in a game and there is no difference in the way the graphics look, then it's a waste of resources. Other than that, I think nVidia is definitely better than AMD when it comes to supporting PC gaming; they push a lot of features and have better developer relations.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
It's funny that you accuse me of providing an "NV marketing post" while you are the one missing the entire picture here: a more efficient, cheaper card is faster. You can discount the review I linked if you don't like its results, but that review is not the only one which shows that in certain games the 680 leads by 20-30%. Furthermore, their card reached 1300mhz on reference cooling, and so did Xbitlabs' card. That's at least 2 samples that have achieved >1300mhz overclocks. Many reviewers were able to achieve 1180-1250mhz as well. Regardless of how "cherry-picked" those samples could have been, I haven't seen any HD7970 accomplish 1300mhz on air. In fact, 2 out of 3 Lightning 7970s couldn't even get to 1250mhz on air.

In fairness, if we are going to compare the best 7970 card vs. a reference 680, it makes sense to wait until after-market 680s arrive. There is little doubt that the HD7970 will be left in the dust completely once DirectCU, Windforce 3X, Lightning and Classified editions of those cards launch. The fact that we are even discussing a $600 Lightning card vs. a reference $500 card is in itself telling of how great the 680 is.

It's interesting how last generation, when the HD6970 was 90% as good as the GTX580, you claimed it offered amazing bang for the buck at $370, yet now that the HD7970 offers 90% of the performance of a 680, it's OK that it's priced at $500? The 2nd-fastest card should always cost less unless it offers some unique features worth paying a premium for.

You say 3GB is a key advantage, but so far no benchmarks have shown this to be true. I'd much rather recommend someone spend $500 on a card with proper working drivers and proper working features (H.264 encoding).




The HD7970's 3GB didn't really help it in SKYRIM, where the 680 handily won on 3 monitors.

Previously you claimed that HardOCP's focus on newer games is preferable since testing older games is irrelevant. This was before all the 680 reviews came out. Now the only way the HD7970 even manages tangible wins is when 2-4 year old games are used in the reviews: Crysis Warhead, Metro 2033 and AvP. Do those games matter more than BF3, Dirt 3 and SKYRIM? Maybe to some gamers, but probably not to most.

I don't believe that a reference HD7970 can be justified even at $500.

If you don't like the Bjorn3D review, look at any other notable ones: Hardware Canucks, ComputerBase, AnandTech and even your #1 go-to source, HardOCP. The GTX680 wins in all of them.



"We could prattle on and on extolling the GTX 680’s virtues but here’s what really matters: NVIDIA’s newest flagship card is superior to the HD 7970 in almost every way. Whether it is performance, power consumption, noise, features, price or launch day availability, it currently owns the road and won’t be looking over its shoulder for some time to come." ~ Hardware Canucks

In their review, the GTX680 was 16% faster at 1080P 4AA and 15% faster at 2560x1600 4AA. The HD7970 needs to cost at most 90% of the GTX680 -- that's $450. And that's being generous, since people always pay a premium for the fastest single GPU, especially one that's better in most other metrics too: power consumption, noise, the longer 3-year warranty on most 680s, working H.264 encoding, the new TXAA mode that might be good (or maybe just marketing), etc.
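(As a quick sanity check on that $450 figure, here is a minimal perf/$ parity sketch, assuming the $500 GTX680 price and the 15-16% leads quoted above; the numbers are only as good as the review averages they come from.)

```python
# Perf/$ parity sketch (illustrative check, not from the thread):
# if the GTX 680 is `lead` percent faster, the HD 7970 only matches
# its performance-per-dollar when priced at 1/(1+lead) of the 680.
GTX680_PRICE = 500  # launch MSRP in USD

for lead in (0.15, 0.16):  # 680's lead at 2560x1600 / 1080P per the review
    parity = GTX680_PRICE / (1 + lead)
    print(f"{lead:.0%} lead -> 7970 parity price: ${parity:.0f}")

# Prints ~$435 and ~$431, so the $450 asked for above is indeed generous.
```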

I find it odd that you are trying to defend the 7970's pricing and yet accuse me of spitting out an NV "marketing pamphlet post". Last time I checked, lower prices are better for consumers, and here you are advocating that the HD7970 doesn't need to fall in price by more than $50. It is a rather interesting position, especially after you've promoted AMD's bang-for-the-buck philosophy for years. Yet now you think consumers should pay the same for a card that needs to be overclocked to match guaranteed performance. I don't see that as a reasonable expectation, since overclocking is always just a nice bonus, while performance out of the box is guaranteed. Actually, part of the enthusiasm behind overclocking is to get a card that's cheaper and performs as fast as a higher SKU. In this case, a reference 680 is faster than a 1050mhz 7970, and that performance is guaranteed for everyone.

Since you already stated you won't support NV because of their business practices, thank you for finally admitting that you probably won't buy NV products in the first place. If you prefer AMD cards for any reason, there is nothing wrong with that; plenty of posters buy NV cards for Folding@Home, for example. However, having more or less stated that you won't support NV as a company due to their business practices, don't expect most posters to view your opinion as objective when it comes to videocard recommendations.

Spot on. :thumbsup:
 
Feb 19, 2009
10,457
10
76
Okay, being mostly an FPS guy, I'm looking at the BF3 benchmarks. I know that is a strong NVIDIA title..



The 7970 with a small OC nearly matches the gtx680 w/ turbo at 1080p and beats it at 1600p. That's a 1070mhz 7970, well below the average 1.2ghz on air.

I'm expecting to see a price drop (at least $50 each) on the 7950 and 7970, but it hasn't happened...

I guess we're stuck with paying stupid amounts of $$ for mid-range products for a long time to come.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I guess we're stuck with paying stupid amounts of $$ for mid-range products for a long time to come.

No. We don't have to do a damned thing. AMD and NVDA, on the other hand, need to recoup their R&D and manufacturing costs. In a game of chicken, I'd rather be the consumer in this market bout. Thanks to consolification, we have less need to upgrade video cards unless it's for some reason other than games getting more GPU-intense, like multi-monitor or 120Hz or 3D or something. My formerly midrange HD 6850 is still strong enough to run any game at 1080p with playable framerates. I might have to turn down AA a bit, and my overclock is a hefty 23%, but it gets the job done. Many others have decent graphics cards that also get the job done. Some people with money to burn will buy the latest and greatest no matter what, and that's fine, but I think most people would rather just wait for prices to fall.

If neither company is willing to price 28nm GPUs better, then both can watch their cards rot for all I care.
 

mkmitch

Member
Nov 25, 2011
146
2
81
Other than running 2 dozen monitors at once just so we can say we can, what out in the marketplace demands that we buy either the 680 or the 7970/7950? Is there some game out there that only runs nice and playable on those three cards? Is there some commercial software that can only be run productively on these three cards?

So far the only benefit I've gleaned is that the same group of posters get to keep posting slanted viewpoints day after day after day. I've only been here a short time and I can already predict who will say what. Who is pulling all these puppet strings, anyway? Carry on.

Edited to add: well said, blastingcap.
 

kreacher

Member
May 15, 2007
64
0
0
Well, if someone has a 30 inch monitor, many games can't manage 60fps at 25x16 with everything on max.
 

felang

Senior member
Feb 17, 2007
594
1
81
What are you running now for everything non-gpu in your system, 225w or so? The hx 520 is rated for 41a @ 50c continuous usage. Mine has been going literally 24/7/365 on seti for around 4 years with various cpu/gpu combos (e6750 + 4850, q6600 + 4850, x3350 + gtx 260, i7 920 + 9600gso). The older cpus ran around 3.5ghz, but the i7 920 is running around 3.95ghz these days, so overclocking clearly never bothered me, either. As long as you remain under 300w peak for your gpu, you won't have any psu-related issues.

Of course, I did all this bragging about my hx 520 and then remembered that I switched it out a couple of months ago for an xfx 850w unit that was just sitting around taking up space in my closet... I think I'll pair it with my i7 rig again this winter when I make my BF upgrade.

Currently running a 2600K w/ p67 mobo, HD5870, 8 GB RAM, an SSD, 2 HDDs and a total of 6 fans.

I'm pretty sure I'm under 300 watts at the wall while overclocked (both CPU and GPU) playing BF3 multiplayer; I'll make sure a little later though...
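(A rough headroom sketch for the HX 520 figures quoted above; the 12v rail voltage is the standard spec, assumed here, and wall draw isn't a perfect proxy for 12v load, so treat this as ballpark only.)

```python
# PSU headroom sketch using the HX 520 rating quoted in this post.
RAIL_V = 12.0    # standard +12V rail voltage (assumption)
RATED_A = 41.0   # continuous 12V rating at 50C, per the post

capacity_w = RAIL_V * RATED_A  # ~492 W available on the 12V rail
system_w = 300                 # estimated peak draw at the wall, per the post
print(f"12V capacity: {capacity_w:.0f} W, load: {system_w} W, "
      f"headroom: {capacity_w - system_w:.0f} W")  # ~192 W to spare
```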
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Just a tidbit of information from that guy at EVGA who does the videos (I believe it's him)...

He is saying not to purchase a waterblock for the reference cards; they're TDP-hardlocked to 225w max due to the 2x6pin configuration (75w from the PCIe slot plus 75w from each of the two 6-pin connectors).

He said waiting for non-reference cards with additional power connectors would be wise, as the extra TDP made available will unlock additional OC headroom.

Dynamic OC'ing isn't going anywhere, but more robust cards with more PCIe power inputs are coming.




Here is an unrelated video from PC Perspective; it's almost 2 hours long and is all about the 680, with a PR guy from Nvidia.

http://www.youtube.com/watch?v=ER3nv7NTwbs
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, if someone has a 30 inch monitor, many games can't manage 60fps at 25x16 with everything on max.

There's always going to be some eye candy that, turned on, will drag the FPS down to unplayable. That doesn't mean it will give you any better of a gaming experience.

It would be different if any of these features would, all of a sudden, give you photo-realistic real-time rendering with these cards, or even with 4 of these cards. They don't, though. More often than not, they make an improvement that you need a screen grab (or a wireframe render) to see.
 

BoFox

Senior member
May 10, 2008
689
0
0
Guru 3D OC review is in. About a 20% OC yielding a 20% performance gain, very solid. Even with this running 300mhz faster, the noise is STILL the same as a stock 7970. The noise is really a negative for me on the 7970; I can't wait for a higher-TDP version of the 680.

http://www.guru3d.com/article/geforce-gtx-680-overclock-guide/1

Thanks for the heads-up.. yummy! However, a higher-TDP version would probably mean more noise also (even though GTX 590, out of all the cards, was still relatively quiet).
 
Last edited:

BoFox

Senior member
May 10, 2008
689
0
0
I see this being repeated again and again. Correct me if I am wrong, but the GTX 680 dynamically overclocks and overvolts, correct? If so, then all this talk of no voltage adjustment is nonsense, because the card is doing the volting for you.

At least that's the way I read about it.
In the very post you quoted (RussianSensation's), he actually said "GTX680 can do 1200-1300mhz on air with dynamic voltages in Precision X in 15 seconds by moving the slider."

To add context to RussianSensation's point: the GTX 680 hardly consumes much more power when overclocked. From the same source that he quoted, see:

source: http://bjorn3d.com/read.php?cID=2199&pageID=11686
Pretty impressive compared against the other 2 cards above it, with one being wayyyyy above for a single-GPU card. The authors never edited it after the article was published, so I guess it was indeed correct, even though it had already raised some questions.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
In the very post you quoted (RussianSensation's), he actually said "GTX680 can do 1200-1300mhz on air with dynamic voltages in Precision X in 15 seconds by moving the slider."

To add context to RussianSensation's point: the GTX 680 hardly consumes much more power when overclocked. From the same source that he quoted, see:

source: http://bjorn3d.com/read.php?cID=2199&pageID=11686
Pretty impressive compared against the other 2 cards above it, with one being wayyyyy above for a single-GPU card. The authors never edited it after the article was published, so I guess it was indeed correct, even though it had already raised some questions.

I question the way they benched this, as it shows the 7970 using more power than the 580.
 

BoFox

Senior member
May 10, 2008
689
0
0
Yeah, I'm thinking that the GTX 580 and HD 7970 should switch places, lol... probably a "charto", like "typo"! I guess nobody really brought it to their attention?!

EDIT: Even though the HD 7970 can be "caught" consuming 50W more than the GTX 680 at times, I went ahead and posted a comment over there asking them if it's a typo or not, because the GTX 580 pretty much always eats at least 50W more than the GTX 680 in any demanding game.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
It's funny that you accuse me of providing an "NV marketing post" while you are the one missing the entire picture here: a more efficient, cheaper card is faster. You can discount the review I linked if you don't like its results, but that review is not the only one which shows that in certain games the 680 leads by 20-30%. Furthermore, their card reached 1300mhz on reference cooling, and so did Xbitlabs' card. That's at least 2 samples that have achieved >1300mhz overclocks. Many reviewers were able to achieve 1180-1250mhz as well. Regardless of how "cherry-picked" those samples could have been, I haven't seen any HD7970 accomplish 1300mhz on air. In fact, 2 out of 3 Lightning 7970s couldn't even get to 1250mhz on air.
That's funny, because hwbot.org says the average air overclock for a GTX 680 is 1175MHz: http://hwbot.org/hardware/videocard/geforce_gtx_680/ . Looks like your lame attempt at cherry-picking benchmarks failed again. Also, you failed to address my comment that the 7970 starts clocked at 925MHz whereas the GTX 680 starts at 1006MHz. Therefore, when the 7970 overclocks to a proven average of 1202MHz, that's a 30% overclock, whereas the GTX 680 only hits 1175MHz, or 17%.
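(To make the percentage arithmetic explicit, a minimal sketch using only the four clocks cited above.)

```python
# Relative overclock = achieved clock / stock clock - 1.
# Clocks (MHz) are the ones cited in this thread.
def relative_oc(stock_mhz: int, achieved_mhz: int) -> float:
    """Return the overclock as a percentage of the stock clock."""
    return (achieved_mhz / stock_mhz - 1) * 100

print(f"HD 7970: {relative_oc(925, 1202):.0f}%")   # ~30%
print(f"GTX 680: {relative_oc(1006, 1175):.0f}%")  # ~17%
```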
In fairness, if we are going to compare the best 7970 card vs. a reference 680, it makes sense to wait until after-market 680s arrive. There is little doubt that HD7970 will be left in the dust completely once Direct CU, Windforce 3X, Lightning, Classified editions of those cards launch. The fact that we are even discussing a $600 Lightning cards vs. a reference $500 is in itself telling how great the 680 is.
It's ironic that you speak of "fairness" in all your biased posting. You're comparing more expensive AIB models just to bolster your "price/performance" argument, when in reality a reference 7970 has no problem hitting those clocks either: http://hardocp.com/article/2012/01/09/amd_radeon_hd_7970_overclocking_performance_review/ . But once again you cherry-pick benchmarks and lie instead of presenting the situation honestly and completely.
It's interesting how last generation HD6970 was 90% as good as GTX580 and you claimed it offered amazing bang for the buck for $370 and now HD7970 offers 90% of the performance of a 680 and it's OK that it's priced at $500? 2nd fastest card should always cost less unless it offers some unique features worth paying a premium for.
Where did I ever claim that? Go post a quotation, except you won't find it because once again you are lying.
You say 3GB is a key advantage, but so far no benchmarks have shown this to be true.
I never said any of that either; here's what I said:
The 7970 base MSRP needs to come down to $500 to be competitive. Extras like 3GB of vRAM mean little to the vast majority of the market, and although the 7970 ends up being faster, you do have to work for it.
So once again you are lying and committing libel since you can't properly rebut any of my arguments.
Since you already stated you won't support NV because of their business practices, thank you for finally admitting that you prob. won't buy NV products in the first place. If you prefer AMD cards for any reason, there is nothing wrong with that. Plenty of posters buy NV cards for Folding@Home for example. However, by more or less stating that you won't support NV as a company due to their business practices, don't expect most posters to view your opinion as objective when it comes to videocard recommendations.
I've actually owned twice as many NV cards in my life as ATI/AMD. I don't support underhanded tactics that harm consumers, but it seems you do, as long as they benefit nvidia. It's clear to me that you're posting out of some agenda or personal vendetta, not as a contributing member of the forum. In this one post you:

A) Failed to rebut any of my arguments,
B) Ignored, deflected, or changed the subject when you couldn't, and
C) Flat-out personally attacked me and lied when you couldn't.

Shameful.


There is simply too much inflammatory rhetoric here. It is unproductive.

Posting in this manner, with such personal conviction and sense of being wronged personally, is unproductive and a dead-end as far as the technical discussion is concerned.

Please, if you find the subject material to be this personally discomforting, then take it to PM, or put the person on ignore, or leave it be.

We need the discourse here to be less personal and less inflammatory.

Administrator Idontcare
 
Last edited by a moderator:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The two statements appear to contradict each other. You think tessellation is a marketing gimmick since it's not widely or effectively used in games, yet when one company actually pushes for this graphics-demanding feature to be present in games, you call it a "cheating tactic". Also, for better or worse, tessellation is a PC-specific feature PC gamers are enjoying vs. what are otherwise straight-up console ports with high-resolution texture packs. I would even go as far as to say DX11 will be remembered as the beginning of the tessellation era.

I agree with you that the way tessellation is utilized in Crysis 2 is not very efficient (e.g., the concrete barriers and the ocean). However, the fact that the tessellation coding is inefficient doesn't override the fact that it is because of NV that tessellation has been elevated into the spotlight over the last 2 years. After all, AMD's Cayman architecture uses a 7th-generation tessellation engine. AMD actually introduced a basic form of tessellation with TruForm on the Radeon 8500 back in 2001. Unfortunately, we didn't see much use of tessellation until NV put $$ behind it. If NV hadn't worked closely with developers, who knows how long it would have taken before we saw tessellation at all. For all we know, AMD might have been on their 12th-generation tessellation videocard and none of us would have known this feature even mattered.
The statements don't contradict each other in the least. Having tessellation power means nothing when they waste it tessellating an ocean beneath the level in order to exploit a game to win in reviews. Once again you're trying with all your might to put nvidia in a positive light instead of writing an honest opinion. I was playing games like STALKER: CoP on my 5870 and 5850s well before nvidia even had a tessellation-capable video card.
It appears to me your two main issues are the inefficiency of the tessellation coding and that one company is willing to spend more $ to implement features in which its products offer a performance advantage. That's a valid point, but AMD can do the same. At the same time, the HD7970 beats the HD6970 in the very heavily tessellated games you are criticizing: Crysis 2, Batman: AC. If you aren't going to use next-generation features in new games, what's the point of spending $500 on a 7970 in the first place? An HD6970/GTX570 can play them perfectly fine if you lay off that particular feature.
Actually, the 6970 plays it fine with tessellation as long as it's scaled back in the drivers to run appropriately on its tessellation engine. That's something the GTX570 can't do, but I'm sure you'd never mention a feature AMD has done better. However, tons of tessellation doesn't save the mediocre, but poorly running, graphics of Batman: AC. Luckily, the gameplay makes up for it.
If you are upset that NV pushes PC-centric features, then who else is going to do it? Nothing stops AMD from getting games made with Bullet physics, or from using its enormous GPGPU compute advantage for shaders, etc. I'd rather see NV spend marketing $ on tessellation than have us play straight console ports, even if my AMD card is much slower than the competitor's.
I find it a shame that you go through anything and everything to defend nvidia rather than present an accurate or fair argument. I'd like to see open development of games and technology that gives the best experience for the consumer at the least cost. You support proprietary technology and companies funding developers while universally harming gameplay, just to win a few more frames in a benchmark. That's idiotic.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Yeah, I'm thinking that the GTX 580 and HD 7970 should switch places, lol... probably a "charto", like "typo"! I guess nobody really brought it to their attention?!

EDIT: Even though the HD 7970 can be "caught" consuming 50W more than the GTX 680 at times, I went ahead and posted a comment over there asking them if it's a typo or not, because the GTX 580 pretty much always eats at least 50W more than the GTX 680 in any demanding game.

If nothing else, the substantially lower idle power usage of the 580 indicates something's wrong. The 7970 doesn't even use 36W idling, never mind 36W more than a 580.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
That's funny, because hwbot.org says the average air overclock for a GTX 680 is 1175MHz: http://hwbot.org/hardware/videocard/geforce_gtx_680/ . Looks like your lame attempt at cherry-picking benchmarks failed again. Also, you failed to address my comment that the 7970 starts clocked at 925MHz whereas the GTX 680 starts at 1006MHz. Therefore, when the 7970 overclocks to a proven average of 1202MHz, that's a 30% overclock, whereas the GTX 680 only hits 1175MHz, or 17%.

Just for the hell of it, I went through all the reviews in the OP (plus a couple of extra ones from the dailytech link), and the average core clock offset achieved was 151 mhz, for a clock of 1157 mhz (fairly close to your link from hwbot).

However, GPU Boost comes on top of this. Unfortunately, not many sites measured its effect when overclocking, but an extra 50-100 mhz seems somewhat common (take this with a grain of salt, due to the lack of proper data).

So an average gtx680 would run at 1206-1256 mhz when overclocked, compared to ~1058 mhz at stock (again due to GPU Boost): a relative overclock of 14-19%.
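(A minimal sketch of that last step, using only the averages quoted in this post; the ~1058 mhz stock figure already includes typical GPU Boost.)

```python
# Effective GTX 680 overclock once GPU Boost is folded in (MHz).
# All figures are the averages quoted in the post above.
STOCK_WITH_BOOST = 1058          # ~typical clock at stock, boost included

for effective in (1206, 1256):   # overclocked range incl. boost headroom
    gain = (effective / STOCK_WITH_BOOST - 1) * 100
    print(f"{effective} MHz -> {gain:.0f}% over stock")  # 14% and 19%
```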
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
But that still doesn't refute the point that when it was overclocked by 30% and still had GPU Boost enabled, the average gains were less than 18%. So GPU Boost was already included in the benchmarks automatically.

The average 680 overclock is 16-17%, which means about a 10% performance gain including GPU Boost.
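(For what it's worth, a minimal sketch of the scaling those numbers imply; "scaling efficiency" here is just gain divided by overclock, a term introduced for illustration, not from the thread.)

```python
# How much of a clock increase shows up as performance, per the
# (overclock, performance gain) pairs cited in the posts above.
cases = [(0.30, 0.18), (0.165, 0.10)]
for oc, gain in cases:
    print(f"{oc:.0%} OC -> {gain:.0%} gain "
          f"(scaling efficiency ~{gain / oc:.0%})")
# ~60% in both cases: clock speed doesn't translate 1:1 into frames.
```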
 