[gamegpu] Far Cry 4 performance


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Yes, the aftermarket cards are/were much better at holding their boost clocks. Not many review sites, and no English-speaking ones that I've seen, pointed out that the reference cooler allowed GK110 to throttle.

Considering the 980 throttles with the reference Titan cooler with no overclocking, there is no question in my mind that the much more power-hungry 780 Ti would throttle under heavy load in some games over prolonged gaming sessions.

"We found that with the default settings on GeForce GTX 980 SLI the lowest clock rate it hit while in-game was 1126MHz. That clock speed is actually below the boost clock of 1216MHz for GTX 980. This is the first time we've seen the real-time in-game clock speed clock throttle below the boost clock in SLI in games. It seems GTX 980 SLI is clock throttling in SLI on reference video cards. This is something we did NOT expect, but it is happening with reference cards." ~ HardOCP
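For scale, a quick back-of-the-envelope sketch of how large that observed throttle is, using only the two clock figures from the HardOCP quote above:

```python
# Size of the throttle HardOCP observed: lowest in-game clock vs rated boost.
boost_mhz = 1216   # GTX 980 rated boost clock (from the quote)
lowest_mhz = 1126  # lowest in-game clock observed in SLI (from the quote)

deficit_pct = 100 * (boost_mhz - lowest_mhz) / boost_mhz
print(f"{boost_mhz - lowest_mhz} MHz below boost ({deficit_pct:.1f}%)")
```

So the SLI cards were running roughly 7% under their rated boost clock.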

But you know, on our forum the blower > after-market heatsink myth continues to perpetuate, and 'everyone' loves the NV blower design despite after-market options from Gigabyte and others completely blowing it out of the water in noise levels and temperatures.
http://www.xbitlabs.com/articles/gr...orce-gtx-titan-black-ghz-edition_4.html#sect0

BTW, the huge difference between the 7970 and 280X also doesn't make any sense; they are for all intents and purposes clocked identically and should perform identically.

Actually, there is a big difference. They are using the original 7970 that came with a 925 MHz GPU clock vs. 1020-1080 MHz for after-market 280X cards. A 12-17% difference in performance isn't out of the question, since Tahiti scales almost linearly with clock speed. Had they used a 7970 GHz Edition, your point would be valid. Besides, the situation where Titan/780 are barely beating the 280X has come up on other websites in other games too; it's not the first time in the last 6 months. Of course, the 780/Titan are amazing overclockers, so you can easily get 20-30% more performance vs. a stock 780/Titan with overclocking.
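The clock arithmetic is easy to check. A minimal sketch, assuming the clocks cited above and near-linear scaling for Tahiti:

```python
# Expected uplift from clock speed alone, assuming near-linear scaling
# (reasonable for Tahiti as long as memory bandwidth isn't the bottleneck).
ref_7970_mhz = 925                   # original HD 7970 reference clock
aftermarket_280x_mhz = (1020, 1080)  # typical factory-OC R9 280X range

low, high = (100 * (c / ref_7970_mhz - 1) for c in aftermarket_280x_mhz)
print(f"Clock-driven uplift: {low:.0f}% to {high:.0f}%")
```

That puts the clock-only gap at roughly 10-17%, in the same ballpark as the performance difference in the charts.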
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
RussianSensation said:
Considering the 980 throttles with the reference Titan cooler with no overclocking, there is no question in my mind that the much more power-hungry 780 Ti would throttle under heavy load in some games over prolonged gaming sessions.

"We found that with the default settings on GeForce GTX 980 SLI the lowest clock rate it hit while in-game was 1126MHz. That clock speed is actually below the boost clock of 1216MHz for GTX 980. This is the first time we've seen the real-time in-game clock speed clock throttle below the boost clock in SLI in games. It seems GTX 980 SLI is clock throttling in SLI on reference video cards. This is something we did NOT expect, but it is happening with reference cards." ~ HardOCP

But you know, on our forum the blower > after-market heatsink myth continues to perpetuate, and 'everyone' loves the NV blower design despite after-market options from Gigabyte and others completely blowing it out of the water in noise levels and temperatures.
http://www.xbitlabs.com/articles/gr...orce-gtx-titan-black-ghz-edition_4.html#sect0
Come on now. First off, that's not a Titan cooler. Next, that is in SLI. And all you have to do is simply raise the power/temp target and it will not throttle anymore. With that raised, and even at 1516 MHz, the only time mine will throttle at all is if it's hitting the TDP limits.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
toyota said:
Come on now. First off, that's not a Titan cooler. Next, that is in SLI. And all you have to do is simply raise the power/temp target and it will not throttle anymore. With that raised, and even at 1516 MHz, the only time mine will throttle at all is if it's hitting the TDP limits.

It's very similar to the Titan cooler, minus the vapor chamber I believe. I realize that if you raise the power/temp targets and fan speed, the 980 will not throttle, but since testing is done at defaults, it's not out of the question. However, the poor performance of Kepler cards in too many games in the last 6 months cannot all be attributed to thermal throttling on all websites, especially since some sites like TechSpot ONLY tested with after-market cards, including an after-market Gigabyte 780 Ti.

If anything, you should see most sites favouring NV more because they keep using reference blower 7970/7970 GHz/R9 290/290X cards. Not so with TechSpot; all after-market.



If you enable Godrays, though, AMD cards incur a large 20% performance hit, and another 10% for HBAO+. However, in the real world NV cards are more playable, as there is stutter present on AMD cards until AMD and Ubisoft resolve this issue. A lot of the time FPS isn't everything, and in this case the higher FPS on Radeons doesn't necessarily translate into a better gaming experience; far from it, actually:

http://www.hardocp.com/article/2014/11/21/far_cry_4_video_card_performance_iq_preview/5#.VHjnGTGUfsc

Whenever the next-gen big boys come (GM200 or R390X), I'll grab a few with Bitcoins. Free upgrades for life (or until bitcoins become worthless, heh)!

Congrats on reaching the free GPU upgrades for life club! :thumbsup: Although I took all of my remaining bitcoin money and used it for other things in life, as there's no way I am spending 5 figures on videocard upgrades. I knew if I kept my bitcoins for GPU upgrades, I'd be buying 2-3 $700 cards every 2-3 years for kicks and giggles, and I didn't think it was worth it given how quickly GPUs advance and drop in price.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I don't understand the poor showing from Kepler, especially the 780 and 780 Ti; it makes no sense why they should be so far below Maxwell.

The game-ready drivers came out surprisingly quickly, without the usual month gap in between. I expect they released Maxwell-optimized drivers to compete with AMD at release, with the potential, hopefully, to optimize Kepler at a later date.
 

nine9s

Senior member
May 24, 2010
334
0
71
I just got Far Cry 4. I am playing it at 2560x1440 with a GTX 680. I thought I was going to have to get a GTX 970 and actually had one ordered today, but I am playing on high settings with Texture and Environment at Ultra, with SMAA and godrays, and it seems very smooth. Per GeForce Experience I am getting around 30-40 FPS, but it seems much smoother; I guess there are no dips below that.

I canceled my GTX 970 order after seeing how smooth it is.

Am I just lucky, is the game not that demanding, or does the GTX 680 perhaps still have life in it for me at 1440?
 

tential

Diamond Member
May 13, 2008
7,348
642
121
nine9s said:
I just got Far Cry 4. I am playing it at 2560x1440 with a GTX 680. I thought I was going to have to get a GTX 970 and actually had one ordered today, but I am playing on high settings with Texture and Environment at Ultra, with SMAA and godrays, and it seems very smooth. Per GeForce Experience I am getting around 30-40 FPS, but it seems much smoother; I guess there are no dips below that.

I canceled my GTX 970 order after seeing how smooth it is.

Am I just lucky, is the game not that demanding, or does the GTX 680 perhaps still have life in it for me at 1440?

I played Dark Souls at 30 FPS on PC and never noticed it unless I had dips below 30 FPS. I find that in some games dips below 60 are jarring, while in others I can get 30 FPS and be fine.

I mean, console gamers game at 30 FPS all the time without dying/tearing their eyes out so it's possible...
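The arithmetic behind why dips feel different at different frame rates: perceived smoothness tracks frame time, not FPS, so a same-sized FPS dip costs far more milliseconds at the low end. A minimal sketch (the specific dip sizes are illustrative, not from the thread):

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

# The same 10 FPS dip costs ~3 ms of extra frame time starting from 60 FPS,
# but ~17 ms starting from 30 FPS, which is why dips below 30 feel so jarring.
drop_at_60 = frame_time_ms(50) - frame_time_ms(60)
drop_at_30 = frame_time_ms(20) - frame_time_ms(30)
print(f"60->50 FPS: +{drop_at_60:.1f} ms; 30->20 FPS: +{drop_at_30:.1f} ms")
```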
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
tential said:
I played Dark Souls at 30 FPS on PC and never noticed it unless I had dips below 30 FPS. I find that in some games dips below 60 are jarring, while in others I can get 30 FPS and be fine.

I mean, console gamers game at 30 FPS all the time without dying/tearing their eyes out so it's possible...
There's a lot of difference when you're using a controller and sitting way back from the TV.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Why would using a controller prevent me from sitting close to my TV?
Did I say that? You can sit right in front of your TV, close enough to count the pixels, if you want to.

I simply said console games don't look and feel as bad because a controller and sitting further back from the TV negate most 30 FPS issues.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
RussianSensation said:
Considering the 980 throttles with the reference Titan cooler with no overclocking, there is no question in my mind that the much more power-hungry 780 Ti would throttle under heavy load in some games over prolonged gaming sessions.

The 980 throttles primarily because Nvidia limits it in the BIOS to a more conservative TDP.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
If we're taking wild guesses, then I'd say it's probably this combined with Nvidia getting in late in development.

Judging from the bugs and totally different lighting with Nvidia's tech, it's safe to say the Gameworks integration was a rush job.

I'd have to agree, and take it further by thinking it's mostly a marketing deal. With AC BF they suddenly added PhysX maybe 2 months after release. It was broken for about 3-4 more. Finally, about 8-9 months after release, I could run that game on ultra with normal PhysX at a reasonably steady 60 FPS. Watch Dogs also now runs a lot better than it did at first. If I weren't so jumpy about getting these AAA games on release, a smarter version of me would just wait until they hit $19.99. It's just so much smarter.

Well, thankfully I loaded up on a bunch of indie games over Steam's Black Friday sale. Maybe I'll just play those and delay all my AAA game purchases accordingly. If these people want $60 anymore, they'd better learn to finish their games. I want to support the devs, but these companies don't deserve full price.
 
Feb 19, 2009
10,457
10
76
sxr7171 said:
Well, thankfully I loaded up on a bunch of indie games over Steam's Black Friday sale. Maybe I'll just play those and delay all my AAA game purchases accordingly. If these people want $60 anymore, they'd better learn to finish their games. I want to support the devs, but these companies don't deserve full price.

Most of the games I currently play are indie, small-time titles like FTL, Don't Starve, Survivalist, SPAZ, etc. The AAA scene has a few titles per year that I'm interested in; otherwise the gameplay seems very poor of late (it seems like all the focus is on graphics). In particular, I was very disappointed with Civ: BE; full AAA price for what is essentially a re-skin and mod of Civ 5, all that focus on pretty effects, and the AI is utter rubbish. A 4X TBS game with a bad AI is unplayably bad.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Well, I'm finding this really fascinating. I just ordered an R9 270X to replace my 7870, which died back in the summer; hardly an upgrade from the 7870, but I figured I would be content with it. So looking at those benchmarks, I instinctively checked where the 270X placed... and it beats the GTX 680? Gives the 770 a run for its money? This is an Nvidia-supported game, right? It makes very little sense for the card that unseated the 7970 (non-GHz) to be losing to a graphics chip a whole tier below the 7970's Tahiti. Heck, in the recent AMD-supported game, Dragon Age: Inquisition, the 680 beats the 270X. It's by a small margin, and the 270X beats its old Pitcairn rivals the 660 Ti and 660, but it makes at least some sense that an AMD-supported game would hold an advantage for AMD cards. This... this is counterintuitive.

You can argue that "Oh, with Gameworks effects like 'enhanced godrays', Nvidia performs better!", but that's one effect. When the majority of advanced effects are maxed out, AMD hardware holds the advantage, particularly over Kepler-based graphics cards.

These are the explanations I can think of:

--The Kepler architecture is simply getting long in the tooth, and didn't have as much longevity for advanced graphics features as AMD's GCN architecture did.
--Despite Ubisoft and Nvidia's best efforts, the engine running Far Cry 4 specifically is simply better suited for AMD cards.
--Nvidia put little priority and effort in optimizing for Kepler cards, since they would rather people upgrade to Maxwell cards.

None of the above particularly reflects well on Nvidia. I mean, it's good that the game isn't a complete technical train wreck like Assassin's Creed Unity was. And it's good that you can't accuse Nvidia of really sabotaging AMD performance in this game. But c'mon, Nvidia, this subpar optimization is unbecoming of you.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
toyota said:
Come on now. First off, that's not a Titan cooler. Next, that is in SLI. And all you have to do is simply raise the power/temp target and it will not throttle anymore. With that raised, and even at 1516 MHz, the only time mine will throttle at all is if it's hitting the TDP limits.

That's true that all you have to do is raise the power/temp targets (and increase the blower speed). That's what Hardware.fr did in this chart (uber).


They set the blower to 85% and raised the temp limit to 94°C to match Hawaii. That kind of negates the "Hawaii is so hot and loud" argument, though, if you basically match Hawaii's settings with GK110. Still, almost no sites reported the GK110 throttling.

Most sites test out of the box, though, for obvious reasons.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I agree with you, Carfax83. Since I own a 3930K rig at 4.6 GHz with two R9 290s in CF, and you have a 4930K rig at 4.5 GHz with two GTX 970s in SLI, we should both run a segment of Far Cry 4 to compare performance. :thumbsup:

Sure.. I will get Far Cry 4 for free because I bought AC Unity plus the season pass, so we can do the comparison then.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
3DVagabond said:
That's true, that all you have to do is raise the power/temp targets (and increase the blower speed). That's what Hardware.fr did in this chart (uber).

They set the blower to 85% and raised the temp limit to 94°C to match Hawaii. That kind of negates the "Hawaii is so hot and loud" argument, though, if you basically match Hawaii's settings with GK110. Still, almost no sites reported the GK110 throttling.

Most sites test out of the box, though, for obvious reasons.
Because when it did throttle, it was usually only one or two bins, which is not even noticeable. Running the 290X in quiet mode took a huge chunk out of its performance.


http://www.hardwarecanucks.com/foru...7-nvidia-geforce-gtx-780-ti-3gb-review-3.html

"The GTX 780 Ti’s performance is a model of consistency which should have been evident by now considering the results we’ve already seen in this section. The R9 290X does match its continual framerate but only when used in Uber Mode which boosts fan speeds to uncomfortable levels in an effort to achieve the highest possible stable clock speed.

If anything, these tests should be vindication for NVIDIA’s approach to their GeForce Boost algorithms and their commitment to deliver a pleasant gaming experience. The GTX 780 Ti could have been a loud mess which performed even faster but instead it remains a relatively docile card that can still pump out framerates without sacrificing in other key areas."
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Red Hawk said:
Well, I'm finding this really fascinating. I just ordered an R9 270X to replace my 7870, which died back in the summer; hardly an upgrade from the 7870, but I figured I would be content with it. So looking at those benchmarks, I instinctively checked where the 270X placed... and it beats the GTX 680? Gives the 770 a run for its money? This is an Nvidia-supported game, right? It makes very little sense for the card that unseated the 7970 (non-GHz) to be losing to a graphics chip a whole tier below the 7970's Tahiti. Heck, in the recent AMD-supported game, Dragon Age: Inquisition, the 680 beats the 270X. It's by a small margin, and the 270X beats its old Pitcairn rivals the 660 Ti and 660, but it makes at least some sense that an AMD-supported game would hold an advantage for AMD cards. This... this is counterintuitive.

You can argue that "Oh, with Gameworks effects like 'enhanced godrays', Nvidia performs better!", but that's one effect. When the majority of advanced effects are maxed out, AMD hardware holds the advantage, particularly over Kepler-based graphics cards.

These are the explanations I can think of:

--The Kepler architecture is simply getting long in the tooth, and didn't have as much longevity for advanced graphics features as AMD's GCN architecture did.
--Despite Ubisoft and Nvidia's best efforts, the engine running Far Cry 4 specifically is simply better suited for AMD cards.
--Nvidia put little priority and effort in optimizing for Kepler cards, since they would rather people upgrade to Maxwell cards.

None of the above particularly reflects well on Nvidia. I mean, it's good that the game isn't a complete technical train wreck like Assassin's Creed Unity was. And it's good that you can't accuse Nvidia of really sabotaging AMD performance in this game. But c'mon, Nvidia, this subpar optimization is unbecoming of you.

This is a recurring theme in many recent games, with the R9 290X and sometimes even the R9 290 beating the 780 Ti, when a year back the 780 Ti had them clearly beat. TechSpot concluded that Nvidia has optimization work left.

http://www.techspot.com/review/917-far-cry-4-benchmarks/page6.html

"Getting back to Nvidia's poor performance... we can confirm that the 344.75 driver was used while Far Cry 4 has been patched to the latest version through Uplay. We asked Nvidia if the performance we saw was unusual or different to what they have seen and they have yet to reply. As it stands, we believe AMD is getting the most out of its Radeon graphics cards in Far Cry 4 and don't expect to see many performance improvements in the future, with the exception of CrossFire setups. Nvidia on the other hand have some work ahead, which is hard to believe with Far Cry 4 being Nvidia-sponsored."

But since this is not a one-game occurrence, I have to agree that Kepler driver optimization might have taken a backseat after the Maxwell launch. Sadly, this is not a good long-term strategy; customers remember how you treat them.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Red Hawk said:
Well, I'm finding this really fascinating. I just ordered an R9 270X to replace my 7870, which died back in the summer; hardly an upgrade from the 7870, but I figured I would be content with it. So looking at those benchmarks, I instinctively checked where the 270X placed... and it beats the GTX 680? Gives the 770 a run for its money? This is an Nvidia-supported game, right? It makes very little sense for the card that unseated the 7970 (non-GHz) to be losing to a graphics chip a whole tier below the 7970's Tahiti. Heck, in the recent AMD-supported game, Dragon Age: Inquisition, the 680 beats the 270X. It's by a small margin, and the 270X beats its old Pitcairn rivals the 660 Ti and 660, but it makes at least some sense that an AMD-supported game would hold an advantage for AMD cards. This... this is counterintuitive.

So you looked at one set of benchmarks and then arrived at this conclusion?

From what I've seen, the current results seem split down the middle. The PCGH.de benchmarks favor Nvidia, and so do the Gamersnexus ones. GameGPU and Techspot seem to favor AMD, and what those benchmarks have in common is that they used the stock ultra settings, unlike the PCGH.de and Gamersnexus benchmarks which have the Nvidia effects enabled.

PCLabs.pl's benchmark had HBAO+ and PCSS enabled, and the results were fairly close between the two but with Nvidia having a slight edge.

So when you summarize everything, it seems that AMD is very competitive when running the game without Gameworks effects, and is slightly behind when HBAO+ and PCSS are enabled.

But when the tessellation effects are enabled, Nvidia pulls ahead by a wide margin. When the HairWorks effects are finally released, I expect that margin to grow..

Anyway, like I said earlier, the final results for Far Cry 4 won't become known until a few months from now when the patches and driver optimizations have been fully implemented.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
Carfax83 said:
So you looked at one set of benchmarks and then arrived at this conclusion?

From what I've seen, the current results seem split down the middle. The PCGH.de benchmarks favor Nvidia, and so do the Gamersnexus ones. GameGPU and Techspot seem to favor AMD, and what those benchmarks have in common is that they used the stock ultra settings, unlike the PCGH.de and Gamersnexus benchmarks which have the Nvidia effects enabled.

PCLabs.pl's benchmark had HBAO+ and PCSS enabled, and the results were fairly close between the two but with Nvidia having a slight edge.

So when you summarize everything, it seems that AMD is very competitive when running the game without Gameworks effects, and is slightly behind when HBAO+ and PCSS are enabled.

But when the tessellation effects are enabled, Nvidia pulls ahead by a wide margin. When the HairWorks effects are finally released, I expect that margin to grow..

Anyway, like I said earlier, the final results for Far Cry 4 won't become known until a few months from now when the patches and driver optimizations have been fully implemented.

The only setting which works poorly on AMD cards is Enhanced Godrays; HBAO+ has the same performance hit on Nvidia and AMD cards. Ultra with SMAA and HBAO+ but without Godrays runs very well on AMD cards. In most reviews we see the R9 290X very close to the GTX 980 and beating the GTX 780 Ti at those settings.

http://hardocp.com/article/2014/11/21/far_cry_4_video_card_performance_iq_preview/5#.VHlctHutHhA

The majority of reviews turn off Godrays because only the cards above the GTX 770 seem to be up to the task of playing the game with it.

http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,5

http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,6

http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,7

http://www.purepc.pl/karty_graficzn...e_test_kart_graficznych_i_procesorow?page=0,8

http://www.sweclockers.com/artikel/19647-snabbtest-grafikprestanda-i-far-cry-4

http://pclab.pl/art57559-7.html

http://www.notebookcheck.com/Far-Cry-4-Benchmarks.130346.0.html

We have also yet to see the benchmarks updated with Patch 1.4, which seems to solve the stuttering on AMD cards and improve performance. Ubisoft still has work to do with this game.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Is that purepc.pl review underclocking the GPUs? The Gigabyte G1 GTX 970 in that review was running at 1050 MHz, but the actual stock clock speed is 1178 MHz. I know this because I have two of them!

And it's too bad they didn't test the game with PCSS enabled..
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
I wouldn't use HBAO+ in this game. Indoors it is fine, but outdoors the coverage is very poor, so things in the distance don't get any AO, and SSBC looks better.

SSBC is also 3x faster per frame (0.5 ms vs 1.5 ms)
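To put those per-frame costs in context, here is a quick sketch of what each technique eats out of the frame budget, assuming a 60 FPS target and the 0.5 ms / 1.5 ms figures above:

```python
# Share of the frame budget spent on ambient occlusion, using the
# per-frame costs quoted above (0.5 ms for SSBC vs 1.5 ms for HBAO+).
FRAME_BUDGET_MS = 1000.0 / 60  # ~16.7 ms per frame at a 60 FPS target

for name, cost_ms in [("SSBC", 0.5), ("HBAO+", 1.5)]:
    share = 100 * cost_ms / FRAME_BUDGET_MS
    print(f"{name}: {cost_ms} ms -> {share:.0f}% of a 60 FPS frame")
```

So the absolute difference is about 1 ms per frame, or roughly 6% of a 60 FPS frame budget.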
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Gloomy said:
I wouldn't use HBAO+ in this game. Indoors it is fine, but outdoors the coverage is very poor, so things in the distance don't get any AO, and SSBC looks better.

It didn't take me long to utterly disprove this:

Outdoor SSBC 1
Outdoor HBAO+ 1
Outdoor SSBC 2
Outdoor HBAO+ 2
Outdoor SSBC 3
Outdoor HBAO+ 3

No difference in occlusion distance, and HBAO+ is much more accurate and reveals significantly more detail. SSBC is reminiscent of that horrible black aura that was in Far Cry 3 when HBAO was enabled..

Gloomy said:
SSBC is also 3x faster per frame (0.5 ms vs 1.5 ms)

Yeah, who'd have thought that higher quality would have a higher price? :whistle:
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Are you blind? There is a difference in distance.

How about you stop pointing the camera at the floor?

http://imgur.com/a/aLe1R#0

Do you even know what ambient occlusion does?

Apparently not. The reason there is a difference in the distance in those screenshots is because it's in broad fricking daylight..

SSBC is shadowing objects that shouldn't be shadowed in a daylight scene, while HBAO+ is actually processing the self-shadowing more accurately by accounting for the amount of light and revealing more detail..
 