[gamegpu] Dragon Age Inquisition performance

Page 7 - AnandTech community forums

StinkyPinky

Diamond Member
Jul 6, 2002
6,952
1,260
126
Turn off MSAA imo. It makes no visual difference and murders the frame rate. TotalBiscuit did a test with it in his YouTube port report for the game.

I honestly cannot tell the difference between having it on and off. I'd rather bump the textures up to Fade Touched, which looks better to me. With textures at Fade Touched, no MSAA, and everything else set to ultra (except tessellation and perhaps shadows... can't remember, those may be set to high), it runs reasonably well. The only hitching I get is the occasional stutter in the main "cities". It's an RPG, so I don't care about getting a constant 60 fps anyway.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I have a 2GB card as well, and I can't get console parity with recent titles. I need to lower texture quality if I don't want a stutterfest.

Already had Stutterfest 2014 this year with Titanfall; it was fun watching a 2GB 770 absolutely fail at 768p with ultra textures, and actually stutter a bit with the very high ones too. Luckily this is the only recent title that this has happened with.

Looks like all the newer titles simply need a boatload of both performance and VRAM on the GPU. Safe to say my 770 has gone from overkill at 768p last year to just enough this year, in the cases where VRAM isn't capped first.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Already had Stutterfest 2014 this year with Titanfall; it was fun watching a 2GB 770 absolutely fail at 768p with ultra textures, and actually stutter a bit with the very high ones too. Luckily this is the only recent title that this has happened with.

Looks like all the newer titles simply need a boatload of both performance and VRAM on the GPU. Safe to say my 770 has gone from overkill at 768p last year to just enough this year, in the cases where VRAM isn't capped first.

Nvidia's planned obsolescence; they shorted the VRAM intentionally so you'd be forced to buy a new card sooner.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
Nvidia's planned obsolescence; they shorted the VRAM intentionally so you'd be forced to buy a new card sooner.
there were 4gb models but nearly everyone here would have told him not to get the 4gb version anyway.

who the heck would want to play at 1024x768 on purpose though? good grief.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I have a 2GB card as well, and I can't get console parity with recent titles. I need to lower texture quality if I don't want a stutterfest.

Already had Stutterfest 2014 this year with Titanfall; it was fun watching a 2GB 770 absolutely fail at 768p with ultra textures, and actually stutter a bit with the very high ones too. Luckily this is the only recent title that this has happened with.

Looks like all the newer titles simply need a boatload of both performance and VRAM on the GPU. Safe to say my 770 has gone from overkill at 768p last year to just enough this year, in the cases where VRAM isn't capped first.

Well, the discussion of console games being poorly optimized for PC and getting lesser returns on equivalent or better PC hardware is a different discussion from whether 2 GB should be enough for parity with consoles. Titanfall, and more recently Assassin's Creed Unity, were poorly optimized games. Failing to run those games with high texture quality isn't a sign that 2 GB is inherently insufficient; it's a sign that the games had s*** optimization. Titanfall was running on the Source engine, which doesn't even have DirectX 11 features, for crying out loud.

2 GB SHOULD be enough for console parity. Dragon Age Inquisition's benchmarks show that by themselves.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
there were 4gb models but nearly everyone here would have told him not to get the 4gb version anyway.

Wasn't me making those recommendations.

who the heck would want to play at 1024x768 on purpose though? good grief.

I think they meant 1366x768, common resolution for laptops and uber cheap monitors.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
there were 4gb models but nearly everyone here would have told him not to get the 4gb version anyway.

who the heck would want to play at 1024x768 on purpose though? good grief.

If the 1024x768 comment was towards me, I don't game at that. I game at 1366x768. Don't worry, the 5:4 screens you have harassed me about twice already in the past aren't even in my house anymore, lol. I had a Q3A and UT99 kick for a while, along with BF1942, and those old monitors were nice for those games. :thumbsup:

Last time I came close to 1024x768 was in 2006 with BF2 on my 6200LE, and I actually had piss-poor performance at 800x600.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
If the 1024x768 comment was towards me, I don't game at that. I game at 1366x768. Don't worry, the 5:4 screens you have harassed me about twice already in the past aren't even in my house anymore, lol. I had a Q3A and UT99 kick for a while, along with BF1942, and those old monitors were nice for those games. :thumbsup:

Last time I came close to 1024x768 was in 2006 with BF2 on my 6200LE, and I actually had piss-poor performance at 800x600.
lol okay. yeah I forgot all about 1366x768 when I said that just then. but at least you are widescreen now and no more 5:4 or 4:3.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
lol okay. yeah I forgot all about 1366x768 when I said that just then. but at least you are widescreen now and no more 5:4 or 4:3.

I've been on and off widescreen monitors for different reasons since 2009, ranging from just selling them, to having one stolen, to giving my cousin a 1080p 24'' because he had a 17'' 768p screen and an eye injury that made using it impossible.

Love my cousin too much to make him use this screen, but I want a true upgrade from 1080p soon anyway, after a GPU upgrade. Next year sometime, hopefully.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
With everything on Ultra with 2x MSAA I get a 44.5 average FPS and a 36.8 low; with everything on Ultra and no MSAA it's 50.8 and 40.1. I didn't see much of a difference with those two turned down to high, so I kept it.

Thank u sir! And your i5 3570k is running at what speeds? Wonder if my i7 4790k @ 4.6ghz will give me a few more fps....but 50fps avg is enough for me...think i'll take the plunge then! Thanks again!
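For anyone weighing the trade-off, the relative cost of 2x MSAA in the numbers quoted above is easy to work out (a quick sketch; the FPS figures are taken from the post, the helper name is just for illustration):

```python
def pct_cost(fps_off, fps_on):
    """Relative FPS cost of enabling a setting, in percent."""
    return (fps_off - fps_on) / fps_off * 100

# Numbers quoted above: everything Ultra, 2x MSAA vs. MSAA off
avg_cost = pct_cost(50.8, 44.5)  # average FPS
low_cost = pct_cost(40.1, 36.8)  # minimum FPS
print(f"2x MSAA costs {avg_cost:.1f}% average FPS, {low_cost:.1f}% minimum FPS")
# → 2x MSAA costs 12.4% average FPS, 8.2% minimum FPS
```

So 2x MSAA gives up roughly an eighth of the average frame rate in that setup, which is why several posters here prefer to leave it off.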
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I probably mentioned this earlier in this thread but my 5930K @ 3.7GHz and 780 Ti @ 1215MHz runs this at 1200p with everything maxed and AA disabled at a smooth 60FPS Vsync locked, only dips are in Redcliffe and the Emerald Graves in the beginning with a lot of the battles. Dips down to 50, seen dips to 35 in pitched battles with everything going off for a second before it shoots back up to past 50. Kepler still needs optimization. This game does use all 6 cores consistently.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I probably mentioned this earlier in this thread but my 5930K @ 3.7GHz and 780 Ti @ 1215MHz runs this at 1200p with everything maxed and AA disabled at a smooth 60FPS Vsync locked, only dips are in Redcliffe and the Emerald Graves in the beginning with a lot of the battles. Dips down to 50, seen dips to 35 in pitched battles with everything going off for a second before it shoots back up to past 50. Kepler still needs optimization. This game does use all 6 cores consistently.

I think DAI is going to get some performance improvements in the next major driver releases from both AMD and Nvidia. As good as the game looks, I don't think my 290X is topped out by it either. Within the next 30 days, the Catalyst 14.12s will probably have a few lines devoted to DAI in the release notes.
 

ramj70

Senior member
Aug 24, 2004
764
1
81
Thank u sir! And your i5 3570k is running at what speeds? Wonder if my i7 4790k @ 4.6ghz will give me a few more fps....but 50fps avg is enough for me...think i'll take the plunge then! Thanks again!

My CPU is at stock speeds, yeah I know, shoot me.

One of these days I need to bump up the speed some
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I got my 270X and tried it out with Dragon Age Inquisition. My subjective evaluation is that the game runs fine at 1080p, everything maxed out except no MSAA -- during walking around and combat. Cutscenes are full of hitching and dropping frames galore. I was playing on a 1080p TV, and my brother and his friend were watching, and the friend specifically noted that it looked like it was dropping frames. Since performance outside of cutscenes is fine, I can't be sure if this is the cutscene hitching "glitch" related to cutscenes being locked at 30 frames per second, or just the 270X not quite cutting it at full spec. I guess I'll have to wait for BioWare to possibly patch that issue, and for AMD to release more thorough driver optimizations, to really find out. I was running the 270X at 1100 MHz, in DirectX 11, by the way. Haven't tried Mantle yet.

The game looks great though, in my opinion. The Frostbite engine is far superior to the in-house engine that BioWare used for the previous games. Dragon Age 2 had a few nice DirectX 11 features (for 2012), but its lighting was still pretty limited, and characters didn't look "real" at all. In Inquisition, lighting is much better. I noticed how in one scene a character in the foreground cast a shadow on objects and characters all the way in the background. Character models are much more detailed and less cartoony than in previous games, though they can still seem wax figure-ish at times. Animations when running around are much improved and seem more natural, with more of a sense of momentum. I noticed at least one dynamic physics interaction with the environment -- my character could knock around a low-hanging chandelier. Such physics interactions were completely absent from past Dragon Age games, and I hope there are more objects like that in the game.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Turn off MSAA imo. It makes no visual difference and murders the frame rate. TotalBiscuit did a test with it in his YouTube port report for the game.

I honestly cannot tell the difference between having it on and off. I'd rather bump the textures up to Fade Touched, which looks better to me. With textures at Fade Touched, no MSAA, and everything else set to ultra (except tessellation and perhaps shadows... can't remember, those may be set to high), it runs reasonably well. The only hitching I get is the occasional stutter in the main "cities". It's an RPG, so I don't care about getting a constant 60 fps anyway.

Really? MSAA looks a lot better to me. Running MFAA 4x here and it doesn't kill the frame rate too much. Lowering the tessellation distance and upping MSAA/MFAA kept the fps solid and really improved the picture IMHO.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
So I ran some benchmarks with my 1440x900 monitor (currently the only monitor I have access to, will be getting another 1080p monitor soon). I wanted to systematically go through and find what settings bottleneck my graphics card the most, by knocking each down from the highest individually and comparing to the "control" setting of max settings with no MSAA.

The answer: No setting in particular besides MSAA seems to give the system a boost by dropping down to the second highest. The average framerate in the in-game benchmark with my 270X -- at 1440x900, 1120 MHz graphics clock, max graphics settings except no MSAA -- was 44.9. No matter which setting I ticked down one notch, my framerate stayed around 44 or 45, average. This includes lowering tessellation or going from "Full HBAO" to just "HBAO".

MSAA, of course, is a different story. 2x MSAA with max settings dropped it down to 39.5. 4x MSAA dropped it down to 35.9.

I wanted to try Mantle and benchmark it, but it was a non-starter. Why? Because the game won't play in fullscreen mode in Mantle on my system! I'm not going to bother benchmarking Mantle if it won't even work properly.
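The one-setting-at-a-time approach described above can be sketched as a small harness (purely illustrative: the setting names/values and `run_benchmark()` are hypothetical stand-ins for launching the game's built-in benchmark and reading back the average FPS):

```python
# Control config: everything maxed, MSAA left off.
MAX_SETTINGS = {
    "textures": "Fade Touched",
    "tessellation": "Ultra",
    "ambient_occlusion": "Full HBAO",
    "msaa": "Off",
}
# For each setting, the value one notch below its maximum.
ONE_NOTCH_DOWN = {
    "textures": "Ultra",
    "tessellation": "High",
    "ambient_occlusion": "HBAO",
}

def sweep(run_benchmark):
    """Benchmark the control config, then each setting lowered one notch."""
    results = {"control": run_benchmark(MAX_SETTINGS)}
    for setting, lowered in ONE_NOTCH_DOWN.items():
        config = dict(MAX_SETTINGS, **{setting: lowered})
        results[setting] = run_benchmark(config)
    return results
```

Comparing each result against the control isolates which single setting is actually costing frames -- here, only MSAA moved the needle.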
 
Aug 11, 2008
10,451
642
126
My game seems to play in windowed mode no matter what. I do have full screen selected in the options. Not really a big deal, as it allows me to tab in and out easily.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
So I ran some benchmarks with my 1440x900 monitor (currently the only monitor I have access to, will be getting another 1080p monitor soon). I wanted to systematically go through and find what settings bottleneck my graphics card the most, by knocking each down from the highest individually and comparing to the "control" setting of max settings with no MSAA.

The answer: No setting in particular besides MSAA seems to give the system a boost by dropping down to the second highest. The average framerate in the in-game benchmark with my 270X -- at 1440x900, 1120 MHz graphics clock, max graphics settings except no MSAA -- was 44.9. No matter which setting I ticked down one notch, my framerate stayed around 44 or 45, average. This includes lowering tessellation or going from "Full HBAO" to just "HBAO".

MSAA, of course, is a different story. 2x MSAA with max settings dropped it down to 39.5. 4x MSAA dropped it down to 35.9.

I wanted to try Mantle and benchmark it, but it was a non-starter. Why? Because the game won't play in fullscreen mode in Mantle on my system! I'm not going to bother benchmarking Mantle if it won't even work properly.

You're playing at a relatively low end resolution and your FPS isn't changing no matter what you're doing? CPU bound. As for Mantle, are you on the latest driver?


My game seems to play in windowed mode no matter what. I do have full screen selected in the options. Not really a big deal, as it allows me to tab in and out easily.

Would you happen to have Teamviewer running in the background? I tend to have full screen issues if it's running on mine.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You're playing at a relatively low end resolution and your FPS isn't changing no matter what you're doing? CPU bound. As for Mantle, are you on the latest driver?

He is running a 2500K at 4.0. Would he be held back by the CPU to 45 fps?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
You're playing at a relatively low end resolution and your FPS isn't changing no matter what you're doing? CPU bound. As for Mantle, are you on the latest driver?

CPU bound to 44-45 FPS on a 2500K overclocked to 4 GHz? When Techspot got 58 frames per second out of a 3570K at stock clock speeds? Seems unlikely to me.



And yep, I'm on the latest beta driver, 14.11.2. I can switch Mantle on; it's just that whenever I start the game with Mantle on, it takes me to windowed mode even if "full screen mode" is selected, and there's no way to change it to full screen once launched. DX11 automatically starts in fullscreen mode.
 

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
CPU bound to 44-45 FPS on a 2500K overclocked to 4 GHz? When Techspot got 58 frames per second out of a 3570K at stock clock speeds? Seems unlikely to me.



And yep, I'm on the latest beta driver, 14.11.2. I can switch Mantle on; it's just that whenever I start the game with Mantle on, it takes me to windowed mode even if "full screen mode" is selected and there's no way to change it to full screen. DX11 automatically starts in fullscreen mode.
Yeah, I didn't look at your sig or specs. Whoops. Not sure what's going on there.
 
Aug 11, 2008
10,451
642
126
You're playing at a relatively low end resolution and your FPS isn't changing no matter what you're doing? CPU bound. As for Mantle, are you on the latest driver?




Would you happen to have Teamviewer running in the background? I tend to have full screen issues if it's running on mine.

Don't think so, unless it is part of Raptor.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
He is running a 2500K at 4.0. Would he be held back by the CPU to 45 fps?

Possibly. This and Unity can both use a hexa-core well, and even (just) an octa-core. There are jumps in Haswell's performance overall too - something in the architecture, methinks?
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Possibly. This and Unity can both use a hexa core well as well as (just) an octa core. There are jumps in Haswell's performance overall too - something in the architecture methinks?

Seems to run fine on Ivy Bridge.
 