
BlitzWulf

Member
Mar 3, 2016
165
73
101
Hm, the G80 (8800 GTX) was released in 11/2006 and, with slight tweaks, went on to ward off the X1000, HD 2000, and HD 3000 series from ATI.

The R300 uarch gave ATI the performance lead over Nvidia for over two years; IMHO that would be the closest parallel to Hawaii's success.

http://www.anandtech.com/show/8417/amd-celebrates-30-years-of-gaming-and-graphics-innovation

Thanks for your replies! Even a relative newb like me recognizes those GPU series. From a personal perspective, I'm tickled pink to have a GPU (390X) that deserves to be mentioned alongside those legendary chips.

I might just keep it until it's obsolete for high-end gaming in 2022 and then display it on a shelf to look at while stroking my gray beard and saying, "They don't make 'em like they used to!"
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Thanks for your replies! Even a relative newb like me recognizes those GPU series. From a personal perspective, I'm tickled pink to have a GPU (390X) that deserves to be mentioned alongside those legendary chips.

I might just keep it until it's obsolete for high-end gaming in 2022 and then display it on a shelf to look at while stroking my gray beard and saying, "They don't make 'em like they used to!"

I'm giving mine to friends, and I think I might want to have it back once it goes well and truly obsolete. It'd look nice next to my 8800 GT.

Also I do want to point out that NV is making shortsighted architectures, but I can't really blame them since the near future of gaming is tautologically going to look like what AMD's making.
 
Last edited:

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
I fixed it for ya.

Not really. Maxwell has also fallen behind Hawaii since release. In 2014 the 970 CRUSHED the 290 and beat it in every game, most of the time by 10-15%. The 290X was the competitor for the 970 then.

Fast forward to today and the 290X hangs out at the top of the graph near the 980 most of the time, and in a few games the 290 beats the 970 straight up at 1080p (the 970's best resolution).

To me that shows that what is going on has NOTHING to do with Nvidia or their drivers or their focus. The issue is console optimization and engines meant to get every bit out of GCN.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I recently sold my 780 Ti for $300 even, as it's dead on two years old and getting more seemed an impossibility. Partially surprised it actually sold for that much considering Kepler is useless in 2016. I'm considering picking up a 390 if I don't wait, possibly for Quantum Break and Doom 4. I do agree that AMD's GPUs have greater longevity with drivers, and as the consoles are GCN, it seems to be the architecture to get.
 

Raising

Member
Mar 12, 2016
120
0
16
I view it as bad optimization from AMD on new products, while Nvidia gets most of it from the start.

High CPU overhead is still a plague in DX11 for AMD.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
This is what I've been saying for ages. I'm still a little annoyed that I bought a 780 two months before Maxwell was released, when we were expecting it to come out several months later, after Christmas. Overall I still have the performance I need for my res (helped by the OC), even if it takes a month or so for games to be optimised. It still makes me feel like a second-tier consumer when I spent around £400 on a high-end product, but I can't game as much as I used to, so I'm usually still playing older games whilst Nvidia gets round to optimising Kepler.

Edit: Rise of the Tomb Raider (DX12): despite having 3GB, the GTX 780 Ti is faster than the 290X? Isn't this game memory hungry?

In DX11 mode it can consume a lot of memory, but it doesn't need to. Most of it is for cache purposes, and so far 3GB has proven playable when maxed out (using SMAA instead of SSAA).

DX12 mode is another story.
 

thesmokingman

Platinum Member
May 6, 2010
2,302
231
106
I view it as bad optimization from AMD on new products, while Nvidia gets most of it from the start.

High CPU overhead is still a plague in DX11 for AMD.


DX11 is dead, here comes DX12.


The king is dead, long live the king!





 
Feb 19, 2009
10,457
10
76
If you want to discuss the data in the link, the only thing to say is that it does not match the major review sites out there.

There have been a ton of benchmarks for new games, and data from credible sites has the 780 Ti under-performing versus the R9 290X; it's especially obvious because it's slower than the 970, sometimes by a big margin.
 
Feb 19, 2009
10,457
10
76
I agree,
but by the time the 290x passes the 980 we will all have new cards (or should) and we won't care because the 980/290 will be over 5 years old.. :thumbsup:

The non-reference 290X has already passed the 980 in new games. In most of them it matches the 980, and in some it exceeds it.

We haven't suddenly stopped caring about this GPU class, and you want to know why? It still delivers 50-60 fps, very playable while maxing out modern games.

Unless console ports suddenly ramp up their graphics demands, an R9 290X will still be a capable gaming GPU two or three years from now.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
If you want to discuss the data in the link, the only thing to say is that it does not match the major review sites out there.

There have been a ton of benchmarks for new games, and data from credible sites has the 780 Ti under-performing versus the R9 290X; it's especially obvious because it's slower than the 970, sometimes by a big margin.

The reason for this is that most sites do their tests when the game has been released, with the most recent drivers. Nvidia usually won't have made the Kepler optimisations by then.
 
Feb 19, 2009
10,457
10
76
The reason for this is that most sites do their tests when the game has been released, with the most recent drivers. Nvidia usually won't have made the Kepler optimisations by then.

You know this all started to hit the net with Witcher 3. When it was first released, the 780Ti was way below the 970.

Even the 960 was approaching the 780.

Lots of rage caused NV to optimize for Kepler, improving performance by a bit, and that quelled the anger... but let's re-examine it after nine months or so, with the updated game and drivers, and see what we find.





^ Look at that, the 780Ti is still well below the 970. The 280X/7970 approaches the 780, and the R290/X is above the 780Ti.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
You know this all started to hit the net with Witcher 3. When it was first released, the 780Ti was way below the 970.

Even the 960 was approaching the 780.

Lots of rage caused NV to optimize for Kepler, improving performance by a bit, and that quelled the anger... but let's re-examine it after nine months or so, with the updated game and drivers, and see what we find.





^ Look at that, the 780Ti is still well below the 970. The 280X/7970 approaches the 780, and the R290/X is above the 780Ti.

OK, firstly, those graphs are for high and ultra quality, so not really a comparison of different drivers over time.

Do we actually know that this site retests every card, for every game, every time a new driver is released?

Not only does this seem unlikely due to time and resources, but the 780 and 780 Ti results are much lower than the performance I get with the game maxed out and GameWorks enabled (albeit with one or two reasonable optimisations).

And your definition of "well below" seems to equate to 3-4 fps.
 
Last edited:
Feb 19, 2009
10,457
10
76
It's a comparison of the most recent performance in Witcher 3 at 2 quality settings. lol

The 780Ti is 8 FPS behind the 970/R290X.

40 (34) vs 48 (42) on Ultra.

It was typically 10% faster than the 970.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
OK, firstly, those graphs are for high and ultra quality, so not really a comparison of different drivers over time.

Do we actually know that this site retests every card, for every game, every time a new driver is released?

Not only does this seem unlikely due to time and resources, but the 780 and 780 Ti results are much lower than the performance I get with the game maxed out and GameWorks enabled (albeit with one or two reasonable optimisations).

And your definition of "well below" seems to equate to 3-4 fps.

I'd say the 290X being 20% faster in average fps and 23% higher in minimums than the 780 Ti puts it in the category of "well below".
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
It's a comparison of the most recent performance in Witcher 3 at 2 quality settings. lol

The 780Ti is 8 FPS behind the 970/R290X.

40 (34) vs 48 (42) on Ultra.

It was typically 10% faster than the 970.

Ahh, I was looking at minimums. I can't see the charts whilst writing a post on my phone, but even the numbers you quoted in your reply seem to be 6 fps apart.

I think the 10% or so performance difference between the 780 Ti and the 970 that developed over time is in line with the usual performance increases you get in the initial period after a new architecture's release, due to overall driver optimisations. We see this with every new GPU generation.
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I'd say the 290X being 20% faster in average fps and 23% higher in minimums than the 780 Ti puts it in the category of "well below".

The comparison being made at the time was between the 780ti and the 970.

Can someone provide a link to the article and testing methods?
 
Feb 19, 2009
10,457
10
76
The comparison being made at the time was between the 780ti and the 970.

Can someone provide a link to the article and testing methods?

http://gamegpu.com/rpg/rollevye/the-witcher-3-wild-hunt-v-1-06-test-gpu.html

They are one of the only sites I know of that revisit games after release, even re-benching patches.

These were release benches with NV's Game Ready drivers (AMD's drivers weren't available for a few days), with HairWorks off. You can see the 780 Ti is well below the 970, a card it's typically 10% faster than, so that's like a 30% performance deficit.

http://www.purepc.pl/karty_graficzn...n_test_procesorow_i_kart_graficznych?page=0,8



This was GameGPU's release benchmark:

http://gamegpu.com/rpg/rollevye/the-witcher-3-wild-hunt-test-gpu.html



Basically, Witcher 3 and Project Cars were the turning point, when Kepler started to tank.

It's not a small drop either; going from 10% above the 970 to 20% below it is roughly a 30% performance loss.

We've discussed the reasons why in the other thread (http://forums.anandtech.com/showthread.php?t=2467773), and it correlates to a 1/3 loss of performance when games do not use Kepler-optimized wavefronts and NV does not actively optimize for them.
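To put a rough number on that swing, here's a minimal sanity check, assuming only the relative standings quoted in this thread (roughly 10% ahead of the 970 at launch, roughly 20% behind it now) rather than any specific benchmark data:

Code:
# Hypothetical back-of-the-envelope check of the "roughly 30%" figure above.
gtx_970 = 1.00            # baseline
launch_780ti = 1.10       # ~10% above the 970 at launch
today_780ti = 0.80        # ~20% below the 970 in newer titles

relative_drop = 1 - today_780ti / launch_780ti
print(f"780 Ti vs its own launch standing: -{relative_drop:.0%}")  # ~27%
# Measured against the 970 it is a 30-percentage-point swing (110% -> 80%),
# which is where the "roughly 30%" shorthand comes from.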
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
To me that shows that what is going on has NOTHING to do with Nvidia or their drivers or their focus. The issue is console optimization and engines meant to get every bit out of GCN.

After reading the results from the article I agree, and I think this is the most likely explanation. Nvidia isn't doing anything to gimp or lower performance on Kepler; rather, newer games don't run as well on Kepler as they do on GCN. It looks to me like developers are optimizing their code for GCN to get the most out of the consoles. AMD cards are seeing the benefit of these game optimizations, which explains why we still have Hawaii and other GCN cards fighting strong despite being very old.

We don't see the same regressions on Maxwell and my assumption is that Maxwell's architecture responds better to GCN optimizations than Kepler did. Not as good as GCN itself of course, but significantly better than Kepler. Given GCN and consoles have been out a while and we aren't seeing Maxwell fall on its face, I don't think we'll have the same level of regressions as Kepler has from GCN optimizations alone.

I think Mahigans post explains this theory well - http://forums.anandtech.com/showpost.php?p=38135040&postcount=24

I think the question for Maxwell is less about GCN optimization and more about whether developers will optimize for Async Compute. If so, we'll continue to see AMD cards punch above their weight class relative to the Nvidia cards, though probably to a lesser degree than what we saw with the GCN optimizations and Kepler. While Async Compute has shown it won't hurt you if your GPU doesn't support it, it can improve performance noticeably on GPUs that do support it.
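As a rough illustration of that last point, here's a toy timing model using my own hypothetical numbers (not from any benchmark): with async compute, part of the compute pass can be hidden under the graphics work, while hardware that can't overlap effectively runs the two back to back.

Code:
# Hypothetical per-frame numbers; overlap_efficiency is an assumed figure.
graphics_ms = 12.0        # assumed graphics workload per frame
compute_ms = 3.0          # assumed compute workload per frame
overlap_efficiency = 0.8  # assumed fraction of compute hidden under graphics

serial = graphics_ms + compute_ms                     # no real async support
overlapped = graphics_ms + compute_ms * (1 - overlap_efficiency)

print(f"serialized: {serial:.1f} ms/frame ({1000 / serial:.0f} fps)")
print(f"async:      {overlapped:.1f} ms/frame ({1000 / overlapped:.0f} fps)")
# 15.0 ms (~67 fps) serialized vs 12.6 ms (~79 fps) with most of the compute
# hidden: roughly neutral without overlap, a noticeable gain with it.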
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
http://gamegpu.com/rpg/rollevye/the-witcher-3-wild-hunt-v-1-06-test-gpu.html

They are one of the only sites I know of that revisit games after release, even re-benching patches.

These were release benches with NV's Game Ready drivers (AMD's drivers weren't available for a few days), with HairWorks off. You can see the 780 Ti is well below the 970, a card it's typically 10% faster than, so that's like a 30% performance deficit.

http://www.purepc.pl/karty_graficzn...n_test_procesorow_i_kart_graficznych?page=0,8



This was GameGPU's release benchmark:

http://gamegpu.com/rpg/rollevye/the-witcher-3-wild-hunt-test-gpu.html



Basically, Witcher 3 and Project Cars were the turning point, when Kepler started to tank.

It's not a small drop either; going from 10% above the 970 to 20% below it is roughly a 30% performance loss.

We've discussed the reasons why in the other thread (http://forums.anandtech.com/showthread.php?t=2467773), and it correlates to a 1/3 loss of performance when games do not use Kepler-optimized wavefronts and NV does not actively optimize for them.

Interesting that they use the maze as their benchmark run. That area is fairly unique compared to the rest of the game. I will download their save file and test it for myself when I get a chance.

One thing to note is that between various patch levels, I was no longer able to have the vegetation distance maxed out and play it smoothly. I think this was resolved in later patches, but I will have to double check. It was definitely a GPU issue because it occurred both before and after my recent CPU upgrade.

Also, FWIW, for me the difference in performance started with Dragon Age: Inquisition, six months before The Witcher 3 was released and the first big game release I wanted to play after Maxwell came out. It took Nvidia two months to optimise DAI for Kepler, but now that it has, it is very playable when maxed out on my OC'd 780/ultrawide combo.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,582
6,011
136
I don't get the point of this thread. Kepler owners are being treated as second tier customers, with the focus on Maxwell. That's not a surprise, nor should it be surprising. It's simply good for business, as long as people keep upgrading.

The same thing will likely happen with Pascal. We'll see 980 Ti and Maxwell-based cards get less attention. With a node jump we should see a 980 Ti killer at half the price soon enough. Which would give even less incentive to devote development resources to Maxwell.
 
Feb 19, 2009
10,457
10
76
Interesting that they use the maze as their benchmark run. That area is fairly unique compared to the rest of the game. I will download their save file and test it for myself when I get a chance.

One thing to note is that between various patch levels, I was no longer able to have the vegetation distance maxed out and play it smoothly. I think this was resolved in later patches, but I will have to double check. It was definitely a GPU issue because it occurred both before and after my recent CPU upgrade.

Also, FWIW, for me the difference in performance started with Dragon Age: Inquisition, six months before The Witcher 3 was released and the first big game release I wanted to play after Maxwell came out. It took Nvidia two months to optimise DAI for Kepler, but now that it has, it is very playable when maxed out on my OC'd 780/ultrawide combo.

In their earlier benches they did an outdoor scene. It was giving similar results, though the maze area seems, for some reason, even more demanding.

Generally, wavefronts of 64 suit GCN, while 32 suits Kepler/Maxwell, but 64 only carries a slight performance deficit on Maxwell. There was a diagram from a paper that studied this in more detail and Mahigan linked it, but I can't find his post atm.
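For what it's worth, here's a minimal sketch (my own simplification, not from the paper or Mahigan's post) of just the lane-occupancy side of that: a thread group sized for one vendor's SIMD width can leave lanes idle on the other. It deliberately ignores the register-pressure and scheduling effects behind the remaining slight deficit on Maxwell.

Code:
import math

GCN_WAVEFRONT = 64   # AMD GCN executes threads in groups of 64
NV_WARP = 32         # NVIDIA Kepler/Maxwell executes threads in groups of 32

def lane_utilization(group_size: int, simd_width: int) -> float:
    """Fraction of SIMD lanes doing useful work for one thread group."""
    hw_groups = math.ceil(group_size / simd_width)
    return group_size / (hw_groups * simd_width)

for group in (32, 48, 64):
    print(f"group={group:2d}  GCN {lane_utilization(group, GCN_WAVEFRONT):.0%}"
          f"  NV {lane_utilization(group, NV_WARP):.0%}")
# A 32-thread group wastes half a GCN wavefront; a 64-thread group fills a
# wavefront exactly and also packs two NVIDIA warps cleanly, which is why 64
# is only slightly worse on Maxwell in this naive occupancy view.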

The take-away point is that when game engines are optimized for GCN, they run poorly on Kepler and okay on Maxwell.

NV can fix this and optimize these games in DX11 through their drivers (if they choose to do so)!

A classic example: Shadow of Mordor. R9 290X >= 980. Clearly GCN-optimized.

http://us.hardware.info/reviews/576...d-with-21-gpus-test-results-1920x1080-full-hd



We find the 780 Ti ahead of the 970, between the 970 and the 980, which is basically where it belongs.

http://www.anandtech.com/show/9306/the-nvidia-geforce-gtx-980-ti-review/6



This result is verified by other sites, like AnandTech here, where the 780 Ti is slightly below the 980. Not a huge gap.

We see this in other games like Rainbow Six, Battlefront, etc.: despite these games running very well on GCN, Kepler keeps up just fine.

So while we can say GCN-optimized engines (modern console ports) can cripple Kepler, it's clear in many games that NV can optimize Kepler if they wish.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Yes, I would agree with your last paragraph completely. I would also add that they do optimise eventually, but not so much that it overshadows Maxwell; from a business point of view it wouldn't shed the best light on Maxwell. But now they are facing backlash from the relatively small group of enthusiasts who are calling them out, most of whom, from what I can see, either went with AMD or have upgraded to Maxwell.

I think they could do more, especially in circumstances that see an AMD 280 outperform a 780.
 
Feb 19, 2009
10,457
10
76
I agree,
but by the time the 290x passes the 980 we will all have new cards (or should) and we won't care because the 980/290 will be over 5 years old.. :thumbsup:

I just want to re-iterate this point.

My non-ref R290X is running at 390X class performance.

I've been playing all the new AAA games and it is performing amazingly, maxing them all out easily (minus GameWorks where applicable).

I bet that in another two years' time it will still be perfectly capable at 1080p and able to handle 1440p minus some minor visual settings.

The next performance leap will come with the next-gen consoles, PS4K/PS5 etc., when the resolution target shifts to 4K from the current 900p or 1080p, so current GPUs will matter a lot longer. That's even more true once there's properly done DX12 giving GCN a nice performance boost.

Not bad for a chip released in 2013.
 