8GB VRAM not enough (and 10 / 12)

Page 74 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:

10GB / 12GB

Reasons why still shipping 8GB since 2014 isn't NV's fault.
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,628
21,077
146


So many of you have made such a compelling case, and addressed all the counterarguments with facts and references, to the point that no one reasonable could fail to acknowledge and accept them.

Paying $270 or more for 8GB in 2024 is a demonstrably bad buy. For those that sell their old cards it will be even worse: as games expose them more and more, the resale value will plummet compared to models with more VRAM.
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
The 6700 XT's low price is only because AMD bought far too many wafers during Peak Crypto. AIBs were (still are?) building new cards from chips that are a year+ old. Not really a sustainable product. That's going to be a problem assuming no new consoles any time soon - who exactly is going to pay for RDNA4,5,6's R&D?
Wrong thread, buddy.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
So many of you have made such a compelling case, and addressed all the counterarguments with facts and references, to the point that no one reasonable could fail to acknowledge and accept them.
And that's the crux of the issue here. Nobody that's watched the first four videos in the OP could possibly keep parroting "these are corner cases where neither card is playable!"

That's a complete lie.

So either the posts are intentional corporate shilling, or a possible case of legal blindness. There's no other rational explanation here.
 

Aapje

Golden Member
Mar 21, 2022
1,425
1,934
106
After all, the chips cost money. Figure any clamshell products that come out were not originally planned. Power consumption is a factor but when they are deciding on the bus width, the cost is a far bigger one.

Even if you completely ignore the reduced prices for RDNA4 and only look at MSRPs, the prices of the new generations have increased substantially above what you can explain with rising costs.

And arguing that it is the RAM-prices that are a major cost factor, at a moment where RAM makers have massive excesses of RAM and prices are extremely low, is frankly ridiculous.
 

jpiniero

Lifer
Oct 1, 2010
14,675
5,300
136
When it comes to Nvidia, I think the best you can hope for with Blackwell is that NV includes a hardware asset decompression engine, which will hopefully greatly reduce the performance hit of bringing in assets when VRAM is under pressure.
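As a rough sketch of why a GPU-side decompression engine would matter here: the win is effectively bus bandwidth when assets have to be re-streamed into a full VRAM pool. The numbers below (PCIe throughput, compression ratio) are illustrative assumptions, not measurements of any real hardware:

```python
# Illustrative only: why decompressing assets on the GPU helps when VRAM
# is under pressure. Both constants are assumed round numbers.

PCIE4_X16_GBPS = 26.0   # assumed realistic PCIe 4.0 x16 throughput, GB/s
RATIO = 2.0             # assumed lossless compression ratio on disk/wire

def stream_time_ms(asset_gb: float, compressed: bool) -> float:
    """Time to move an asset across the bus, in milliseconds.

    If the GPU can decompress, only the compressed bytes cross the bus.
    """
    wire_gb = asset_gb / RATIO if compressed else asset_gb
    return wire_gb / PCIE4_X16_GBPS * 1000.0

# Re-streaming 1 GB of evicted textures:
print(f"raw: {stream_time_ms(1.0, False):.1f} ms")        # → raw: 38.5 ms
print(f"decompressed: {stream_time_ms(1.0, True):.1f} ms")  # → decompressed: 19.2 ms
```

At a ~2:1 ratio the streaming hitch is halved, which is the whole pitch: it doesn't add VRAM, it just makes running out of it hurt less.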
 

psolord

Golden Member
Sep 16, 2009
1,963
1,201
136
Oh wow, so many answers. I will try to answer in due time. Sadly, many of you misunderstand what I am saying. For example, I never disputed whether anyone should buy a 7600 XT 16GB/4060 Ti 16GB over their 8GB counterparts. I only said to get an even better and more expensive card instead, like a 4070 or a 7700 XT. You are putting more weight on VRAM; I am putting more weight on GPU power.

Yes maybe there is a language barrier. Among other things, I do try to practice my English from within this thread.

Also, I don't know how all that 4GB vs 8GB came into the discussion. If you think that, just because 4 vs 8 has the same percentage difference as 8 vs 16, it will result in the same kind of differences, you are mistaken. The reason is very simple: the higher we go with VRAM, the more we hit diminishing returns. Visually, 8GB is a lot of graphics data and the resolutions are fixed. You would be squeezing higher-quality assets onto a 1080p output than you actually need.

To highlight this better, I did a quick test with the notorious TLOU you all like waving around like a flag in this thread.

Test on the 4070ti, 1080p, no upscaler.

1080p straight ultra preset



1080p ultra with textures at high



1080p straight high


1080p high with ultra textures



Do you see the vanity of this thread and why I am talking about correct settings or do you not?

First of all, the visual differences are virtually non-existent. I ran around testing all these settings and could not see much of a difference, except maybe when going from straight ultra to straight high.


Other than that, the findings are:
1080p ultra: 96W, 8.8GB VRAM usage
1080p ultra with high textures: 8.5GB VRAM usage

1080p high: 78W, 6.9GB VRAM usage
1080p high with ultra textures: 80W, 8.5GB VRAM usage

For some strange reason, the VRAM usage did not change much between the first two, but neither did the image quality.

The last two are more telling, really. Using high with ultra textures actually moves the VRAM over 8GB with zero benefit. That's how you make your game unplayable on an 8GB card, for nothing. This is what I call "stupid settings" and diminishing returns. Now add RT on top of that, for games that support it, and you end up with 29fps vs 5fps arguments, for which I am sorry, but I don't care.
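For what it's worth, some back-of-envelope texture math shows why an "ultra" texture tier can blow past 8GB on its own. Everything below (bytes per texel for BC7-style block compression, the resident material count, the mip-chain overhead) is an assumed round number for illustration:

```python
# Rough texture-memory arithmetic. All inputs are assumptions:
# ~1 byte/texel for BC7-class block compression, and a full mip
# chain costing roughly +1/3 on top of the base level.

def texture_mb(res: int, count: int, bytes_per_texel: float = 1.0) -> float:
    """Approximate VRAM in MB for `count` square textures of size res x res."""
    base = res * res * bytes_per_texel   # top mip level, in bytes
    with_mips = base * 4 / 3             # geometric mip series ≈ 4/3 of base
    return with_mips * count / 1024**2

# Say 500 materials are resident: 2K "high" vs 4K "ultra" textures
high = texture_mb(2048, 500)
ultra = texture_mb(4096, 500)
print(f"high ≈ {high/1024:.1f} GB, ultra ≈ {ultra/1024:.1f} GB")
# → high ≈ 2.6 GB, ultra ≈ 10.4 GB
```

Each resolution step quadruples the footprint, which is consistent with one texture slider being the difference between fitting in 8GB and spilling out of it.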
 
Reactions: MoogleW

Ranulf

Platinum Member
Jul 18, 2001
2,382
1,262
136
The last two are more telling, really. Using high with ultra textures actually moves the VRAM over 8GB with zero benefit. That's how you make your game unplayable on an 8GB card, for nothing. This is what I call "stupid settings" and diminishing returns. Now add RT on top of that, for games that support it, and you end up with 29fps vs 5fps arguments, for which I am sorry, but I don't care.

I see and hear your arguments. I just see zero benefit in engaging further. Though I probably will, because I'm a glutton for punishment or bored.

Oh, and the 4070 Ti was an $800 card. For 1080p. Getting only 60fps at any level is just stupid expense or stupid settings. It's been bad enough that I've bought into spending $300-400 every 3-5 years to get top-end 1080p performance. I'm not going up to $800 for a 192-bit bus card with 12GB, especially now that we've had Turing 2.0 happen with the 4070 Ti Super and its 16GB of RAM and 256-bit memory bus.
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
Also, I don't know how all that 4GB vs 8GB came into the discussion. If you think that, just because 4 vs 8 has the same percentage difference as 8 vs 16, it will result in the same kind of differences, you are mistaken. The reason is very simple: the higher we go with VRAM, the more we hit diminishing returns. Visually, 8GB is a lot of graphics data and the resolutions are fixed. You would be squeezing higher-quality assets onto a 1080p output than you actually need.
It came into the discussion because (like I told you months ago): the future exists, and we want the cards we buy today to be able to play games that come out in the next few years, not just the ones that are out today.

The 4vs8GB and 3vs6GB discussions are historical reference points that show the trend in how hardware tends to age. You're far too focused on the here and now.

(Also, I see you've completely ignored my point about texture packs)

I see and hear your arguments. I just see zero benefit in engaging further. Though I probably will, because I'm a glutton for punishment or bored.
Same
 

Ranulf

Platinum Member
Jul 18, 2001
2,382
1,262
136
I am of the opinion that RT is going to become more important soon enough... enough that RDNA2 and earlier won't look great either.

Oh, it's possible and I wouldn't be surprised. Maybe it will be the Unreal 5 engine. Thing is, I don't think that is going to help sell the new cards all that well. Not after the well-poisoning that started in 2018/19 with Turing and has only gotten worse since.
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
I am of the opinion that RT is going to become more important soon enough... enough that RDNA2 and earlier won't look great either.
This argument isn't about NVidia vs AMD per se, though.

Ray tracing adds its own VRAM overhead. It's quite possible that the 8GB RTX cards will hit a VRAM bottleneck before a compute one in those scenarios. Who cares what the competitor is doing when it's your card that's being intentionally limited?
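A rough sketch of that overhead: ray tracing keeps acceleration structures (BVHs) resident in VRAM, and those scale with scene geometry. The bytes-per-triangle figure below is an assumption for illustration; real BLAS sizes vary a lot by vendor, build flags, and compaction:

```python
# Back-of-envelope RT acceleration-structure cost. The per-triangle
# footprint is an assumed average, not a measured figure.

BYTES_PER_TRIANGLE = 64  # assumed average bottom-level BVH cost

def blas_mb(triangles_millions: float) -> float:
    """Approximate BVH VRAM in MB for a scene with the given triangle count."""
    return triangles_millions * 1e6 * BYTES_PER_TRIANGLE / 1024**2

for tris in (5, 20, 50):
    print(f"{tris}M triangles -> ~{blas_mb(tris):.0f} MB of extra VRAM")
# → 5M ≈ 305 MB, 20M ≈ 1221 MB, 50M ≈ 3052 MB
```

Under these assumptions a geometry-heavy scene can eat 1-3GB just for the BVHs, on top of the extra buffers RT effects need, which is exactly how an 8GB card runs out of memory before it runs out of compute.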
 

H T C

Senior member
Nov 7, 2018
561
401
136

[Attachment: Screenshot from 2024-03-09 22-18-53.png]
Jul 27, 2020
16,706
10,703
106
I think we still have some time to go before it's "ray tracing for all".
I think we will soon see a trend of "RT Lite" so that it can run on iGPUs from AMD and Intel and, next year, in WoA games. I really hope that takes off, because if it does, GeForce RT cores will start getting under-utilized. I'm betting most users will not see enough of a difference in full RT to bother with it, or with the performance hit that comes with enabling it, even on a 4090. Let's face it: if RT Lite is 50% faster than full RT, not many gamers will see the point in pushing their card harder, with more heat and more fan noise, just to see cool lighting effects. Then we can finally move on to something else in improving image fidelity. Yes, path tracing is nice to look at if you are playing an adventure title, or maybe an RTS or RPG, but in FPS shooters or fighting games most people prioritize fluid motion over realistic visuals.
 
Reactions: MrPickins

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
Ray tracing adds its own VRAM overhead. It's quite possible that the 8GB RTX cards will hit a VRAM bottleneck before a compute one in those scenarios.
Already happening now in some ray tracing situations where AMD's 12GB/16GB parts are outrunning equivalent NV 8GB parts.

I guess the "correct setting" in these cases is to disable ray tracing, heh.
 
Mar 11, 2004
23,093
5,572
146
I am of the opinion that RT is going to become more important soon enough... enough that RDNA2 and earlier won't look great either.

I don't see that happening soon because of the consoles. Games aren't going to sabotage performance there yet. Even if Sony pushes out a PS5 Pro this year, it's not gonna have a huge install base, and they're gonna have to keep offering OK performance on the current stuff.

Quick side note: apparently Sony is possibly going to do something to make the PSVR2 work on PC. Extra surprising, since that would be one of the main reasons to upgrade to a PS5 Pro, I would think.
 

Majcric

Golden Member
May 3, 2011
1,373
40
91
While 8GB VRAM is a problem for PC enthusiasts in the here and now, do we know for certain that 12 is a problem after game patches have been applied?
 
Jul 27, 2020
16,706
10,703
106
Extra surprising since that would be one of the main reasons to upgrade to a PS5 Pro I would think.
I think it has to do with the Quest's popularity. After releasing quite a few high profile console exclusives on the PC, Sony has now tasted the sweat and blood of PC enthusiasts and it wants more of it. It should hopefully be a huge popularity boost for VR as it should entice developers to port their PS4/PS5 VR experiences to the PC, opening up a new and much larger market for them. I just really, really hope that Sony isn't too greedy and sensibly makes the PSVR2 SteamVR compatible. Otherwise, it won't gain much traction.
 
Reactions: GodisanAtheist

Aapje

Golden Member
Mar 21, 2022
1,425
1,934
106
I don't see that happening soon because of the consoles. Games aren't going to sabotage performance there yet.

They constantly sabotage performance on the consoles by offering games at only 30 FPS while maxing out the visuals as much as possible. So the consoles may actually drive RT, although this would require a new generation.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,628
21,077
146
Rumor mill says PS5 Pro will have RDNA 3 but with RDNA 4 ray tracing tech. Along with their own ML upscaling to help ensure it looks and plays well enough. There are already console games with light RT, no reason to think they won't push that as far as they can next iteration. Of course by the time the PC ports for the Pro are out, we should be well into the next gen of GPUs, if not the one after that. By then there should not be an 8GB card over $200. If there is? Anyone that buys them is part of the problem, whether they know it or not.

It'd be cool to see Intel or AMD with a sub-$250 12GB card next gen. That should drive down prices on the plethora of older 8GB cards on the used market and help cash-strapped gamers get something with a newer RTX 30 or RDNA 2/3 feature set for dirt cheap.
 

psolord

Golden Member
Sep 16, 2009
1,963
1,201
136
It came into the discussion because (like I told you months ago): the future exists, and we want the cards we buy today to be able to play games that come out in the next few years, not just the ones that are out today.

The 4vs8GB and 3vs6GB discussions are historical reference points that show the trend in how hardware tends to age. You're far too focused on the here and now.

(Also, I see you've completely ignored my point about texture packs)
Yes, this is one of the main arcs my arguments are based on: you will not be able to play future games as you like just because of more VRAM.

Along with that 4 vs 8GB jump came a considerable performance uplift from the GPU. My GTX 970 would be just as dead in the water even if it had 8GB. On the other hand, I recognize that my GTX 1070 would not have aged as well if it had 4GB. There is a fine line where the balance between GPU power and VRAM must be maintained, and we are not past it yet for 1080p 8GB cards.

To give you a very recent example from my own testing: Vindictus Defying Fate, one more upcoming, hyper-advanced Unreal Engine 5 game. These are two screenshots, stop-frame benchmarks I call them, from my 3060 Ti and my 1070, where you can see that the difference between the two is 100%: 30fps vs 60fps.

Same settings at ultra, 1080p, upscaler at quality





Same framebuffer size, 100% difference. And yes, this is 1080p WITH an upscaler; the real resolution is 720p. This is where things are going.
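For reference, the 720p figure falls straight out of the usual upscaler quality-mode scale factors. The factors below are the commonly used conventions (1.5x/1.7x/2.0x per axis); the exact values are up to each upscaler:

```python
# Internal render resolution under common upscaler modes.
# Scale factors are assumed conventional values, per axis.

MODES = {"quality": 1 / 1.5, "balanced": 1 / 1.7, "performance": 1 / 2.0}

def internal_res(w: int, h: int, mode: str) -> tuple[int, int]:
    """Output resolution (w, h) -> internal render resolution."""
    s = MODES[mode]
    return round(w * s), round(h * s)

print(internal_res(1920, 1080, "quality"))  # → (1280, 720)
```

So "1080p with upscaler at quality" really means the GPU is rasterizing ~44% of the pixels of native 1080p, which is the point: even at that reduced load, the 1070 only manages half the 3060 Ti's frame rate.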

For people who want to see a proper run with motion, I have these two videos (non-monetized).



Two notes here. Yes, with correct settings even the 1070 can give a playable experience, yet no amount of VRAM could take it close to the 3060 Ti. And the 3060 Ti would not be saved even with 128GB of VRAM; the proof is in the next post.
 

psolord

Golden Member
Sep 16, 2009
1,963
1,201
136
This is Vindictus on the 4070ti, 1080p straight ultra. Yes, 1080p, on the 4070ti.


Thankfully it is playable, but there are some frame drops related to GPU power, not VRAM; there is quite some VRAM to spare on the 4070 Ti. UE5 will require that much GPU power. The 4070 Ti will be a 1080p card before long. Your 12GB and 16GB cards of yesteryear will not survive without correct settings anyway.
 