8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
5,392
8,280
136
As someone actually playing this game, I find this interesting, but it doesn't necessarily reflect my experience. I don't think I tweaked anything when launching the game except to enable HDR, but I will have to circle back. My 6800 is usually at 70 fps+ and I have Chill enabled. Then again, maybe the 3D chip matters here.

I would think benchmarking this game would be really hard without a provided benchmark utility - which maybe I missed? It's a service game and runs vary wildly, and if you are playing solo on the lowest difficulty (i.e., nearly no enemies and no teammates spamming heavy attacks) then that's not representative of how the game is played.

It is interesting that, for a Sony console-focused title, Nvidia seems favored. I am wondering what effects Ultra pushes and how many of them are even present on the PS5. It's likely these are toned down for the ~6700-adjacent GPU in there.

GameGPU is known for having terrible benchmark scenes that don't represent game performance very well. Here is a link to their test sequence for Helldivers 2. Since I don't own the game, I'll let you decide if this one is a good test scene or not.


Probably the worst example I've seen of their terrible test scenes is from Doom (2016). They literally just run around an empty room at the beginning of the game and fire the pistol at nothing a few times. It's a complete joke to use this as a benchmark scene for a fast-paced bullet-storm type of game.


If a game has VRAM or other performance issues, you can pretty much guarantee that GameGPU will be the outlet least likely to show the problem. Case in point: their Plague Tale results. Looking at them, you'd say 8 GB cards have no issues.



Let's check what another reviewer found:



Completely different story. There are, of course, other problems like bad LOD and reduced texture IQ with 8 GB cards in some games, which you would never know about if you only looked at GameGPU. In other words, you can't rely on GameGPU to give you a good idea of actual game performance or IQ.


Edit: @SolidQ pointed out that the GameGPU Plague Tale test was done pre-RT patch. I have posted the GameGPU Plague Tale tests with RT in post #2058, but the results are the same: they fail to show the VRAM issue with 8 GB cards.
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,145
3,086
136
www.teamjuchems.com
Eh, includes a loading screen/level elevator then a planet covered in vegetation. That’s pretty atypical as far as I have seen. I dunno. I wouldn’t take it as gospel.

I need to figure out how to take videos like everyone else. My son and I had a great time last night and I dropped a 500kg bomb in to try to clear the bots between us and our awaiting extraction pelican. I threw the stratagem into the little valley we needed to go through and told my son it would be danger close. We both turned to watch when the bomb took a terrible angle and slammed into the ground about 3 feet in front of us. It took a split second to detonate, then exploded, instantly gibbing my kid and ragdolling me off the hill and into a tree. I was wearing the luck armor - 50% chance to survive a killing blow - which must have hit twice in a row. We were laughing so hard it took a second to realize I wasn't dead, so I stimmed, ran to the pelican, and managed to grab the samples off my old corpse that had died to a fire tornado just behind the pelican. It was pretty hilarious and I only wish I could share it. My son and I got to go on and on about it IRL though, so at least I've got that 😃
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
5,392
8,280
136
Eh, includes a loading screen/level elevator then a planet covered in vegetation. That’s pretty atypical as far as I have seen. I dunno. I wouldn’t take it as gospel.

I need to figure out how to take videos like everyone else. My son and I had a great time last night and I dropped a 500kg bomb in to try to clear the bots between us and our awaiting extraction pelican. I threw the stratagem into the little valley we needed to go through and told my son it would be danger close. We both turned to watch when the bomb took a terrible angle and slammed into the ground about 3 feet in front of us. It took a split second to detonate, then exploded, instantly gibbing my kid and ragdolling me off the hill and into a tree. I was wearing the luck armor - 50% chance to survive a killing blow - which must have hit twice in a row. We were laughing so hard it took a second to realize I wasn't dead, so I stimmed, ran to the pelican, and managed to grab the samples off my old corpse that had died to a fire tornado just behind the pelican. It was pretty hilarious and I only wish I could share it. My son and I got to go on and on about it IRL though, so at least I've got that 😃

Both AMD and NV have recording capability built into their software. It does affect performance a little bit though (maybe like 5-10% worst case), so you would want a separate capture device if you don't want any performance hit.
 
Reactions: Tlh97 and blckgrffn

blckgrffn

Diamond Member
May 1, 2003
9,145
3,086
136
www.teamjuchems.com
Both AMD and NV have recording capability built into their software. It does affect performance a little bit though (maybe like 5-10% worst case), so you would want a separate capture device if you don't want any performance hit.
Yeah, I always scoffed and thought it was for the Twitch crowd, but I've done it a couple of times on Xbox and PS where I can hit "save the last 30 seconds" or whatever, and I am starting to appreciate it more. I'll definitely look into it, thanks. I don't need crazy quality, and I can drop in an SSD or something I've got laying around so as to not hit my main storage. I assume you can just continuously overwrite.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,663
21,169
146
Eh, includes a loading screen/level elevator then a planet covered in vegetation. That’s pretty atypical as far as I have seen. I dunno. I wouldn’t take it as gospel.
To quote an ancient meme - it's good for me to poop on.

Worthless testing. It's endemic to bigger-bar-is-better reviews. I ignore them now. It can take more than 20 minutes for the frame buffer to overflow and textures to stop rendering in Halo Infinite. You'd never know that from reading any major review. It took Aussie Steve doing the equivalent of investigative journalism to suss that out.

I don't know why I seem to be the only one using the ignore feature in this thread, but it seems a shame. It has become an indulgence of masochism at this point.
 

Mopetar

Diamond Member
Jan 31, 2011
7,936
6,239
136
I can see why some sites don't test like that. If you're trying to benchmark every major GPU or CPU plus a few popular ones going back a few generations, and to do this for multiple titles, it cuts into how much time you can spend on each test. Add in the pressure of having results available on release day, and people will try to find ways to save any amount of time they can.

It would be easier if companies released some built-in benchmarks, but at the same time it's easy to see why they wouldn't want to if it just leads to people crying about how unoptimized or broken their game is. Otherwise, longer sessions are difficult to benchmark and suffer from greater variability.
 
Reactions: blckgrffn

Mopetar

Diamond Member
Jan 31, 2011
7,936
6,239
136
You don't even need an AI if the game is perfectly deterministic and you can get a bot to enter precise inputs. It's not that much different from what the TAS community does with speedrunning older games. However, I don't know how well that would work for more modern games.

Also, it still needs a human set of eyes because, as we've seen, some games will degrade texture quality to keep the frame rate up, and that won't show up in a frame-time graph.
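For illustration, here is a minimal sketch of that kind of scripted input replay using Python and pyautogui. The key bindings and timings are invented for the example; a real game would also need an identical starting save or seed for the run to be repeatable.

```python
# Rough sketch of a TAS-style scripted run: replay a fixed input sequence
# with fixed timing so every GPU sees the same workload. Bindings and
# timings here are made up; a deterministic game state is assumed.
import time
import pyautogui  # pip install pyautogui

SCRIPT = [
    ("keyDown", "w", 0.0),      # start walking forward at t=0s
    ("keyUp",   "w", 5.0),      # stop at t=5s
    ("press",   "space", 5.5),  # jump
    ("keyDown", "w", 6.0),
    ("keyUp",   "w", 12.0),
]

start = time.time()
for action, key, at in SCRIPT:
    # Sleep until this input's scheduled offset, then send it.
    time.sleep(max(0.0, at - (time.time() - start)))
    getattr(pyautogui, action)(key)  # e.g. pyautogui.keyDown("w")
```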
 
Jul 27, 2020
16,817
10,764
106
Also, it still needs a human set of eyes because, as we've seen, some games will degrade texture quality to keep the frame rate up, and that won't show up in a frame-time graph.
I could try to solve that problem by testing the fastest available card with the AI bot and taking screenshots at random places. Then I would replay the same path with a lower-VRAM card and take the screenshots again, provided the AI can recognize that it is looking at roughly the same place in the game as the previous screenshot. Then use some sort of image-diff program to highlight quality differences between screenshots taken at the same spot. Maybe not that easy for the typical review site, but an AI researcher could definitely do something like that and then open-source the whole thing on GitHub for the world to use. Just an idea.
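The image-diff half of that idea is the easy part. Here is a minimal sketch using Pillow, assuming you already have two screenshots of the same spot at the same resolution (the filenames are hypothetical).

```python
# Minimal screenshot-diff sketch with Pillow (pip install Pillow).
# Assumes both screenshots show roughly the same spot at the same resolution.
from PIL import Image, ImageChops

ref = Image.open("shot_16gb_card.png").convert("RGB")   # high-VRAM reference card
test = Image.open("shot_8gb_card.png").convert("RGB")   # lower-VRAM card, same spot

diff = ImageChops.difference(ref, test)

# Crude numeric score: mean per-pixel difference (0 = identical frames).
hist = diff.convert("L").histogram()
total = sum(hist)
mean_diff = sum(value * count for value, count in enumerate(hist)) / total
print(f"Mean per-pixel difference: {mean_diff:.2f}")

# Save an amplified diff image so degraded textures stand out visually.
diff.point(lambda p: min(255, p * 8)).save("diff_highlight.png")
```

A perceptual metric like SSIM would be more robust than a raw pixel diff, but even something this crude would flag obvious texture downgrades.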
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,944
7,359
136
What I don't understand is why we don't see more in-depth testing of this sort mid-gen.

I totally get launch day reviews having to be a quick and standardized testing format, but what about when we're 6 months or 1.5 years into a generation?

GPUs drive these sites' visits; you'd think more reviewers would fill dead time with "fun" or more in-depth testing instead of the 100th insipid keyboard review or whatever.

Instead of testing every card under the sun, test theories from your forums, stress test a popular card, pair new cards with older CPUs, whatever. Techspot/HUB kinda does this, and I appreciate it, but I'd love to see more.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,663
21,169
146
What I don't understand is why we don't see more in-depth testing of this sort mid-gen.

I totally get launch day reviews having to be a quick and standardized testing format, but what about when we're 6 months or 1.5 years into a generation?

GPUs drive these sites' visits; you'd think more reviewers would fill dead time with "fun" or more in-depth testing instead of the 100th insipid keyboard review or whatever.

Instead of testing every card under the sun, test theories from your forums, stress test a popular card, pair new cards with older CPUs, whatever. Techspot/HUB kinda does this, and I appreciate it, but I'd love to see more.
You read my mind. I was about to post almost exactly the same thing. Aussie Steve is retesting the 5800X3D in this dead time instead of doing in-depth testing or tackling the high-end Intel CPUs crashing in games. That story has been going on for a month now. He could have been stress testing this whole time to see if the CPU starts flaking out. Just a stressful built-in bench on an infinite loop might be enough to induce the issue being reported.
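Even something as simple as the sketch below would do as a starting point: loop a game's built-in benchmark until a run fails and log each pass. The executable path and the --benchmark flag are placeholders, not any real game's CLI.

```python
# Crude stress loop: relaunch a built-in benchmark until a run fails,
# logging every pass. The path and flag below are placeholders.
import datetime
import subprocess
import time

BENCH_CMD = ["C:/Games/SomeGame/game.exe", "--benchmark"]  # hypothetical CLI

run = 0
while True:
    run += 1
    start = time.time()
    result = subprocess.run(BENCH_CMD)          # blocks until the bench exits
    elapsed = time.time() - start
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open("stress_log.txt", "a") as log:
        log.write(f"{stamp} run {run}: exit={result.returncode}, {elapsed:.0f}s\n")
    if result.returncode != 0:                  # non-zero exit usually means a crash
        print(f"Run {run} failed with exit code {result.returncode}")
        break
```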
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
I don't know why I seem to be the only one using the ignore feature in this thread, but it seems a shame. It has become an indulgence of masochism at this point.
It's like The Three Stooges juggling chainsaws. You know exactly what'll happen but you keep watching anyway.

We've had some absolute comedy gold in this thread:
  • Minimum framerate doesn't matter, only averages do.
  • Texture quality doesn't matter because it's just "the game's LOD system".
  • 8GB is fast approaching "good enough forever".
  • 9GB-15GB usage is exactly the same as 8GB usage, no difference.
  • 16GB cards "can't utilize their data set".
We clearly have an esteemed 3D programming industry legend educating us in this thread. John Carmack has absolutely nothing on this guy.

 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
GameGPU is known for having terrible benchmark scenes that don't represent game performance very well. Here is a link to their test sequence for Helldivers 2. Since I don't own the game, I'll let you decide if this one is a good test scene or not.
Known for terrible benchmark scenes? No. Has it drawn objections from some users (myself included) calling for harder test scenes in some games? Yes. There is a reason for that, however: when you have to cover everything from a 6600 to a 4090, you must find a location that every card can run. After all, that's why they also have 1440p and 4K (and 8K in some cases) graphs.

Case in point: Outpost Infinity Siege. A reminder of what I wrote before:
My friend, most if not all UE5 games have exhibited the same behavior. The demos too. I mean, they are there, nobody is testing them, but still everyone has an opinion. Go play the Brothers: A Tale of Two Sons remake and see how nicely it runs... VRAM has nothing to do with it either.

The reason I am insisting on truly heavy games, like Starfield, Avatar, Alan Wake II, UE5 titles and the like, is that the premise of this thread is that more VRAM will keep you safe. It will not. And again, I know more VRAM is better if you look at it in isolation. What I am questioning is whether it is genuinely necessary relative to a card's TFLOPS-to-VRAM ratio. YES, in some cases it will help. In most it won't.

Anyhoo, new example, my own test (non-monetized), very representative of what happens when you choose stupid settings.

Outpost Infinity Siege is the subject. The freakin' 4070 Ti, at 4K DLSS Balanced with the Ultra preset, drops to 40 fps or lower. VRAM maxes out at 9 GB. No VRAM problem; it's a GPU problem again.


Same run on the 3060 Ti. Same 4K DLSS Balanced, Medium preset with the maxed textures you all seem to like. No VRAM problem again. Performance drops to the 30s though.



OMG I thought this was UE4 but it's UE5! lol That's why my 970 had such a hard time with it! xD

This is what GameGPU found on a very light part of the game.



For clarification, my 4070 Ti above, at 4K/Ultra/DLSS Balanced, runs at 60 fps indoors but drops to the 40s outdoors, as shown in the run I did. That is not shown in the graph above, true, but the point I was trying to make still stands. Look at the 3060 12GB: it is already at 30 fps at 1080p in a light section. Imagine what will happen outdoors. Yeah, VRAM ain't gonna help you.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
It's like The Three Stooges juggling chainsaws. You know exactly what'll happen but you keep watching anyway.

We've had some absolute comedy gold in this thread:
  • Minimum framerate doesn't matter, only averages do.
  • Texture quality doesn't matter because it's just "the game's LOD system".
  • 8GB is fast approaching "good enough forever".
  • 9GB-15GB usage is exactly the same as 8GB usage, no difference.
  • 16GB cards "can't utilize their data set".
We clearly have an esteemed 3D programming industry legend educating us in this thread. John Carmack has absolutely nothing on this guy.

You are picking sentences I used out of context.

For example, I only said averages are MORE important, and that a minimum is only a split-second measurement and as such is useless. Minimums and 1% lows are two different things; I am not sure you are familiar with the difference...

I never said 9-15GB is exactly the same. I only said, and SHOWED, that Capcom's estimates are exaggerated, and that a freakin' GTX 1070 DID run RE4 very nicely even though, according to Capcom, it needed 71% more VRAM.

An even worse example: Resident Evil 3, from four years ago, running on a freakin' Radeon 7950 with 3 GB. Look at what the estimate was suggesting.




The game was requesting 4.14x more VRAM than the 7950 had. Yet it ran adequately, especially for a PS3-era card. This is a DX12 game, for which the 7950 is supposedly a complete no-go. But still...

And yes, those 16 GB on the 7600 XT are completely useless aside from a very few corner cases. Just wait till Hellblade 2 comes out and you will see how much they help.

Try harder.
 

mikeymikec

Lifer
May 19, 2011
17,807
9,800
136
What I don't understand is why we don't see more in-depth testing of this sort mid-gen.

I totally get launch day reviews having to be a quick and standardized testing format, but what about when we're 6 months or 1.5 years into a generation?

GPUs drive these sites' visits; you'd think more reviewers would fill dead time with "fun" or more in-depth testing instead of the 100th insipid keyboard review or whatever.

Instead of testing every card under the sun, test theories from your forums, stress test a popular card, pair new cards with older CPUs, whatever. Techspot/HUB kinda does this, and I appreciate it, but I'd love to see more.

My impression is that tech reviewers are being increasingly sidelined by hardware makers and are having trouble making ends meet, so in those circumstances I can imagine them not particularly wanting to rub the hardware makers up the wrong way by doing a deep dive into the ways that hardware makers are trying to con the end users.

Achieving a scoop that results in significant amounts of egg on hardware makers' faces might provide a short-term boost to a reviewer's readership, but losing easy access to up-to-date hardware is a significant obstacle to turning that short-term boost into a long-term one.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
As someone actually playing this game, I find this interesting, but it doesn't necessarily reflect my experience. I don't think I tweaked anything when launching the game except to enable HDR, but I will have to circle back. My 6800 is usually at 70 fps+ and I have Chill enabled. Then again, maybe the 3D chip matters here.

I would think benchmarking this game would be really hard without a provided benchmark utility - which maybe I missed? It's a service game and runs vary wildly, and if you are playing solo on the lowest difficulty (i.e., nearly no enemies and no teammates spamming heavy attacks) then that's not representative of how the game is played.

It is interesting that, for a Sony console-focused title, Nvidia seems favored. I am wondering what effects Ultra pushes and how many of them are even present on the PS5. It's likely these are toned down for the ~6700-adjacent GPU in there.
Performance can vary widely from location to location; that is common in games. The fact of the matter is that GameGPU tests all configurations in the same location. Sometimes it's a lighter spot, sometimes a heavier one; it happens.

There's always YouTube though. Your 70 fps on the 6800 is on par with the 70-80 fps this guy is showing on the 4060 Ti at 1080p.


GameGPU showed parity between these cards in a lighter location.

The fact of the matter remains: the user above is showing less than 6 GB in use in a game that will be played by millions. These are the examples that interest me, not Deliver Us Mars at 1440p maxed + RT, as has been shown before.

For me, and within the confines of this thread, it's a matter of playability versus unplayability, and at what cost in image degradation. So far I am not impressed.
 