8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has a better 1% low than the 4060 and 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have any others, please let me know and I'll add them to the list. Cheers!
 

MrPickins

Diamond Member
May 24, 2003
9,017
585
126
The 3060 can absolutely do 4K; 30 FPS is absolutely playable.

It's far from ideal, but I played through Hogwarts Legacy on a 980 Ti, and even at the "correct settings" it was waffling around 30 FPS and was very playable.
I was just about to say that.

I much prefer 60fps, but many people don't care and would actually rather do 4k/30 vs 1080p/60. Declaring anything under 60fps as "unplayable" is just silly.
 
Jul 27, 2020
16,820
10,767
106
Regarding DLSSing games to 4K, DLSS isn't supported by every game. More VRAM wins when the game doesn't support DLSS.

Regarding stupid comparisons: a buyer deserves not to be disappointed. A card with low VRAM is sure to disappoint in the rare game that the buyer may WANT to play, especially a buyer with limited means. It is best for such a buyer to go with the highest-VRAM card possible within his budget, instead of relying on some hokum tech that can't be applied globally and needs to be supported by games. That rules out less popular but still very interesting and enjoyable games, which describes the majority of indie games.

And as discussed so many times in this thread before: you fall in love with some game, complete it 100%, and then go looking for mods to squeeze some more fun out of it. And this is when you sadly discover the limits of your card, if you went with 8GB or less.

I, and I'm sure a lot of people here, believe in killing as many birds with one stone as possible. For a vast majority of people, especially in developing countries, the 8GB limit has probably caused countless sighs and saddened many hearts. So I cannot in good faith let anyone buy an 8GB card if they intend to keep it for a long time. As a spare card or something to mess with casually, sure, an 8GB card isn't a bad thing to have.
 
Jul 28, 2023
141
523
96
And as discussed so many times in this thread before: you fall in love with some game, complete it 100%, and then go looking for mods to squeeze some more fun out of it. And this is when you sadly discover the limits of your card, if you went with 8GB or less.
cue "not a correct use-case" argument.

I, and I'm sure a lot of people here, believe in killing as many birds with one stone as possible. For a vast majority of people, especially in developing countries, the 8GB limit has probably caused countless sighs and saddened many hearts. So I cannot in good faith let anyone buy an 8GB card if they intend to keep it for a long time. As a spare card or something to mess with casually, sure, an 8GB card isn't a bad thing to have.
Would someone please spare a thought for Jensen's margins?
 
Jul 27, 2020
16,820
10,767
106
It is if it's $300-$400.
If someone wants to let Jensen afford one more jacket with a spare 8GB card, who am I to keep a gentleman from offering fealty to his hero?

I don't expect Jensen to wake up one day and suddenly decide that 8GB isn't enough. This damn RAM/VRAM amount is going to haunt us for a few more years; it may be five before we never see another 8GB device again. The best we can do is offer sane advice to visitors to these forums and save them from wasting their money.
 
Mar 8, 2024
37
110
66
It is if it's $300-$400. The entire point of the thread. No one is spicy about the RX 6600 or A750 because the price for 8GB is fine.

Can we print this out and send it to psolord? Maybe have it affixed to his monitors? This is the entire ever-loving thing, the whole deal. An A580/750 or RX 6600/6650xt are wonderful cards... Because they provide value for money. Nearly a hundred pages of equivocation, goalpost shifting, and outright detachment from reality is entertaining, but it's hardly edifying.

It's going to be VERY funny when the 5060 launches with 8 gigs at 350 USD next summer... We're going to reach a level of apoplexy usually reserved for American politics!
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,663
21,169
146
Can we print this out and send it to psolord? Maybe have it affixed to his monitors? This is the entire ever-loving thing, the whole deal. An A580/750 or RX 6600/6650xt are wonderful cards... Because they provide value for money. Nearly a hundred pages of equivocation, goalpost shifting, and outright detachment from reality is entertaining, but it's hardly edifying.

It's going to be VERY funny when the 5060 launches with 8 gigs at 350 USD next summer... We're going to reach a level of apoplexy usually reserved for American politics!
We can't let the RX 7600 off the hook. The cheapest model is $250 at Newegg, with the rest costing more. AMD just follows whatever Nvidia does, only a little cheaper, with no leather interior or sunroof.

I am stoked he keeps fighting the battle here though. It is going to be entertaining to revisit in a couple of years. I wish we had a remind me forum bot...
 

nurturedhate

Golden Member
Aug 27, 2011
1,753
721
136
Can we print this out and send it to psolord? Maybe have it affixed to his monitors? This is the entire ever-loving thing, the whole deal. An A580/750 or RX 6600/6650xt are wonderful cards... Because they provide value for money. Nearly a hundred pages of equivocation, goalpost shifting, and outright detachment from reality is entertaining, but it's hardly edifying.

It's going to be VERY funny when the 5060 launches with 8 gigs at 350 USD next summer... We're going to reach a level of apoplexy usually reserved for American politics!
Just to repost it.
 
Mar 8, 2024
37
110
66
We can't let the RX 7600 off the hook. The cheapest model is $250 at Newegg, with the rest costing more. AMD just follows whatever Nvidia does, only a little cheaper, with no leather interior or sunroof.

I am stoked he keeps fighting the battle here though. It is going to be entertaining to revisit in a couple of years. I wish we had a remind me forum bot...

AMD will at least let the MSRP on those come down over time, if they can get the 7600 to present-day 6600 levels and let the XT get down to 250 bucks, they'd be okay-ish values. But at that rate, you'd still probably be able to find a 6750XT for the same price, which would totally invalidate the choice. RDNA 2 is going to really age well for budget gamers, god knows I'm happy as a clam with my 6800.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,947
7,361
136
This has already been going on for a couple years

- It didn't even occur to me that this thread started in 2021 and is 90 pages long (at 100 posts per page).

It feels like it's pretty much constantly at the top of the forum and in the new posts section. The Sisyphean nature of the thread also makes it feel like you're always jumping into page 2 of the discussion...
 

Mopetar

Diamond Member
Jan 31, 2011
7,936
6,239
136
I am stoked he keeps fighting the battle here though. It is going to be entertaining to revisit in a couple of years. I wish we had a remind me forum bot...

I don't think you need one. I'm pretty sure he'll still be posting. This thread would have probably died months ago if he didn't keep bumping it.

If he ever stops I'm going to be half tempted to call in a welfare check to make sure he's still okay. Or living I suppose.
 
Jul 27, 2020
16,820
10,767
106
Ironically, gamers are luckier than professional users. All three vendors' professional GPU prices are nothing short of highway robbery, even for 4GB and 6GB cards. Though AMD's W6800 32GB is within my buying power. Not that I will buy it. OK, I'll buy it if I'm very, very bored. But then, for the same money, I could buy seven 3050 6GB cards.

Decisions, decisions...
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
It is if it's $300-$400. The entire point of the thread. No one is spicy about the RX 6600 or A750 because the price for 8GB is fine.
Can we print this out and send it to psolord? Maybe have it affixed to his monitors? This is the entire ever-loving thing, the whole deal. An A580/750 or RX 6600/6650xt are wonderful cards... Because they provide value for money. Nearly a hundred pages of equivocation, goalpost shifting, and outright detachment from reality is entertaining, but it's hardly edifying.

It's going to be VERY funny when the 5060 launches with 8 gigs at 350 USD next summer... We're going to reach a level of apoplexy usually reserved for American politics!
No need to print it and send it to me. I'd love all cards to be way cheaper, but I am a realist.

What I should print and send to you, however, is that I have told you a hundred times already that I have THREE 8GB cards and they are nothing alike. For me, their value is exactly where it should be.

I have an RX 6600 too. Great little card, especially for the price, I agree. However, this is where we are now. I fired up Everspace 2 to see the new Lumen upgrade. Lo and behold: the game is now unplayable on the RX 6600. This is what it was before, from gamegpu's testing.


This is what happens with the new patch and Lumen enabled.


The game went from 72fps to 46fps on the RX 6600, which means it went from playable to unplayable. The 3060 Ti, however, still manages a playable 1080p experience. The 4060 Ti is 2.13x faster than the RX 6600, while also being faster than the 6800s once again. Are these 8GB cards anything alike? No! So how exactly do you want them to be priced? Do you really want a flat price for all of them? A $30 difference per tier? Would you follow that pricing if you were the manufacturer? Do the rest of the price tiers in your life work like that?

As a side note, also keep in mind that this game's GPU power requirement went up by 56%, while its VRAM requirement increased only 10-15%. Let that sink in and understand what I am talking about when I show you where things are heading. The much-praised RTX 3060 with its 12GB still has frame drops below 60fps. Yeah, VRAM ain't gonna help, even at 1080p.
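For what it's worth, the relative-performance arithmetic in the post above is easy to sanity-check. A minimal sketch, using only the fps figures cited in the post (from gamegpu's Everspace 2 testing, purely for illustration):

```python
# Sanity-check of the relative-performance claims quoted above.
# The fps figures are the ones cited in the post (gamegpu's
# Everspace 2 results); they are illustrative, not re-measured.

def relative_change(before: float, after: float) -> float:
    """Fractional change going from `before` to `after`."""
    return (after - before) / before

rx6600_before, rx6600_after = 72.0, 46.0

drop = relative_change(rx6600_before, rx6600_after)
print(f"RX 6600: {rx6600_before:.0f} -> {rx6600_after:.0f} fps ({drop:+.1%})")
# prints "RX 6600: 72 -> 46 fps (-36.1%)"

# "The 4060 Ti is 2.13x faster than the RX 6600" implies roughly:
implied_4060ti_fps = rx6600_after * 2.13
print(f"Implied 4060 Ti: ~{implied_4060ti_fps:.0f} fps")
# prints "Implied 4060 Ti: ~98 fps"
```

The point being argued is that the fps delta between these cards comes from GPU compute, not VRAM capacity, since both are 8GB parts.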
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
I don't think you need one. I'm pretty sure he'll still be posting. This thread would have probably died months ago if he didn't keep bumping it.

If he ever stops I'm going to be half tempted to call in a welfare check to make sure he's still okay. Or living I suppose.
See? At least you guys know I am alive! xD

This is an interesting thread nonetheless. I said, I'm here for the tech (mostly).
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
The 3060 can absolutely do 4K; 30 FPS is absolutely playable.

It's far from ideal, but I played through Hogwarts Legacy on a 980 Ti, and even at the "correct settings" it was waffling around 30 FPS and was very playable.
Oh wow. When you see lower performance in some test with exorbitant settings on 8GB cards, even when it's within acceptable playability boundaries, you all grab the pitchforks. Now, all of a sudden, 30fps is absolutely playable? In an action game with extreme x+z axis movement? And I am the one doing the goalpost shifting? lol
 
Jul 27, 2020
16,820
10,767
106
I said, I'm here for the tech (mostly).
We are lucky that you stay confined to this thread. If other tech threads got infected with your unique brand of arguing, there could be rampant chaos and dozens or hundreds of unnecessary back and forth posts with people trying their best to help you understand something as basic as long term value of hardware and future proofing. Though I guess you are doing a service for third world country members by letting them know how to enjoy games with limited means. A lot of us here are more interested in moving the needle and progressing forward.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
Oh, and something else, very, very interesting. Gamegpu tested a tech demo on both UE4 and UE5. I mean, who does that? These guys are epic (pun intended).


Here is the result. I will stay at 1080p.



I mean... where do I start?

The RX 6600 gets less than half the performance. The RTX 3060 12GB goes tits up. The 4060 Ti destroys it, as does the 3060 Ti to a lesser degree. Not all 8GB cards are equal, just as not all 12GB cards are equal and not all 16GB cards are equal.

VRAM usage goes up by 1GB, as I told you a long time ago (easy to see with the on/off switch in Karagon). However, the GPU power requirement more than doubles. So are we clear now, or do you need more proof of what I am saying?
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
We are lucky that you stay confined to this thread. If other tech threads got infected with your unique brand of arguing, there could be rampant chaos and dozens or hundreds of unnecessary back and forth posts with people trying their best to help you understand something as basic as long term value of hardware and future proofing. Though I guess you are doing a service for third world country members by letting them know how to enjoy games with limited means. A lot of us here are more interested in moving the needle and progressing forward.
VRAM alone ain't gonna future-proof you, my friend. See the 3060 12GB example above. The sooner people realize that, the better. In older times I would have said nothing; these are new waters we are navigating, though...

Thanks for the vote of confidence! It happens when people have different points of view. Don't cry when Hellblade II comes out.
 

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,196
136
The game went from 72fps to 46fps on the RX 6600, which means it went from playable to unplayable.
No, it doesn't, and what a way to misrepresent the situation. Where are your "correct settings" in this conclusion of yours? Did you tell folks they can still enjoy the game with SSGI instead of Lumen? Did you tell them Lumen is not the default GI setting after the upgrade? I had to enable it manually.

I just had fun with the ES2 update, found a build I liked and took it to Lunacy 2000. For the folks who don't know, it's an ARPG & space combat sim combo, with A LOT of focus on speed and killing enemy ships quickly. Even before the UE5 update the best way to enjoy the game for me was to lock in 120FPS and lower details to High because most of the time you're chasing afterburners. The update increased load on the GPU, but since I was running with headroom to spare (to keep power down), I just kept the old settings, including FSR. I briefly tried XeSS, but did not see the same improvement I found in CP 2077, so reverted to FSR for the energy savings.

Lumen in the game looks great, but there are caveats. I tried enabling it twice: once during space travel and once in a darker environment. On both occasions it messed with color saturation and visibility, so I decided to keep it off while doing rifts and incursions; the game has enough distracting particle effects as it is. After seeing the Lumen demo video from the developer, I'll enable it while I do some exploring to check how it behaves in more environments.
 
Jul 27, 2020
16,820
10,767
106
I mean... where do I start?

The RX 6600 gets less than half the performance. The RTX 3060 12GB goes tits up. The 4060 Ti destroys it, as does the 3060 Ti to a lesser degree.

OMG. What did you smoke before posting that????





How are these three cards in the same class based on pricing, for you to even be comparing them???

The 4060 Ti as usual wins the Turd Award with its pricing.

95% higher price and only 8GB VRAM?

Awesome!

/s
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
OMG. What did you smoke before posting that????


How are these three cards in the same class based on pricing, for you to even be comparing them???

The 4060 Ti as usual wins the Turd Award with its pricing.

95% higher price and only 8GB VRAM?

Awesome!

/s
OMG, that's what I am saying. They are priced according to their performance, not their VRAM, because the latter is mostly irrelevant for their tier.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
No, it doesn't, and what a way to misrepresent the situation. Where are your "correct settings" in this conclusion of yours? Did you tell folks they can still enjoy the game with SSGI instead of Lumen? Did you tell them Lumen is not the default GI setting after the upgrade? I had to enable it manually.

I just had fun with the ES2 update, found a build I liked and took it to Lunacy 2000. For the folks who don't know, it's an ARPG & space combat sim combo, with A LOT of focus on speed and killing enemy ships quickly. Even before the UE5 update the best way to enjoy the game for me was to lock in 120FPS and lower details to High because most of the time you're chasing afterburners. The update increased load on the GPU, but since I was running with headroom to spare (to keep power down), I just kept the old settings, including FSR. I briefly tried XeSS, but did not see the same improvement I found in CP 2077, so reverted to FSR for the energy savings.

Lumen in the game looks great, but there are caveats. I tried enabling it twice: once during space travel and once in a darker environment. On both occasions it messed with color saturation and visibility, so I decided to keep it off while doing rifts and incursions; the game has enough distracting particle effects as it is. After seeing the Lumen demo video from the developer, I'll enable it while I do some exploring to check how it behaves in more environments.
So, in other words, what is the conclusion? That you are going to use correct settings anyway. Isn't that what I have been saying all along, the very thing you keep mocking?

Tell us, please: in order to reach that 120fps, what did those settings do to your VRAM usage, and what did they do to your GPU power?

By the way, these are my three tests on Everspace 2 (non-monetized).


I will dwell mostly on the GTX 1070 vs RTX 3060 Ti comparison. Largely different cards, both with 8GB. The 3060 Ti could do 4K/Epic/DLSS/60; the 1070 only 1080p/High/60. Do you see the difference or do you not? How is the 8GB handicapping the 3060 Ti? It isn't. The card is doing what it was meant to do. Actually way more in this one, since the 3060 Ti is not a 4K/DLSS card; the game was just very light before the Lumen patch.
 