8GB VRAM not enough (and 10 / 12)

Page 72 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

BFG10K

Lifer
Aug 14, 2000
22,709
2,979
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060Ti 8GB.
Resident Evil Village: the 3060Ti/3070 tank at 4K and are slower than the 3060/6700XT when ray tracing:
Company Of Heroes: the 3060 has a higher minimum than the 3070Ti:

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

poke01

Golden Member
Mar 8, 2022
1,202
1,390
106
After seeing Alan Wake 2 running on GTX 10 series cards, I am now 100% sure that devs don't optimise any games at launch. There is so much optimisation left on the table. VRAM is important, and even more so with texture sizes increasing, but where is the limit?

I can't be the only one who doesn't want 200GB AAA games in the future; that's not sustainable for anyone. Games are meant to be accessible to everyone, and it's up to us to tell corps that we want GPUs that can withstand not just today's games but also games 5-6 years from now.

So to sum up: textures have an effect on VRAM, optimisation is key, and hopefully the cost to play games won't increase too much in the coming years.
 

psolord

Platinum Member
Sep 16, 2009
2,002
1,223
136
Obviously HWUB were running "the wrong settings" for 4GB. They should've dropped to 480p to prove 8GB isn't needed, yo!
Really? We are now posting 4GB vs 8GB videos in 2024 to prove how things are for 8GB vs everything else? And for the freaking 6500XT 8GB no less, which costs like $20 less than the RX 6600? Why don't they show us the RX 6600 vs the 6500XT 8GB to see how that goes, hmmm?

Yeah I had 4GBs 9.5 years ago on my GTX 970, thank you very much. Less than 4GBs really, xD.

As luck would have it, I am testing my GTX 970 as we speak, since I took it off the shelf and installed it in the work PC.


It still runs many games fine and I can assure you it is mostly GPU power limited. Even if it had 8GBs nothing would change. When I say fine, I mean within the scope of a 9.5yo card.

For example, from the above link: The Finals runs great, Lies of P the same, Last Epoch at Very High is absolutely playable and can hit a steady 60 at High, and Palworld, played by millions, is also quite good. It can even run some UE5 games nicely, like Tekken 8 and Jusant (not yet uploaded). Baldur's Gate is also a 60fps experience, yes, with correct settings; not yet uploaded either. Anyhoo, I have too much testing left. I left my poor 970 unattended for too long.

What matters to me is that even this 9.5-year-old card, with its handicapped 3.5GB of VRAM, can still run PS5 exclusives like Ratchet and TLOU, while it still has access to a multitude of games that run even better.

Ratchet here, with correct settings for the 970.

In TLOU Part I, although I used the low preset, I upped the textures to medium (which you like so much) and nothing changed performance-wise. Why? Because you need the required GPU power to go with the extra VRAM.

FSR3+FG is coming and I will test this too. This should be fun.
 

jpiniero

Lifer
Oct 1, 2010
14,805
5,428
136
After seeing Alan Wake 2 running on GTX 10 series cards, I am now 100% sure that devs don't optimise any games at launch. There is so much optimisation left on the table. VRAM is important, and even more so with texture sizes increasing, but where is the limit?

AAA games are pretty much designed around the consoles... and then shoehorned to work on PC. It works because consoles remain static over the generation and you can also spend more/take on much higher power consumption on top of that if you want.

Feature parity is important though. The 10 series is 4 years older than the current gen consoles. I imagine the VRAM problems would be much less of an issue if GPUs had asset hardware decompression and more games used DirectStorage.
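To put rough numbers on that point (purely illustrative; the drive speed and compression ratio below are assumptions, not benchmarks), GPU-side decompression effectively multiplies streaming bandwidth, which is what would let engines keep a smaller resident set in VRAM:

```python
# Illustrative sketch: seconds to stream a texture working set into VRAM,
# reading compressed data from disk and decompressing on the GPU.
# All figures here are assumed round numbers, not measurements.

def refill_time_s(working_set_gb, drive_gb_s, compression_ratio):
    """Time to stream `working_set_gb` of assets, stored compressed on disk."""
    return working_set_gb / (drive_gb_s * compression_ratio)

# e.g. a 6 GB texture working set on a 7 GB/s NVMe drive:
print(refill_time_s(6, 7, 1.0))  # no compression benefit: ~0.86 s
print(refill_time_s(6, 7, 2.0))  # 2:1 GDeflate-style ratio: ~0.43 s
```

The faster a card can re-stream assets on demand, the less it has to keep cached "just in case", which is one way hardware decompression would ease VRAM pressure.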
 

Ranulf

Platinum Member
Jul 18, 2001
2,400
1,289
136
Could you elaborate on that further?

Observing how things have shaken out, I don't think 8GB is going to hold back games anymore. I do think owners will increasingly miss out on many of the day-one experiences. Example: publishers will release unoptimized system-stranglers like The Last of Us; then, as the months go by and they get their ROI, the games will get the attention necessary to run on just about anything within reason.

Just that it is more obvious to more people that VRAM is a problem in the gaming scene. For the reason you mention, the day-one experience being what it is, many should wait for patches and price discounts. That's already a good idea given the quality issues of many games beyond hardware performance.
 

Ranulf

Platinum Member
Jul 18, 2001
2,400
1,289
136
Obviously HWUB were running "the wrong settings" for 4GB. They should've dropped to 480p to prove 8GB isn't needed, yo!


My only gripe so far with this video is Steve's notion that the first 8GB mainstream card was the 6500 from AMD. I guess that depends on what "mainstream" is. The RX 480/580 had 8GB options for $240-275 (reference to custom) in 2016-2017. So did the 470 and 570 models. I kind of regret not getting a 570 8GB now, but it's not that big of a deal for a card that was mostly fine for 3+ years at 1080p for $150 in late 2018. Of course the fire-sale 580s that were cheaper were even better in early 2019.

Edit: Ok, later on he talks about recommending the 8GB cards in 2017. My bad.

2nd edit: Tech testers need to be specific about Total War: Warhammer 3 and what exactly they test. Are they just doing the campaign map, the battles, or some of both? I think here Steve is doing just the campaign map.
 

MrPickins

Diamond Member
May 24, 2003
9,019
586
126
Really? We are now posting 4GB vs 8GB videos in 2024 to prove how things are for 8GB vs everything else? And for the freaking 6500XT 8GB no less, which costs like $20 less than the RX 6600? Why don't they show us the RX 6600 vs the 6500XT 8GB to see how that goes, hmmm?
The point of the test was to eliminate other variables to show what happens when only the VRAM size is different. How else would it be an apples to apples comparison?

I feel like you're losing sight of the debate here...
 

psolord

Platinum Member
Sep 16, 2009
2,002
1,223
136
The point of the test was to eliminate other variables to show what happens when only the VRAM size is different. How else would it be an apples to apples comparison?

I feel like you're losing sight of the debate here...
The point of the test was to show what happens between 4GB and 8GB... in today's games. This does not apply to 8GB vs 12GB, or more.

My whole argument in this thread is not whether more VRAM is arbitrarily better. With no other variables, of course it's better.

My argument lies in the GPU-power-to-VRAM ratio, and also in how protected people with higher-VRAM cards are against an unwanted forced upgrade. As the latest hyper-heavy AAA games have shown, they are not protected at all. So this whole thread is moot.

People that bought into $499-599 level cards WILL have to upgrade anyway if they want to sustain that level of performance for current-gen games. VRAM ain't going to save them in 99% of cases.

For example, the Radeon VII came out the same year as the 2080 Super and it had DOUBLE the VRAM. Well, good luck playing Alan Wake 2 with no mesh shaders and no RT though.
 

Timorous

Golden Member
Oct 27, 2008
1,723
3,124
136
The point of the test was to show what happens between 4GB and 8GB... in today's games. This does not apply to 8GB vs 12GB, or more.

My whole argument in this thread is not whether more VRAM is arbitrarily better. With no other variables, of course it's better.

My argument lies in the GPU-power-to-VRAM ratio, and also in how protected people with higher-VRAM cards are against an unwanted forced upgrade. As the latest hyper-heavy AAA games have shown, they are not protected at all. So this whole thread is moot.

People that bought into $499-599 level cards WILL have to upgrade anyway if they want to sustain that level of performance for current-gen games. VRAM ain't going to save them in 99% of cases.

For example, the Radeon VII came out the same year as the 2080 Super and it had DOUBLE the VRAM. Well, good luck playing Alan Wake 2 with no mesh shaders and no RT though.

It is easier to tune performance when you have limited compute than it is when you are limited by VRAM.



Look, a 4060Ti 16GB can actually manage a 1080p path-traced Alan Wake 2 30fps experience, and given the chart shows minimums, it will give you a smooth if lower frame rate experience. The 8GB version is a stuttering mess with a 5.4 fps 1% low. Look at the $600 3070Ti getting absolutely wrecked. OTOH, the 12GB 3060 and 11GB 2080Ti show that the reason for the poor 3070Ti showing is not a lack of compute but entirely NV's decision to skimp on VRAM.


Plague Tale: here the 16GB 4060Ti can offer a decent 60fps experience with RT on, while the 8GB version gives a worse experience than the 7600XT due to those awful 1% lows.

Look at the 1% lows on the 8GB cards, pretty terrible and this is from the 7600XT review so a pretty recent patch of the game. Here the 3060 12GB is offering a better gaming experience than the 3070, 3060Ti and 4060Ti 8GB. If you were to tune the cards to hit 60FPS I know for a fact the 3060 would offer better IQ at 60 fps 1440p than the 8GB cards could offer.

The entire argument basically boils down to 2 things. The first is that pairing a high-compute card with a low amount of VRAM is just as pointless as pairing a low-compute card with a lot of VRAM. The second is that if you want a card to last a console generation, something akin to the RX 480/580 8GB and 1060 6GB, then you really need a GPU that has more compute performance and 12-16GB of VRAM. The 4060Ti 16GB ticks that box, and while the raster performance delta between it and the PS5 is not as great as the delta between the PS4 and the 3 amigos, the RT delta is huge, so a bit of swings and roundabouts. It also costs a bit too much at $440, but it is the cheapest option available right now with a decent combination of attributes.
 

John Carmack

Member
Sep 10, 2016
156
248
116
So we don't need mesh shaders anymore?
Remember when people were dropping their jaws at the system requirements for this game, and one of the talking heads at Digital Founders (who were slobbering over this game) brushed off the criticism with something along the lines of "demanding requirements doesn't mean the game is poorly optimized"?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,770
21,475
146
Remember when people were dropping their jaws at the system requirements for this game, and one of the talking heads at Digital Founders (who were slobbering over this game) brushed off the criticism with something along the lines of "demanding requirements doesn't mean the game is poorly optimized"?
That's probably Alex; when he is making the content or contributing to a conversation, it stops being Digital Foundry and starts being Digital Founder's Edition. Alan Wake 2 is another Nvidia showcase game, and just like Cyberpunk they come out of the gate as card crushers, making only the expensive RTX cards seem worthwhile.

Horizon Forbidden West is hopefully pointing to what is going to be the trend with many new releases. That is, they are going to scale well, thanks to handheld performance targets.
 

Aapje

Golden Member
Mar 21, 2022
1,464
2,028
106
The point of the test was to show what happens between 4GB and 8GB... in today's games. This does not apply to 8GB vs 12GB, or more.

Yes and no. Of course, measuring the performance of 4 vs 8 GB today is not going to tell you what you will get from 8 vs 12 GB today.

However, as I've tried explaining to you several times now, people don't just care about playing current and old games; they also buy GPUs for future games. So they need to make a guess about the future.

Typically, assuming that patterns from the past will continue into the future is one of the better ways to predict it, and also one of the easiest. So, assuming that VRAM demands keep going up, which is a simple extrapolation of the past, it makes sense to expect that the same kinds of problems we very often see with 4 vs 8 GB, sometimes with 8 vs 12 GB, and rarely with 12 vs 16 GB will all become more common over time as the average VRAM demand of newer games keeps rising.

And as others have argued above, high VRAM with a bit lower compute simply seems to age more gracefully than lower VRAM with a bit higher compute.
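For what it's worth, the extrapolation being described is trivial to sketch. The yearly figures below are made-up illustrative numbers, not measurements; only the method (fit a line to past demand, project it forward) is the point:

```python
# Minimal sketch of the extrapolation argument: fit a trend line to
# (hypothetical) VRAM-usage figures and project it forward.

def fit_line(points):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# (year, typical high-settings VRAM use in GB) -- assumed values for illustration
usage = [(2016, 4), (2018, 5), (2020, 6), (2022, 8), (2024, 10)]
a, b = fit_line(usage)
projected_2027 = a + b * 2027
print(f"trend: +{b:.2f} GB/year, projected 2027 demand ~{projected_2027:.1f} GB")
```

Swap in real per-game measurements and the same two-line fit gives you a defensible guess at what a card bought today will face in a few years.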
 

psolord

Platinum Member
Sep 16, 2009
2,002
1,223
136
It is easier to tune performance when you have limited compute than it is when you are limited by VRAM.
As I have said many times, I have THREE 8GB cards and they are nothing alike. There is no way in hell you can fine-tune a game equally between a GTX 1070, an RX 6600 and an RTX 3060Ti. No way. You will go WAY lower on the RX 6600 and the GTX 1070 compared to the RTX 3060Ti.

Look, a 4060Ti 16GB can actually manage a 1080p path-traced Alan Wake 2 30fps experience, and given the chart shows minimums, it will give you a smooth if lower frame rate experience. The 8GB version is a stuttering mess with a 5.4 fps 1%
You are giving me PATH TRACED results in the range way below 60fps, below 30fps even, to show how much better 29fps is compared to 5fps? Well, yes, if we are comparing straight numbers it is, but it's still unplayable anyway.

On the very same TPU review you found your screenshot, there are perfectly good playable results, WITHOUT ray tracing or path tracing. Why are you CHOOSING to ignore them?



This is the mentality that is wrong with this thread. You are going LOOKING for trouble, you find unplayable results and you are like "see, 8GB are not enough".

End result, 4060ti 8GB and 16GB, same PLAYABLE result, RTX 3070 same playable result as the 16GB RX6800, 3060ti same result as the 11GB 2080ti.

Speaking of the 3060ti, I showed you before on my own 3060ti, how Alan Wake 2 is very much playable, if the user is not a buffoon.


Super playable. Why? Because I went looking for solutions and not for trouble.


Even if you take the 1440p results, arguing about the 3070's price, both the 3070 and the 6800 are unplayable.




Guys if you are going to throw benchmarks at me, at least try harder, especially if there are playable results on that same freagin review. I wasn't born yesterday. Your unplayable results do not stick.


Plague Tale: here the 16GB 4060Ti can offer a decent 60fps experience with RT on, while the 8GB version gives a worse experience than the 7600XT due to those awful 1% lows.
If the argument here is 4060Ti 8GB vs 4060Ti 16GB on these specific settings, I can give it to you. However, if I were in the market for a graphics card, I would still look at the much faster 4070 with only 12GB, which as you can see destroys the 4060Ti 16GB for only $100 more.

However, once again, aside from the RT results, there are also non RT results, where the difference goes away. I mean THERE IS a solution for the cheaper card, which you choose to ignore once again.




Look at the 1% lows on the 8GB cards, pretty terrible and this is from the 7600XT review so a pretty recent patch of the game. Here the 3060 12GB is offering a better gaming experience than the 3070, 3060Ti and 4060Ti 8GB. If you were to tune the cards to hit 60FPS I know for a fact the 3060 would offer better IQ at 60 fps 1440p than the 8GB cards could offer.
Ok, The Last of Us again. Yeah, that one belongs to the 1% while being far from top-notch graphical fidelity.

"Better" gaming experience for the 3060, while it still being downright BAD, is a non argument once again. You are again using a result, on a resolution that is not meant for these cards, with settings that are not meant for these cards and then you put on the pikachu surprised face. That's not how things work.

Yes the 4060ti 16GB also wins on the 1080p results, that's why I accept this one as a win for your vram arguments and it's one of the VERY FEW.



My counter argument here, will be once again, one of personal testing. That notorious game, on a mere rx6600.


Is it unplayable? No. Is it butt ugly? No. Do you get your business done with a 165-euro 8GB graphics card? Yes. Are you bitching for nothing on this thread by cranking everything over 9000 and then complaining? Sure.

Btw, there are 12 games in that 7600XT review and the 4060Ti works fine in all of them with proper settings, you know, as that card was meant to be used...


The entire argument basically boils down to 2 things. The first is that pairing a high-compute card with a low amount of VRAM is just as pointless as pairing a low-compute card with a lot of VRAM.
Depends on the thresholds. A 4090 with 12GB would be far more usable than an RX 6600 with 16GB.

I mean it's right there, from your own carefully selected screenshots of the 7600XT review. Just see where the 4070 12GB is and where the 7600XT 16GB is....


The second is that if you want a card to last a console generation, something akin to the RX 480/580 8GB and 1060 6GB, then you really need a GPU that has more compute performance and 12-16GB of VRAM. The 4060Ti 16GB ticks that box, and while the raster performance delta between it and the PS5 is not as great as the delta between the PS4 and the 3 amigos, the RT delta is huge, so a bit of swings and roundabouts. It also costs a bit too much at $440, but it is the cheapest option available right now with a decent combination of attributes.
Oh no, you don't get to bring the console generation into PC vs PC arguments, because they are totally different things. The Plague Tale: Requiem you posted above uses the medium preset for its 1080p/60fps performance mode. UE5 games typically run at 720p upscaled with FSR2, the most epic recipe for disaster. Avatar: also 720p. Alan Wake 2: also 720p. Baldur's Gate 3 was the only one with a straight 1080p/60 Ultra preset. And you can do all that with an RX 6600.

The ONLY games that have some problems on PC are *some* of the ports from the PS-exclusive space, and that is outright down to lazy programming on the PC side. Especially TLOU. I mean, you did see what they did to the medium-preset textures in the launch version, right? Who does that?

For console parity you need at most 1/2 of the VRAM the console has, for the vast majority of games. My 3060Ti can do UE5, Alan Wake 2, Avatar, Starfield, Plague Tale wwwwaaaaayyy better than the consoles can. For TLOU I cannot vouch for parity, but the RX 6600 example above can vouch for good enough.
 

psolord

Platinum Member
Sep 16, 2009
2,002
1,223
136
Yes and no. Of course, measuring the performance of 4 vs 8 GB today is not going to tell you what you will get from 8 vs 12 GB today.

However, as I've tried explaining to you several times now, people don't just care about playing current and old games; they also buy GPUs for future games. So they need to make a guess about the future.

Typically, assuming that patterns from the past will continue into the future is one of the better ways to predict it, and also one of the easiest. So, assuming that VRAM demands keep going up, which is a simple extrapolation of the past, it makes sense to expect that the same kinds of problems we very often see with 4 vs 8 GB, sometimes with 8 vs 12 GB, and rarely with 12 vs 16 GB will all become more common over time as the average VRAM demand of newer games keeps rising.

And as others have argued above, high VRAM with a bit lower compute simply seems to age more gracefully than lower VRAM with a bit higher compute.
With UE5 in our midst, good luck making good guesses about the future.

VRAM requirements will go higher, that's a given. What I am also trying to explain from my side is that GPU requirements already go WAY higher than VRAM requirements, and you choose to ignore that.

I mean, I have up-close and personal experience with my own three 8GB cards, and I can assure you, VRAM is the least of their troubles.

The fact of the matter is that even higher-VRAM cards of the past WILL need to be upgraded anyway, sooner rather than later. Actually, with the way things are going, the real winner will be the one with the best upscaler.
 
Jul 27, 2020
17,479
11,269
106
You are giving me PATH TRACED results in the range way below 60fps, below 30fps even, to show how much better 29fps is compared to 5fps? Well, yes, if we are comparing straight numbers it is, but it's still unplayable anyway.
29 fps is not unplayable. A GPU whose advertised feature (ray tracing) totally dives into the gutter due to lack of VRAM is a waste of money, a waste of a good GPU die, destined for the landfill way earlier than the 16GB variant, and all you can think of is "settings"? How about Jensen stop advertising that 8GB POS with the RTX feature? How about putting a disclaimer on the box saying "1080p Medium settings only"?

Yes the 4060ti 16GB also wins on the 1080p results, that's why I accept this one as a win for your vram arguments and it's one of the VERY FEW.
FEW is not zero. The 8GB variant for the 4060 should not exist. Period. They can put out a 4050 6GB and a 4050 Ti 8GB if they are so "concerned" about giving Ada to the masses. Putting 8GB on the 4060 with the excuse of making it "affordable" is a pathetic joke and they know it. They probably get a good laugh out of it every time they discuss it in their internal meetings. "We sure fooled 'em, didn't we???? MUAHAHAHAHAHA!!!!". I swear Jensen has North Korean genes.
 

Timorous

Golden Member
Oct 27, 2008
1,723
3,124
136
As I have said many times, I have THREE 8GB cards and they are nothing alike. There is no way in hell you can fine-tune a game equally between a GTX 1070, an RX 6600 and an RTX 3060Ti. No way. You will go WAY lower on the RX 6600 and the GTX 1070 compared to the RTX 3060Ti.

That is not the argument being made. The argument is that some cards were under provisioned with VRAM so will see unnecessary performance / IQ issues before other products with similar and sometimes less compute will.

The fact a 3060 12GB can offer a better gaming experience than the 3070Ti in certain edge cases right now is direct proof and in the future it won't just be edge cases.

The 7600XT is another example where the VRAM buffer is going to give it far better legs than the 4060 and probably 4060Ti 8GB.

You are giving me PATH TRACED results in the range way below 60fps, below 30fps even, to show how much better 29fps is compared to 5fps? Well, yes, if we are comparing straight numbers it is, but it's still unplayable anyway.

On the very same TPU review you found your screenshot, there are perfectly good playable results, WITHOUT ray tracing or path tracing. Why are you CHOOSING to ignore them?

I am not ignoring them. A stable 30 fps is pretty playable in slow-paced games, which AW2 is. Obviously you can turn settings down if you want more FPS, and that may help with a VRAM bottleneck at the moment, but it won't be long before even low settings exceed the 8GB VRAM buffer, which is where people will really see an issue. If they upgrade frequently then it probably won't happen before they upgrade, but if they hold onto cards and treat a PC in a similar vein to a console, wanting a similar lifespan, skimping on VRAM is going to bite them.



This is the mentality that is wrong with this thread. You are going LOOKING for trouble, you find unplayable results and you are like "see, 8GB are not enough".

End result, 4060ti 8GB and 16GB, same PLAYABLE result, RTX 3070 same playable result as the 16GB RX6800, 3060ti same result as the 11GB 2080ti.

Speaking of the 3060ti, I showed you before on my own 3060ti, how Alan Wake 2 is very much playable, if the user is not a buffoon.


Super playable. Why? Because I went looking for solutions and not for trouble.

Your solutions are to turn down settings, not exactly magical. The point is as cards age those with more VRAM age more gracefully, it has been seen plenty of times over the history of dGPUs so to deny this basic fact is just utterly insane.

Even if you take the 1440p results, arguing about the 3070's price, both the 3070 and the 6800 are unplayable.




Guys if you are going to throw benchmarks at me, at least try harder, especially if there are playable results on that same freagin review. I wasn't born yesterday. Your unplayable results do not stick.

The 'solutions' are to turn down high impact settings like textures. That is the only solution to making games work on limited VRAM models and that setting cares zero about the compute performance of the GPU. If you don't turn down those settings you either end up with a stuttery mess when you exceed the VRAM limit or you have texture swapping which just looks awful.

Another factor is the trend.

PCGH have a nice RT index and in 2023 the 4060Ti 16GB was just 6% ahead of the 8GB variant yet with the 2024 index the 16GB is now 14% ahead and it is only going to increase.

If the argument here is 4060Ti 8GB vs 4060Ti 16GB on these specific settings, I can give it to you. However, if I were in the market for a graphics card, I would still look at the much faster 4070 with only 12GB, which as you can see destroys the 4060Ti 16GB for only $100 more.

However, once again, aside from the RT results, there are also non RT results, where the difference goes away. I mean THERE IS a solution for the cheaper card, which you choose to ignore once again.


Sure if you have the 25% more money to spend on a 4070 over the 4060ti 16GB it is a better card and it does have enough VRAM to have a good life span. The issue is people could have had something similar 3 years ago if the 3070 or 3070Ti had a larger VRAM buffer, which is probably why NV only offered them with 8GB.

Ok, The Last of Us again. Yeah, that one belongs to the 1% while being far from top-notch graphical fidelity.

"Better" gaming experience for the 3060, while still being downright BAD, is a non-argument once again. You are again using a result at a resolution that is not meant for these cards, with settings that are not meant for these cards, and then you put on the surprised-Pikachu face. That's not how things work.

Yes the 4060ti 16GB also wins on the 1080p results, that's why I accept this one as a win for your vram arguments and it's one of the VERY FEW.


My counter argument here, will be once again, one of personal testing. That notorious game, on a mere rx6600.


Is it unplayable? No. Is it butt ugly? No. Do you get your business done with a 165-euro 8GB graphics card? Yes. Are you bitching for nothing on this thread by cranking everything over 9000 and then complaining? Sure.

Your counter argument is again turning down settings that cards with a similar level of compute and more VRAM don't need to turn down. It is not an argument, it is an admission of failure.

Btw, there are 12 games in that 7600XT review and the 4060Ti works fine in all of them with proper settings, you know, as that card was meant to be used...


Proper settings are whatever the user is happy with. The argument, again, is that someone who is holding onto a part for a long time and is on a limited budget is better off ensuring the card they buy has enough VRAM. That could be by buying up the stack and getting a 4070 or something, or by buying a 7600XT / 4060Ti 16GB, depending on how much they can afford / are willing to spend.

Depends on the thresholds. A 4090 with 12GB would be far more usable than an RX 6600 with 16GB.

That is an extreme delta and even then the 12GB 4090 would probably be pretty rubbish at 4K + RT which is one of the main reasons to own a 4090.

I mean it's right there, from your own carefully selected screenshots of the 7600XT review. Just see where the 4070 12GB is and where the 7600XT 16GB is....

The 4070 is also about $200 more than the 7600XT, so you are not really talking about things in the same price range. The 4060 / 4060Ti 8GB are far more reasonable comparisons, but you don't seem to like reasonable comparisons, do you?


Oh no, you don't get to bring the console generation into PC vs PC arguments, because they are totally different things. The Plague Tale: Requiem you posted above uses the medium preset for its 1080p/60fps performance mode. UE5 games typically run at 720p upscaled with FSR2, the most epic recipe for disaster. Avatar: also 720p. Alan Wake 2: also 720p. Baldur's Gate 3 was the only one with a straight 1080p/60 Ultra preset. And you can do all that with an RX 6600.

The ONLY games that have some problems on PC are *some* of the ports from the PS-exclusive space, and that is outright down to lazy programming on the PC side. Especially TLOU. I mean, you did see what they did to the medium-preset textures in the launch version, right? Who does that?

For console parity you need at most 1/2 of the VRAM the console has, for the vast majority of games. My 3060Ti can do UE5, Alan Wake 2, Avatar, Starfield, Plague Tale wwwwaaaaayyy better than the consoles can. For TLOU I cannot vouch for parity, but the RX 6600 example above can vouch for good enough.

Consoles matter because they set the bar for what devs target. That happens not from console launch to console launch but from end of cross generation phase to end of cross generation phase. We have pretty much only just come out of the cross generation phase of development so the ramp is going to happen from now until the end of the cross gen phase for the PS6 generation which is probably 7/8 years away.

This exact argument was had with the PS4 generation, and when the 8GB RX 480 was released a lot of people said it wasn't worth the extra over the 4GB version. Sure, if you upgrade every 2-3 years that was probably correct if you purchased in 2016, but by 2018/2019 those 4GB variants were really showing their shortcomings. We are at that stage now, with a few examples to show it will become an issue. In 2-3 years it will be a lot of examples, and those with the 7600XT or the 4060Ti 16GB are going to be having a far better time with their hardware than those with the 7600 / 4060 / 4060Ti 8GB / 3070Ti.

The half-the-console-RAM rule was pretty valid when consoles had split memory pools, but now that they have unified memory pools you need to look at how much is allocated to VRAM. 12GB is about the minimum, and for a weaker compute card 16GB is probably required. This is because the consoles have dedicated decompression hardware. For GPUs like the 7600XT, where DirectStorage decompression is done by the shaders, they don't really have the headroom to do that and render as well as a console does, so those weaker cards having more VRAM means less decompression to do on the fly, and more of the compute resources can go to rendering. With the 4070 there is plenty of compute headroom, so doing more decompression on the fly is not really going to cause as much pain.
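As a back-of-envelope for the unified-pool point (the OS reservation and the GPU/CPU split below are commonly cited ballpark assumptions, not official figures):

```python
# Rough split of a console's unified memory pool. All three constants
# are assumptions for illustration: 16 GB total, ~2.5 GB reserved for
# the OS, and a 60/40 split of the remainder between GPU-style data
# (textures, buffers) and CPU-style game data.

TOTAL_GB = 16.0
OS_RESERVED_GB = 2.5   # assumption
GPU_SHARE = 0.6        # assumption

available = TOTAL_GB - OS_RESERVED_GB
gpu_like = available * GPU_SHARE
cpu_like = available - gpu_like
print(f"available to game: {available:.1f} GB "
      f"(~{gpu_like:.1f} GB GPU-side, ~{cpu_like:.1f} GB CPU-side)")
```

Under those assumed numbers, the GPU-side slice alone already lands above 8GB, which is why "how much of the pool is acting as VRAM" is the figure to watch rather than the pool's headline size.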

See Ratchet & Clank: Rift Apart.



Super playable settings here for the 4060 Ti 16GB and the 7600 XT, but not so much for the 7600.



These 1% lows highlight it even better. The 7600 has drops of 25 FPS from a middling 46 FPS average. The 7600 XT has drops of 10 FPS from a really nice 82 FPS average and easily clears the 60 FPS barrier.
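For readers unfamiliar with the metric, "1% low" is typically derived from a per-frame-time capture: take the slowest 1% of frames and convert their average frame time back to FPS. A minimal sketch (the function name and sample data are my own, not from any review tool):

```python
# Sketch of how a "1% low" FPS figure is commonly computed
# from a capture of per-frame times in milliseconds.

def one_percent_low(frame_times_ms):
    """Average FPS over the slowest 1% of frames in a capture."""
    worst = sorted(frame_times_ms, reverse=True)   # longest frames first
    n = max(1, len(worst) // 100)                  # slowest 1%, at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly smooth 16.7 ms frames (~60 FPS) with ten 50 ms stutters:
capture = [16.7] * 990 + [50.0] * 10
print(round(one_percent_low(capture), 1))  # → 20.0: the stutters dominate
```

This is why 1% lows expose VRAM shortages that averages hide: a handful of long stalls while assets are swapped in barely move the average, but they define the slowest-1% bucket.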

With RT on the 4060Ti 16GB is pretty decent.



It is actually pretty close to the frame rate the 4060 Ti 8GB offers with RT off. So here, choosing the 16GB variant of a card with identical RT and compute capabilities is the difference between turning RT on and maintaining a 60+ FPS average, or needing to keep RT off to do the same.

This is a clear sign of things to come. We have seen it before, and if you want to deny reality then be my guest, but whatever.
 

Elfear

Diamond Member
May 30, 2004
7,106
677
126
I'll also toss out that texture packs for games are a popular thing, and can really make a difference in visuals, with little extra compute cost, but a correspondingly large increase in VRAM requirements.
100% There have been multiple times I've used hi-res texture packs in games (mainly Bethesda) and the GPU wasn't challenged much more but the VRAM took a big hit. Some low-VRAM cards just weren't up to the task even though the GPU was more than capable.
 

MrPickins

Diamond Member
May 24, 2003
9,019
586
126
100% There have been multiple times I've used hi-res texture packs in games (mainly Bethesda) and the GPU wasn't challenged much more but the VRAM took a big hit. Some low-VRAM cards just weren't up to the task even though the GPU was more than capable.
For sure.

One of the first things I did when I upgraded from an HD7950 (3GB) to an RX580 (8GB) was to install the prettiest texture packs I could find on Skyrim, and it looked gorgeous.
 

MrTeal

Diamond Member
Dec 7, 2003
3,584
1,743
136
Man, where'd they even get an 8GB 6500 XT? I've never seen one in stock anywhere. The 4GB 6500 XT leaves performance on the table, but that's somewhat mitigated by its $140 price point. It would certainly be more attractive with 8GB, even if that added an extra $20. It can't cost much more than that, though, or it's impossible to justify against something like the 8GB 6600, which you can get for $190 at Newegg and Amazon and is twice as fast.
 