> I no longer believe that N31, assuming it hits 2.5x the 6900 XT (N21), will have a big performance lead, if any, over AD102, simply because there's another 20% in the tank for the 4090, which is apparently 2x the 3090, so roughly 2x the 6900 XT, and they could unlock the full die at 600 W to get to 2.2-2.3x the 3090. Will it be efficient? No, but they'd have a product that can at least tie N31 in benchmark comparisons. Mindshare = protected, and that's all JHH cares about.

Aren't the AMD claims something like 3x Navi 21? If that's true (which I personally doubt), AMD might have a significant performance lead this gen. I guess we'll see.
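Back-of-envelope, with everything normalized to the 6900 XT = 1.0 (every multiplier below is a rumor from this thread, plus my assumption that a 3090 is roughly 6900 XT-class at raster):

```python
# Rough speculation math, normalized to 6900 XT (N21) = 1.0.
# Every multiplier here is a rumor from the thread, not a benchmark.
n21 = 1.0
rtx_3090 = 1.0                      # assumed roughly 6900 XT-class at raster
n31 = 2.5 * n21                     # rumored N31 target
rtx_4090 = 2.0 * rtx_3090           # "apparently 2x 3090"
full_ad102_600w = 2.25 * rtx_3090   # unlocked die at 600 W, midpoint of 2.2-2.3x
print(f"N31 ~{n31:.2f}x vs full AD102 ~{full_ad102_600w:.2f}x vs 4090 ~{rtx_4090:.2f}x")
```

On those numbers it's basically a wash at the top, which is all the mindshare play needs.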
Also, I knew those absurdly high TDP numbers weren't going to hold. They're still too high, however. Hopefully both NVIDIA and AMD focus on lowering TDPs and power consumption going forward. Energy is expensive, renewable energy is limited, climate change is a thing, yada yada.
Anyway, I'm not trying to be political. My current GPU doubles as a space heater when gaming. If power draw keeps inching up, next-gen cards will be unbearable, to say nothing of the PSU requirements. And many homes in the US only have 15-amp circuits. 🙃
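For the curious, here's roughly what that circuit limit means in practice (standard US 120 V / 15 A numbers; the 80% continuous-load derating is the usual NEC rule of thumb, and the 200 W for other loads is just a guess):

```python
# What a standard US 15 A / 120 V household circuit leaves for a gaming PC.
# The 80% continuous-load factor is the common NEC rule of thumb.
volts, amps = 120, 15
breaker_w = volts * amps              # 1800 W absolute breaker limit
continuous_w = breaker_w * 0.80       # ~1440 W safe continuous draw
other_loads_w = 200                   # rough guess: monitor, speakers, lamp
print(f"~{continuous_w - other_loads_w:.0f} W of headroom for the whole PC")
```

A 600 W GPU plus the rest of a high-end system starts eating uncomfortably into that ~1240 W.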
> Oh right, I only saw that tweet just now. I still wouldn't be surprised if it's just the announcement, with the actual release coming weeks later.

He said mid-July in another tweet.
I guess the claims of N31 being uber-powerful and of Nvidia having to push AD102 over 600 W just to match it can be thrown out entirely. (Not that they were believable to begin with.)
The rumor was 2.5x performance. This so-called 4090 is "only" 2x. The 600 W monster is presumably the 4090 Ti.
> That's 4090 vs 3090? Raster only? No RT or DLSS 3.0?

My assumption right now is an ~80% improvement across the board.
The rumors for the flagship AMD card had a pretty big reduction in shader count. I almost wonder if there was some confusion over a CDNA part that has more raw compute than the consumer products.
Yeah, but the latest "rumors" also had the clocks being higher, so it ended up in the same teraflops range as the old leak.
That's 4090 vs 3090? Raster only? No RT or DLSS 3.0?
The TF projection fluctuated as rumors came in... The earlier theory was mid-90s TF. Now they're saying 75-ish. That's still over 3x the 6900 XT.
For reference, the 6900 XT is 2x the 5700 XT in performance, but the TF improvement is 2.36x (23 TF vs 9.75 TF), for an "efficiency" of ~85%. We'd probably see even lower efficiency at higher TF counts, so 2.5x seems reasonable. We'll see how much extra TF-to-fps efficiency AMD can claw back with an updated architecture.
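Same math in code, with the ~75 TF rumor from above plugged in; carrying the 85% efficiency forward is pure guesswork and probably optimistic:

```python
# The TF-to-fps "efficiency" math from above, plus a projection for N31.
tf_5700xt, tf_6900xt = 9.75, 23.0
perf_ratio = 2.0                       # 6900 XT ~= 2x 5700 XT in games
efficiency = perf_ratio / (tf_6900xt / tf_5700xt)   # ~0.85

tf_n31 = 75.0                          # current rumor; pure speculation
projected = (tf_n31 / tf_6900xt) * efficiency       # ~2.8x, likely optimistic
print(f"efficiency ~{efficiency:.0%}, N31 projection ~{projected:.1f}x 6900 XT")
```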
Or the FP32 TFLOPS number could just be inflated, Nvidia-style, and then all the performance predictions go out the window. Especially since N31 no longer seems like a monster GPU with 800 mm² worth of silicon, but just one CCD, according to the rumors.
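For context on what "Nvidia-style" inflation looked like last time: Ampere doubled the on-paper FP32 rate per SM, so the TFLOPS ratio badly overstated the real gain (ballpark figures below; the ~1.5x perf ratio is my rough recollection of 4K raster results):

```python
# How Ampere's on-paper FP32 TFLOPS overstated real gains (ballpark numbers).
tf_2080ti, tf_3090 = 13.4, 35.6        # on-paper boost FP32 TFLOPS
tf_ratio = tf_3090 / tf_2080ti         # ~2.7x on paper
perf_ratio = 1.5                       # ~1.5x at 4K raster, give or take
print(f"TF ratio ~{tf_ratio:.1f}x, real ~{perf_ratio}x, "
      f"efficiency ~{perf_ratio / tf_ratio:.0%}")
```

If RDNA3's dual-issue story is anything similar, the 75 TF figure tells us much less than the old leaks did.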
> I thought that this improvement only applies to the top tier. The shader increase in lower tiers is supposed to be more moderate. Win the crown at all costs, but lesser chips have to sell in reasonable quantities.

Just raster, but across the range, so also 4080 vs 3080, 4070 vs 3070, etc. I don't particularly care if they overclock some chips to the moon, where the additional power consumption is huge for relatively little gain. That's not real gains, just abuse of poor silicon that never hurt anyone.
Perhaps they'll surprise me, but going simply by the density gain from the new process node plus some modest architectural improvements, that seems more realistic.
If they add some extra RT hardware that yields a 2-2.5x speedup in an artificial DLSS benchmark playing to the strengths of the improvements, then they can claim 'up to 2-2.5 times faster', but it's not going to be 2-2.5 times faster overall.
I thought that this improvement only applies to the top tier. The shader increase in lower tiers is supposed to be more moderate. Win the crown at all costs, but lesser chips have to sell in reasonable quantities.
> It will be what it will be.

It's not a true generational improvement if they brute-force it and it thus doesn't work for 'normal' cards.
It will be what it will be.
> Let me offer some solace.

Expensive as all hell? My thoughts exactly.