Discussion RDNA4 + CDNA3 Architectures Thread


DisEnchantment

Golden Member
Mar 3, 2017
1,615
5,869
136





With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off.
Usually AMD takes around three quarters to get support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier's).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in LLVM review chains (before getting merged to github), but I am not going to link AMD employees.

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later.
Although I believe Hopper had the problem of no host CPU capable of PCIe 5 arriving in the very near future, so it might have been pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it.

This is nuts, MI100/200/300 cadence is impressive.



Previous thread on CDNA2 and RDNA3 here

 
Last edited:

Saylick

Diamond Member
Sep 10, 2012
3,194
6,492
136
they don't want to compete.
That's not enough.
I almost feel like AMD needs 800mm^2 of silicon to have a comfortable margin over whatever Nvidia cooks up, because everyone knows Nvidia will have a 600mm^2 behemoth. It will be interesting either way, because Jensen doesn't like losing and will stop at nothing to win, even if it means 600W GPUs (melting power connectors be damned).
 

adroc_thurston

Platinum Member
Jul 2, 2023
2,328
3,151
96
I almost feel like AMD needs 800mm^2 of silicon to have a comfortable margin over whatever Nvidia cooks up, because everyone knows Nvidia will have a 600mm^2 behemoth. It will be interesting either way, because Jensen doesn't like losing and will stop at nothing to win, even if it means 600W GPUs (melting power connectors be damned).
800mm^2 is really low.
Double that.
 

Saylick

Diamond Member
Sep 10, 2012
3,194
6,492
136
800mm^2 is really low.
Double that.
You must be implying an MI300-style configuration for 1600mm^2 of total silicon then, because there's no way a good portion of that 1600mm^2 isn't on an older node in the form of active base die(s). If the compute dies sit over an area roughly equal to the base dies, then 800mm^2 of cutting-edge silicon (read: silicon with actual compute) is in the ballpark of what I was saying.
 

adroc_thurston

Platinum Member
Jul 2, 2023
2,328
3,151
96
You must be implying an MI300-style configuration for 1600mm^2 of total silicon then, because there's no way a good portion of that 1600mm^2 isn't on an older node in the form of active base die(s). If the compute dies sit over an area roughly equal to the base dies, then 800mm^2 of cutting-edge silicon (read: silicon with actual compute) is in the ballpark of what I was saying.
MI300 is 2.3k mm^2, a fair bit bigger.
And yeah, when you spam, you spam.
 
Reactions: Tlh97 and Saylick

Saylick

Diamond Member
Sep 10, 2012
3,194
6,492
136
yes, that's the only way for MSS to go up.
Shotgun the comp, bribe the shills to sing songs of the amazingness of your products.
Well, hopefully AMD develops a smarter upscaler by 2026 so that RDNA 5 doesn't have some big asterisk next to it in reviews, because we all know Nvidia will market their GPU as beating this purported behemoth due to software "trickery".
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,141
1,089
136
The vibe I get here is that AMD is a quitter in the GPU arms race. I see AMD as the third wheel that has made a ton of money over the last 3 or 4 years. They need to invest some of that money into their GPU drivers. Then they need to stop charging premium prices for products that miss the mark compared to what Nvidia brings generation after generation.

I get that most people here buy the high-end cards. The March Steam survey shows the 3060 and 2060 as the top two graphics cards, with 10.69% of the overall survey. Further down the list are the 3070 and 4070.

For AMD it would have made the most sense to have a better-performing 7600XT from the beginning. That would have meant a 7600XT on N5 silicon instead of N6, and better GDDR6 memory than Nvidia at each graphics tier. Memory bandwidth is huge when it comes to OCing, or just having stock GDDR6 at fast speeds: think 15-20% performance gains just from OCing the memory. It also seems AMD puts firmware/driver limitations on their GPUs to gimp or cripple performance when it comes to OCing. The 7900 GRE, which just had its GDDR6 memory speed limits lifted, is one recent example.

It seems AMD is in some cases selling the same GPU and limiting performance with firmware/vBIOS caps, which raises questions about their margins.
 

adroc_thurston

Platinum Member
Jul 2, 2023
2,328
3,151
96
The vibe I get here is that AMD is a quitter in the GPU arms race
they're literally building the monstrosity called Navi50, and that's before even mentioning their GPGPU efforts (fancy!).
For AMD it would have made the most sense to have a better performing 7600XT from the beginning. That would have meant a 7600XT on N5 silicon instead of N6.
way to miss the point of N33's existence!
 

Mahboi

Senior member
Apr 4, 2024
341
574
91
More ray-box intersection capability just means you can get through more box intersection tests per cycle. That's it.
Right, I'm thinking in software ways, not in GPU ways.
I was doing more reading and came across this article. It helped me understand some of the theory.
Right so I wrote a super long response before reading that, you should've posted that first!

So it works differently from game physics.
In game physics, if you had to test a dynamic object (e.g. a bullet) against every other dynamic object in the game, you would do hundreds of detections per bullet per tick. Say you have 50 players, each with a mesh made of 20 hitboxes, in your Battlefield-like game: you don't want to check all 50 × 20 = 1000 hitboxes for every bullet. That is unusable, so physics engines divide the space into smaller bounding volumes that bring collision detection down to a more bearable number of checks. If you divide a 3D world through bounding volumes, you create 8 subspaces from 1, each of which gets divided into 8 again, and so on. That's how it would work for physics. But RT doesn't work like that.
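The broad-phase idea above can be sketched in a few lines (a toy illustration with made-up names, treating objects as points rather than hitbox sets): each octree level splits a cell into 8 children, so pair tests only happen between objects that land in the same leaf.

```python
import random

def octree_cells(objects, lo, hi, depth):
    """Recursively bucket point objects (x, y, z) into octree leaf cells."""
    if depth == 0 or len(objects) <= 1:
        return [objects]
    mid = tuple((a + b) / 2 for a, b in zip(lo, hi))
    buckets = {}
    for p in objects:
        key = tuple(p[i] >= mid[i] for i in range(3))  # which of the 8 octants
        buckets.setdefault(key, []).append(p)
    cells = []
    for key, group in buckets.items():
        c_lo = tuple(mid[i] if key[i] else lo[i] for i in range(3))
        c_hi = tuple(hi[i] if key[i] else mid[i] for i in range(3))
        cells.extend(octree_cells(group, c_lo, c_hi, depth - 1))
    return cells

random.seed(0)
objs = [tuple(random.uniform(0, 100) for _ in range(3)) for _ in range(1000)]
leaves = octree_cells(objs, (0.0, 0.0, 0.0), (100.0, 100.0, 100.0), depth=3)

naive_pairs = 1000 * 999 // 2                                  # 499500 checks
local_pairs = sum(len(c) * (len(c) - 1) // 2 for c in leaves)  # far fewer
print(naive_pairs, local_pairs)
```

With 1,000 objects the naive all-pairs count is 499,500 checks, while the per-leaf count collapses to a tiny fraction of that; real engines use bounding boxes and handle objects straddling cell boundaries, which this sketch ignores.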

I didn't pay close attention at first, but this structure shows that you're not dividing by 8, which makes no sense if you're dividing a world based on its center, like physics engines do.
RT divides by whatever seems sensible, say 2 by 2 or 4 by 4. That again makes little to no sense in the context of a 3D world, but makes perfect sense in the context of a light in a 3D world.

If you visualise the BVH as a division not of the 3D space but of the space in front of a directional light, it makes a ton more sense.

The light doesn't require full space division. It just needs to have a structure that allows testing multiple ray collisions effectively. The wider you get, the more collisions you do per cycle, at the cost of more computation I expect.
So I got what you were saying, just took me a bit after I had a bad night's sleep.
 
Reactions: Tlh97 and cherullo

ToTTenTranz

Member
Feb 4, 2021
41
78
61
All of them. N33 too.

I doubt the Navi33 on N6 was supposed to reach much higher clocks than the RDNA2 predecessors on N7, considering the massive jump in clocks these had already brought compared to RDNA1.
That the RX 7600 XT on N6 averages >2.7GHz in games, almost +1GHz over the RX 5700 XT on N7, sounds super impressive to me. According to TSMC, N6 vs N7 should provide only 7% higher clocks at ISO power.

The big problem is that the N7->N5 transition should have brought at least +15% clocks, yet N31/N32 had ~+0% over N21/N22.


Now if RDNA4 solves the clock problems that RDNA3 had, plus if it's using N4X, then we should be looking at +15%^2 between RDNA2 and RDNA4.
So if RDNA2 at N7 could average at ~2.4GHz with ease, then the RDNA4 chips on N4X should average (not boost) at 3.1GHz.
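As a quick sanity check on that arithmetic (the +15% per node step and the 2.4GHz baseline are the assumptions above, not vendor figures):

```python
rdna2_avg_ghz = 2.4        # assumed RDNA2 average game clock on N7
per_step_gain = 1.15       # assumed +15% per node step (N7 -> N5-class -> N4X)
rdna4_est = rdna2_avg_ghz * per_step_gain ** 2
print(round(rdna4_est, 2))  # ~3.17 GHz, i.e. the ~3.1GHz average claimed above
```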



The vibe I get here is that AMD is a quitter in the GPU arms race.
AMD wasn't a quitter in 2008 when they held back on a top-end product but launched the cost-focused RV770 cards that wouldn't compete on the high-end against the massive Tesla GT200 with a 512bit bus. On the contrary, it was a period when AMD gained a lot of marketshare for releasing a product with much better price/performance than the competition.

They weren't quitters when they launched the Polaris family in 2016 that was actually pretty well received and also never had anything to compete in the high-end.

And they weren't quitters again when they launched the RDNA1 family in 2019 with their highest-end at $400 that had nothing to compete with the $1200 RTX 2080 Ti.


Interestingly, save for the R300 / Radeon 9700 era and the crypto-craze anomalies, it's whenever AMD focuses on fewer SKUs without competing at the high-end that they've been able to recoup marketshare.




Of course, not competing at the high end brings problems for brand value, product ASP and raw margins. However, AMD desperately needs to increase dGPU marketshare at the moment. Sitting at 10-15% marketshare probably makes it very hard to be profitable while spending all the money needed to put new chips out there.
 

Mahboi

Senior member
Apr 4, 2024
341
574
91
I doubt the Navi33 on N6 was supposed to reach much higher clocks than the RDNA2 predecessors on N7, considering the massive jump in clocks these had already brought compared to RDNA1.
That the RX 7600 XT on N6 averages >2.7GHz in games, almost +1GHz over the RX 5700 XT on N7, sounds super impressive to me. According to TSMC, N6 vs N7 should provide only 7% higher clocks at ISO power.
For the 100000000000000000000000000000th or so time, the core problem in RDNA 3 is voltage handling, or some part of the power handling within the arch itself.
Meaning every single RDNA 3 product from a 7640hs to the XTX is clocked necessarily about 20% lower than it should've been and consumes way more power.
The announcement was "50% more perf, 50% better power efficiency". It was a bold lie and the real announcement should've been "50% more perf, 0% better power efficiency".

If it's got RDNA 3 in it, you can be sure that it is underclocked to cover for the horrid power draw. If you don't believe me, simply get a 7600 or 7600 XT, watt it up to 300W, and see if you can't get 3.1GHz easily.
The node is irrelevant in this. The arch itself is clocked to compensate for the electrical problems. If it's factory clocked at 2.5GHz and 170W, you can be sure that it could do 3GHz if you were willing to feed it 300W.
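For what it's worth, the 2.5GHz/170W vs 3GHz/300W numbers are roughly consistent with the usual first-order model, where dynamic power scales with f·V² and voltage rises roughly linearly with frequency on the steep end of the curve, giving P ∝ f³ (a generic rule of thumb, not an RDNA3 measurement):

```python
base_ghz, base_watts = 2.5, 170.0   # the factory operating point cited above
target_ghz = 3.0
# P ~ f * V^2, with V ~ f on the steep part of the V/f curve  =>  P ~ f^3
est_watts = base_watts * (target_ghz / base_ghz) ** 3
print(round(est_watts))  # → 294, in the ballpark of the 300W figure above
```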
AMD wasn't a quitter in 2008 when they held back on a top-end product but launched the cost-focused RV770 cards that wouldn't compete on the high-end against the massive Tesla GT200 with a 512bit bus. On the contrary, it was a period when AMD gained a lot of marketshare for releasing a product with much better price/performance than the competition.
Nobody's saying AMD are quitters. I'm saying AMD's policy of desperately pinching every penny for their products and releasing only when they are sure of comfortable margins isn't getting them any appreciation.
Nvidia releases fat, expensive monolithic dies way bigger than anything AMD has done in over a decade. They don't care if it's "not as economical". This is a company philosophy problem, not a technical one. Nvidia goes big or goes home, AMD goes as small as possible and that's getting tiring.
Interestingly save for the R300 / Radeon 9700 era and the crypto craze anomalies, it's whenever AMD focuses on less SKUs without competing at the high-end that they've been able to recoup marketshare.
That's not surprising since Nvidia's "go big" philosophy tends to build large, expensive products and to then find the markets to pay for them.
Whereas AMD's "maximise value" policy tends to build reasonable, smaller dies that are meant to satisfy 80% of the market instead of pushing the market their way.
The former makes for great top dies, while the latter makes for great midrange and low size stuff.

I should write a complete explanation of why Nvidia keeps winning against AMD, because apparently nobody has noticed yet: it's not about the product, it's about the way you get it sold. Nvidia selfishly pushes the market and tech where it wants to: Fermi and CUDA, later VR, streaming and AI. Jensen's general effort has been to provide things to a sometimes non-existent market and then to hype the heck out of it. This is a highly risky strategy, because you're basically creating something out of pure will, and it is kind of an obnoxious thing to push everyone to do things your way, but clearly Jensen is very good at it.

AMD meanwhile patiently waits for Nvidia to innovate and follows, or for Sony to make a request, or for a market to present itself. This is a fundamental difference in company culture, and it is what thoroughly devalues adroc's or branch_suggestion's opinions that "AMD will just build the biggest, fattest GPU and they'll just win". I have never seen AMD make an unreasonably fat and risky thing unless they were 100% sure it would sell. This is why they're always N°2: the competition sees the prey and leaps, while AMD waits until the prey has been correctly identified in the bush, mapped, weighed, geolocated by satellite and genetically tested, and only jumps after every precaution has been taken.

To be clear: whatever AMD puts out with RDNA 5, I just expect Jensen to go "go bigger, go harder, go even if it is stupid, but just go".
Of course, not competing at the high end brings problems for brand value, product ASP and raw margins. However AMD desperately needs to increase dGPU marketshare at the moment. Being in the 10-15% marketshare probably makes it very hard to be profitable with spending all the money needed to put new chips out there.
I don't know; last I looked, client GPU still brought in some profits. I honestly think that AMD's game right now is to navigate by sight and follow where Nvidia goes while penny-pinching every single step of the way, outside of a few techs that unfortunately aren't important enough to really move the market.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,141
1,089
136
Interestingly, save for the R300 / Radeon 9700 era and the crypto-craze anomalies, it's whenever AMD focuses on fewer SKUs without competing at the high-end that they've been able to recoup marketshare.
I had the Radeon 9700 Pro, 9800 Pro, GeForce 256 (the original Nvidia card), GeForce 2, GeForce 3 Ti 200 (capacitors blew on that card), HD 3850, and then the 8800GT that brought me years of gaming performance paired with my Q6600 @ 3.6GHz and 8GB of DDR2 1000MHz. My HD 7950 Sapphire still works.
 
Aug 4, 2023
177
375
96
You can pinpoint the exact moment the mindshare machine won.
Good old Maxwell. It was certainly impressive, but even it didn't quite deserve the success it got.
See, when AMD got the console market monopoly, the whole PCMR movement began... Coincidence? I really don't think so; NV certainly had a hand in it.
Basically, a mountain of compounding factors around that time absolutely massacred AMD from then on.
I should make a complete explanation of why Nvidia keeps winning against AMD because apparently nobody noticed yet
I'll do it for you. When your only competitor has to clean house to survive and tread water to resurrect their CPU division, things tend to get lopsided.
NV serves fewer markets, but markets they believe in and try to prop up. They were on the brink of collapse at a few points; had they failed to bounce back from the 1-2 punch of Bumpgate/Thermi, they would probably have been forced to shelve CUDA development, among other things, to remain solvent.
They took the risky approach of burning profits for market capture in the hope they blow up, or don't miss a big break.
AMD had no such luxury: they serve more markets and chose to prop up CPU at the cost of GPU for 5 years or so. Software was of course kept to what was necessary, as they couldn't dedicate enough resources to go on punitive quests. Mantle was handed over to become the basis of current APIs; ROCm was really ambitious initially, was then gutted, and is now back to being ambitious since it is no longer constrained by money and other resources. Raja-era Radeon did have a lot of big ideas that failed to make inroads, but they really were trying to find a way to get even: HBM took longer to take off than hoped, stuff like SSD caching was always a bit too niche and would have required a bunch of dev support, and so on.

What AMD is, is years behind in developing some things, but thankfully catching up is far faster than the initial pathfinding their competitor had to do. Open-sourcing the more problematic stuff has helped a lot over the years. The battleground is set for the best from both sides to duke it out, no more caveats.
 

Mahboi

Senior member
Apr 4, 2024
341
574
91
I'll do it for you.
I don't think you're the right person to describe AMD objectively. All you do is justify everything they do.
When your only competitor has to clean house to survive and tread water to resurrect their CPU division, things tend to get lopsided.
Irrelevant, CUDA is the winning piece for Nvidia since 2008, that's a long time before Bulldozer.
NV serves fewer markets, but markets they believe in and try to prop up. They were on the brink of collapse at a few points; had they failed to bounce back from the 1-2 punch of Bumpgate/Thermi, they would probably have been forced to shelve CUDA development, among other things, to remain solvent.
And that's exactly what I'm saying, they take risks, sometimes very large ones. AMD takes technical risks, not market risks.
They took the risky approach of burning profits for market capture in the hope they blow up, or don't miss a big break.
AMD had no luxury, they serve more markets and chose to prop up CPU at the cost of GPU for 5 years or so. Software was of course kept to what was necessary as they couldn't dedicate enough resources to go on punitive quests, like Mantle was handed over for the basis of current API's, ROCm was really ambitious initially and was then gutted, but is now back to being ambitious again since and is no longer constrained by money and other resources. Raja era Radeon did have a lot of big ideas that failed to make inroads, but they really were trying to find a way to get even. HBM took longer to take off than hoped, stuff like SSD caching was always a bit too niche and would require a bunch of dev support, stuff like that.

What AMD is is years behind in developing some things, but thankfully catching up is way faster than doing the initial pathfinding that their comp has done. Open sourcing up the more problematic stuff has helped a lot over the years. The battleground is set to see the best from both sides duke it out, no more caveats.
Yaddi yadda, I heard all that.

You're justifying AMD's state on circumstances. I'm looking at causes.
The corpo that takes hard chances in a growing market is winning. The corpo that waits for trends and follows is N°2. That's just how an economy works.
AMD has hardware excellence but is holding back on risky investments. They were always like that and outside of trying some really fly things on the technical side, they never really went hard on big ventures. It's just not in their culture.

Let's be real for a second: if Intel hadn't fumbled the ball so hard they somehow shot it down their own throat, if they hadn't been choking on bloated cores and bloated internal designs for 5+ years, the game wouldn't have changed one bit. Zen would be a cheaper, decent, reasonable and cost-effective arch, and Radeon would be the cheaper GPUs vs GeForce. That's it. Nothing has really changed in their attitude since 2010. They got a huge break with Zen/Zen 2 and onwards, and it is currently their breadwinner in every possible way outside of MI300/AI. Everything else is doing pretty decently, nothing explosive.

AMD essentially had the lucky break of Intel getting bloated and careless, and pounced. Since then, everything has been going great for them. But they're not market leader in their heads yet. And certainly not in GPUs. They play it safe mostly, take little risks and hardly invest really heavily.
 
Reactions: Saylick
Aug 4, 2023
177
375
96
AMD meanwhile patiently waits for Nvidia to innovate and follows, or for Sony to make a request, or for a market to present itself. This is a fundamental difference in company culture, and it is what thoroughly devalues adroc's or branch_suggestion's opinions that "AMD will just build the biggest, fattest GPU and they'll just win". I have never seen AMD make an unreasonably fat and risky thing unless they were 100% sure it would sell. This is why they're always N°2: the competition sees the prey and leaps, while AMD waits until the prey has been correctly identified in the bush, mapped, weighed, geolocated by satellite and genetically tested, and only jumps after every precaution has been taken.
Build the best and they will come.
Literally the corporate strategy of AMD in a nutshell.
It doesn't matter what the market is; the intent is the same. AMD throughout its history has been a follower or second source, because that is always a sound plan.
It doesn't matter who opens up a new market; nobody is untouchable. NV are very opportunistic because they are always chasing the highest growth possible by throwing stuff at the wall until it sticks, a strategy that can backfire and does leave room to be exploited. At the end of the day, most of it gets usurped by more focused companies until they are ultimately left with their core business. Hardware is king, forever and always. Third-party or open-source stuff from ISVs is always more palatable than becoming slaves to a single all-in-one vendor.

Nobody got fired for buying IBM, Intel or... It is the same cycle, just a new subject. AMD might take a decade, but the goal is to out-execute and win the most TAM without burning ISV/OEM/ODM et al. bridges.
 

Mahboi

Senior member
Apr 4, 2024
341
574
91
Might I add, RDNA 4 is exactly in line with the typical AMD philosophy.
If Jensen had been told: "Boss, N41, N42 and N43 are all going to be late to market" or "they're all below performance targets", he'd have literally laughed and said "we'll see what marketing can do about it". He wouldn't have cared one second about selling a second grade product and would have focused on whatever strong points it had, even if it was about spamming endless drivel about the Greatness of DLSS for 2 years. Remember Thermi? He sold that. Even made a lot of profits off of it.

Instead, AMD behaved in typical AMD fashion: the products will turn out late or not that good? Just cancel them. Just release smaller stuff. Don't take risks, don't release things that may be difficult to market.
Chad Leather Jacket Man pushes the market against the wall and goes "you want to buy my GPU".
Virgin Advanced Micro Devices silently stares into the middle distance in front of his sales table for the market to come and see what it has to sell.

That's why I don't buy the whole shtick about "RDNA 5 will obliterate Nvidia through sheer force". Whatever it is that AMD will output, Jensen will go harder and dumber just to retain the crown. It's what he's always done, the guy's basically the Terminator, he never stops spamming bigger stuff. And at the game of "who's going for the more insane product", Jensen will always go harder than Lisa.

I don't doubt RDNA 5's engineering, I doubt the company's going to have the nuts to just go out and smash Jensen in the face with enough force that it'll actually do damage.
 
Aug 4, 2023
177
375
96
I don't think you're the right person to describe AMD objectively. All you do is justify everything they do.
True, but I'm not unique. Look at how some people justify Intel or NV; by comparison, I'm quite reasonable.
Irrelevant, CUDA is the winning piece for Nvidia since 2008, that's a long time before Bulldozer.
And it took a decade to actually yield noteworthy revenue from that work. They had the luxury to stick it out, not everyone does.
And that's exactly what I'm saying, they take risks, sometimes very large ones. AMD takes technical risks, not market risks.
I'll get back to this.
Yaddi yadda, I heard all that.

You're justifying AMD's state on circumstances. I'm looking at causes.
The corpo that takes hard chances in a growing market is winning. The corpo that waits for trends and follows is N°2. That's just how an economy works.
AMD has hardware excellence but is holding back on risky investments. They were always like that and outside of trying some really fly things on the technical side, they never really went hard on big ventures. It's just not in their culture.
AMD has tried more ambitious things, but they either fail to gain traction in time or fail to execute.
Let's be real for a second: if Intel hadn't fumbled the ball so hard they somehow shot it down their own throat, if they hadn't been choking on bloated cores and bloated internal designs for 5+ years, the game wouldn't have changed one bit. Zen would be a cheaper, decent, reasonable and cost-effective arch, and Radeon would be the cheaper GPUs vs GeForce. That's it. Nothing has really changed in their attitude since 2010. They got a huge break with Zen/Zen 2 and onwards, and it is currently their breadwinner in every possible way outside of MI300/AI. Everything else is doing pretty decently, nothing explosive.
At least AMD's CAGR is consistent, not a hype-fuelled mess like NV's has been through both crypto booms and the like. They will revert somewhat to the mean eventually.
AMD essentially had the lucky break of Intel getting bloated and careless, and pounced. Since then, everything has been going great for them. But they're not market leader in their heads yet. And certainly not in GPUs. They play it safe mostly, take little risks and hardly invest really heavily.
They still would've been okay even against an Intel that didn't have overblown node design goals, but just a bit more muted.
Might I add, RDNA 4 is exactly in line with the typical AMD philosophy.
If Jensen had been told: "Boss, N41, N42 and N43 are all going to be late to market" or "they're all below performance targets", he'd have literally laughed and said "we'll see what marketing can do about it". He wouldn't have cared one second about selling a second grade product and would have focused on whatever strong points it had, even if it was about spamming endless drivel about the Greatness of DLSS for 2 years. Remember Thermi? He sold that. Even made a lot of profits off of it.
The general public is stupid, and someone with an eternal chip on his shoulder sure knows it. RDNA3 has actually performed okay in spite of being meh in every way.
Instead, AMD behaved in typical AMD fashion: the products will turn out late or not that good? Just cancel them. Just release smaller stuff. Don't take risks, don't release things that may be difficult to market.
Chad Leather Jacket Man pushes the market against the wall and goes "you want to buy my GPU".
Virgin Advanced Micro Devices silently stares into the middle distance in front of his sales table for the market to come and see what it has to sell.
That's the difference between a focused and a diversified company: NV doesn't have a choice but to go all in, it is all they have ever known.
That's why I don't buy the whole shtick about "RDNA 5 will obliterate Nvidia through sheer force". Whatever it is that AMD will output, Jensen will go harder and dumber just to retain the crown. It's what he's always done, the guy's basically the Terminator, he never stops spamming bigger stuff. And at the game of "who's going for the more insane product", Jensen will always go harder than Lisa.
NV is behind in what they can cram into a single package, cramming more into a single device has been the #1 focus of AMD R&D ever since Su and Papermaster took over.
I don't doubt RDNA 5's engineering, I doubt the company's going to have the nuts to just go out and smash Jensen in the face with enough force that it'll actually do damage.
It is a GDDR DC accelerator part plus a halo client part, so any risk of flopping has been covered. Things in semicon take time; remember that a year ago the market tanked badly, and anybody who invested hard took a big hit. Patience is a virtue. I have zero doubt that the greater wisdom is not on the side of a guy who cannot help but be envious of what isn't his.
It doesn't work that way. I can only write it so many times.
But it already has worked, just not in every BU yet.

In summary, one company wants to take what people do and do it better than anyone else, and the other wants people to do what it wants them to do, so nobody else can.
It really is the fundamental difference in psychology between the CEOs. My beef will always be with JHH for being a flawed human being; his company has always been an outward reflection of him, and so I support anybody who wants to bring them back down to earth. That is where I stand.
His zealots are much the same, losers who so desperately want to be on the winning team, that they will sacrifice all moral and ethical concerns to do so, because that totally never backfires, right?
 
Jul 27, 2020
16,468
10,485
106
Geforce 3 Ti 200 (capacitors blew on that card)
I had the ASUS model. No problems to report, even overclocked. Sold it to upgrade to a relatively cheap Radeon 9000-series card (possibly a 9500 something). It was my first Radeon and my foray into the exciting world of DX9 with all those furry animal demos (visuals as good as those demos are still not the minimum that developers aspire to in their games!).
 