Why are you assuming everyone needs a powerful dGPU, just because they have a powerful CPU? They handle completely different types of workloads.
Intel does not seem to agree with you either; otherwise they would not have made sure the ARL desktop CPUs will comply with the 40 TOPS AI-PC requirement without having to add a separate dGPU. It's an indication that they expect a lot of ARL desktop systems to be built without a dGPU.
> That is just so incredibly dumb for fixed hours work. They do it here as well. Awful loud machines though with annoying fans. On permanent dock. 3x as expensive. But when I request a nicer monitor or a nicer mouse, nope. IT people are dumb here.

Not surprised. Same kind here.
> Well then, why don't you list some consumer workloads that require 8+ core CPUs that don't also benefit from having a dGPU and we can determine how important that market is?

Non gaming enterprise software development? I know many use laptops for software development, but a powerful multi-core CPU makes a big difference in productivity.
> Non gaming enterprise software development? I know many use Laptops for software development, but a powerful multi-core CPU makes a big difference in productivity.

Not just enterprise. Most fields of software development besides games have little to no use for a powerful dGPU.
> Non gaming enterprise software development? I know many use Laptops for software development, but a powerful multi-core CPU makes a big difference in productivity.

Depends on what you do, even within software development. I'm a web developer myself, and outside of my specific role working on E2E tests (where more cores come in handy, letting me do parallel runs), the rest of the team doesn't really have much use for that kind of parallelism. And if they did, chances are they'd try offloading it to Azure Pipelines instead.
At the end? Seemed to me it went on long ago.
> Non gaming enterprise software development? I know many use Laptops for software development, but a powerful multi-core CPU makes a big difference in productivity.

The amount of applications that extensively use multithreading is quite small and specialised, aka not a market.
> Not just enterprise. Most fields of software development besides games have little to no use for a powerful dGPU.

I hate to be the Jensen here, but:
> Non gaming enterprise software development? I know many use Laptops for software development, but a powerful multi-core CPU makes a big difference in productivity.

Most companies giving out flex working arrangements or work from home arrangements issue laptops. What doesn't run locally fast enough is offloaded to the cloud.
> In AMD's case most systems that will be running without a dGPU will likely be mobile APU based anyway.

I'd even stretch it a bit further and say that in the next 10-15 years, we will see an increase in SFF, mini and laptop PCs in consumers to the point that APUs with a minimal amount of graphics will be the common option for 90% of non gamers.
Mobile APUs are generally better suited to office-PC style use cases than stuff like Raphael. They idle lower, those kinds of office PCs have no real use for more than the handful of cores you get on mobile anyway, and of course now they'll have NPUs. That's just the more obvious solution.
I feel like you're heavily overestimating how much of the PC market is tower PCs running high end 12+ core CPUs without dGPUs. It's a very, very, very small subset of systems.
> Well then, why don't you list some consumer workloads that require 8+ core CPUs that don't also benefit from having a dGPU and we can determine how important that market is?

Heavy duty Excel files that take half a minute to recalculate. Non-accelerated data science tasks. These need very fast single-core performance, high memory bandwidth, huge memory size (datasets that need to fit in RAM, calling for at least 32GB), and to a lesser extent multithreading. No use for a GPU or NPU. You'd be surprised at how many corporate jobs involve uses like these; think finance. Looking forward to Strix Halo for this.
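A quick back-of-the-envelope for the RAM claim above: a dataset held entirely in memory as 64-bit floats costs rows × columns × 8 bytes, which is how 32GB+ requirements appear quickly. A minimal sketch (the row and column counts below are invented for illustration):

```python
# Rough in-memory footprint of a dense numeric dataset of 64-bit floats.
# The dataset shape is hypothetical, purely to show the arithmetic.

def dataset_gib(rows: int, cols: int, bytes_per_cell: int = 8) -> float:
    """Return the in-memory size in GiB for a dense numeric table."""
    return rows * cols * bytes_per_cell / 2**30

# e.g. 50 million rows x 40 float64 columns:
size = dataset_gib(50_000_000, 40)
print(f"{size:.1f} GiB")  # 14.9 GiB before any intermediate copies

# Spreadsheet/dataframe operations often materialise at least one
# temporary copy, so peak usage can easily double -- hence 32 GB as a floor.
```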
That 40 TOPS number is reached by a combination of CPU and NPU TOPS; it's not just the NPU. The NPU only makes up a small portion, even. No real workload will utilise both at the same time in that fashion, so hitting 40 TOPS that way is really just a marketing thing more than anything else.
Also p.s. Intel doesn't ship their mobile products on desktop the way AMD does, so obviously the desktop products will be designed to be used that way too.
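For reference, peak TOPS figures are usually derived as MAC units × 2 operations per MAC (multiply + add) × clock. A toy illustration of how separate engines get summed into one headline number; all unit counts and clocks below are invented, not any vendor's real figures:

```python
# Peak TOPS = MAC_units * 2 (multiply + add) * clock_Hz / 1e12.
# Every number here is hypothetical, purely to show how a platform
# "combined" figure is assembled from independent engines.

def peak_tops(mac_units: int, clock_ghz: float) -> float:
    return mac_units * 2 * clock_ghz * 1e9 / 1e12

npu = peak_tops(4096, 1.4)  # ~11.5 TOPS from the NPU alone
gpu = peak_tops(8192, 2.0)  # ~32.8 TOPS from the iGPU
cpu = peak_tops(1024, 3.0)  # ~6.1  TOPS from CPU vector units

print(f"platform total: {npu + gpu + cpu:.1f} TOPS")

# No single workload saturates all engines simultaneously, which is
# why a summed platform figure is mostly a marketing number.
```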
> I'd even stretch it a bit further and say that in the next 10-15 years, we will see an increase in SFF, mini and laptop PCs in consumers to the point that APUs with a minimal amount of graphics will be the common option for 90% of non gamers.

I agree, the mini PCs from the likes of Minisforum are very attractive. At some point DIY desktop market demand will be so low that the supply of desktop parts should dwindle. That market will remain viable for the ultra high end only.
I've already had my old man change from his 3000K-era CPU to a Cezanne APU in a Minisforum box, and he's had basically nothing bad to say about it. Shoved two 4K monitors on it and he's as happy as can be. I expect a lot of people will go with a mini PC plus one or two 4K 75Hz monitors in the next 10 years...
> I agree, the mini PCs from the likes of Minisforum are very attractive. At some point DIY desktop market demand will be so low that the supply of desktop parts should dwindle. That market will remain viable for the ultra high end only.

Once bothered to run a home server type of thing on a Raspberry Pi 4.
> Depends on what you do even with software development. I'm a web developer myself, and outside of my specific role in working on E2E tests (where more cores comes in handy as it lets me do parallel runs at the same time), the rest of the team don't really have much use that kind of parallelism. And if they did, chances are they'd try offloading it to Azure pipelines instead.

I also do software development as a profession as well as a hobby; you can certainly work with 4 cores, but it is torture, especially if you are used to 8+ cores. You can run a few services remotely to ease the local load, but running all of them locally improves productivity (otherwise, when you fix a bug, you push the code, wait for the build server to complete, etc.). There are nearly a hundred back-end services, database servers, a web server, and multiple open IDEs; in these types of scenarios a multi-core CPU helps a lot. At least for me.
Very few developers are doing things like compiling massive codebases on the regular, which is where that extra CPU brunt would come in handy.
But I do see where you're coming from, that is a valid usecase. My point is just that it's probably a bit more niche than just a generic "software developers" answer.
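For what it's worth, the parallel-test and big-compile case above is easy to model: N independent CPU-bound jobs split across a process pool, so wall time scales roughly with core count. A toy sketch in Python (the workload function is a made-up stand-in, not anyone's real test suite):

```python
# Toy model of why core count matters for parallel test/compile jobs:
# N independent CPU-bound tasks spread over a process pool.
from concurrent.futures import ProcessPoolExecutor

def fake_test(seed: int) -> int:
    """Stand-in for one CPU-bound test file or compilation unit."""
    acc = seed
    for _ in range(200_000):
        acc = (acc * 1103515245 + 12345) % 2**31  # cheap busy-work
    return acc

def run_suite(task_count: int, workers: int) -> list[int]:
    """Run all tasks across `workers` processes, preserving order."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fake_test, range(task_count)))

if __name__ == "__main__":
    # With 16 tasks, an 8-core box finishes in roughly two "waves",
    # while a 4-core box needs four -- wall time scales accordingly.
    results = run_suite(task_count=16, workers=8)
    print(len(results))  # 16
```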
> I also do software development as profession as well as hobby, you can certainly work with 4 cores but it is torture especially if you are used to 8+ cores.

When corporate IT writes a call for bids for workplace PCs for staff like yourself, would they put both
– more than 8 cores
and
– Microsoft's new sticker
into the requirements list?

> When corporate IT writes a call for bids for workplace PCs for staff like yourself, would they put both "more than 8 cores" and "Microsoft's new sticker" into the requirements list?

Unfortunately 8 cores is the maximum for now. But my work laptop is only a 4-core, 8-thread Intel and it is a pain to use compared with my 5950X PC. No idea about the sticker.
> That 40TOPs numbers is reached by a combination of CPU and NPU TOPs, it's not just NPU. The NPU only makes up a small portion even. No real workloads will utilise both at the same time in that fashion, so really hitting 40TOPs in that way is really just a marketing thing more than anything else.

NPUs are nothing but MADD engines with operation/memory/IO aligned to GEMM. You could probably run AI on an ATI Radeon 9700 Pro - a very small model with terrible performance, but you could probably run it.
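To make the "MADD engines aligned to GEMM" point concrete: the inner loop of a matrix multiply is nothing but repeated multiply-adds, the one primitive an NPU replicates across thousands of hardware units. A naive sketch:

```python
# Naive GEMM: C = A @ B. The innermost statement is a single
# multiply-add (MADD) -- the operation an NPU parallelises in bulk.

def gemm(a: list[list[float]], b: list[list[float]]) -> list[list[float]]:
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                c[i][j] += a[i][p] * b[p][j]  # the MADD
    return c

print(gemm([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
# [[19.0, 22.0], [43.0, 50.0]]
```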
> Heavy duty Excel files that take half a minute to recalculate. Non-accelerated data science tasks. These need very fast single core, high memory bandwidth, huge memory size (datasets that need to fit in RAM calling for at least 32GB), and to a lesser extent multithreading. No use for GPU or NPU. You'd be surprised at how many corporate jobs involve uses like these, think finance. Looking forward to Strix Halo for this.

I don't disagree with your use cases being valid requirements for single-thread performance. That said, these use cases are a niche within a niche of corporate needs. Most stuff is word processing, accessing the web, and some slides. Even within a finance institution, 99% of what's done can be done on a potato.
I'd love to see more benchmarks with those markets in mind, but I guess the consumers of the content are all gamers?
> The amount of applications that extensively use multithreading is quite small and specialised, aka not a market.

The fact anyone would say this after Zen 1, with its worse single-core performance, managed to claw away decent market share from Intel, is beyond me.
> I don't disagree with your use cases being valid requirements for single thread performance. That said, these use cases are a niche in a niche of corporate needs. Most stuff is word processing accessing the web and some slides. Even within a finance institution 99% of what's done can be done on a potato.

I have worked inside multiple financial institutions and I never ever saw anyone working on a potato machine. In fact everyone that actually had a desk had tower desktops so they could drive > 3 displays. The sheer amount of compute used in these markets really is more than most people would assume at face value. And I should say that this doesn't implicitly speak about CPU performance no, but there is 0% chance these companies buy anything less than the best for their employees.
> The fact anyone would say this after zen 1 with its worse single core performance managed to claw away decent market share from intel, is beyond me.

I work in the finance industry too. The data science teams have pretty chunky tower PC builds, and some even chunkier on-site servers that they log into for bigger modelling. Anything needing vast resources is often done on the largest off-site infrastructure. The headcount of people doing this modelling, even within a finance institution, is tiny compared to the number of support staff, marketers, sales people, product people, legal, and front-line staff.
If you do anything other than use chrome and play fortnite, you will eventually find yourself in a MT bottleneck.
> I have worked inside multiple financial institutions and I never ever saw anyone working on a potato machine. In fact everyone that actually had a desk had tower desktops so they could drive > 3 displays. The sheer amount of compute used in these markets really is more than most people would assume at face value. And I should say that this doesn't implicitly speak about CPU performance no, but there is 0% chance these companies buy anything less than the best for their employees.
Like really I don't think anyone in the world expects less than 16c from AMDs next flagship.
Anyone that thinks MT doesn't matter is either very ignorant or just pretending, perhaps even terminally online. Even outside of finance there are tons of uses for MT performance, and I find myself (slightly) bottlenecked every day. But until costs make sense for client/edge uses, it's not worth the investment most of the time to chase peak MT perf. Mistaking price and barrier to entry for market demand is a fool's errand IMO. The market always wants more performance, but it never works if cost rises faster than the progress. There is a reason, after all, that OEMs aren't shipping tons of Threadripper or Xeon, as much as you or I would love to play with one.
And also, in this thread, people are talking about 40% gains in FP performance with Zen 5 like it is a big deal. That kind of stuff is even more niche than overall MT or ST perf, yet it seems to be very important for a lot of people. I think consumer workloads (from all sales angles) and their impact on relative market shares shouldn't be underestimated.
> I am working in finance as well and I can tell you guys - you DO need a powerful machine to handle everything. Multiple monitors: 3 or more, a large amount of RAM (64 GB is the bare minimum), at least 12 CPU cores, best 16. A Mac Studio is good enough for people like me, but for people who are running multibillion funds you need the horsepower for modelling.

Sure, you find yourself within the set of people who need powerful computers but who are not running anything big enough that the really big on-prem or off-prem devices will be needed. I'm not saying people like you don't exist; they clearly do, and you're one of them.
> Sure, you find yourself within the set of people that need powerful computers but who are not running anything big enough where the really big on or off prem devices will be needed. I'm not saying people like you don't exist, they clearly do and you're one of them.

I actually was confirming what you are saying. IMO, we are not bound by compute anymore; RAM, memory bandwidth and latency are the bigger bottleneck.
That said, if you look at a global bank and their IT needs, how many of the people within that organisation will actually need that grunt? Your department might, but what proportion of the headcount does? A global bank might have 100k employees; how many are modelling vs doing fairly basic tasks? How many of those with big-data needs are going to process that on their PC vs having it done on specialist hardware?
If you want more multi-threaded performance, you probably have the cash for a real multi-threaded machine. Most corporate laptops just need to run the internet and office products at a decent chop.