DrMrLordX
The Wraith Spire was downgraded. The 3600 comes with the Wraith Stealth, which IIRC has always used a full aluminum heatsink.
Oh, my mistake. That cooler's even worse.
I can only hope the 16 thread stuff will compete, will be nice to have some price wars again. Not that I am in the market for one with the recent purchase of a 3900X. I could only look forward to more 24 thread upgrades or just ascend to more cores. I'd rather have 24 insanely fast threads over more cores for at least 5 years. Whether Intel gives me a faster 24 thread option in a few years will be up to them.
The 16 thread stuff will be the big seller, probably by next year, if next gen games finally push past 12 threads. I think they will, and if not, well, if the next gen consoles have 16 threads, why the heck would the PC ports not use at least 14 if not more threads? I guess one could get a 3600/i5 Comet Lake now, then ride that into the 4000/Rocket Lake, then get whatever the best 16 thread option is at that point. I doubt a 3700X/10700K will be a slouch for gaming for at least 3 years. Bet either will do 60fps+ in all games but the worst-coded ones.
As for the thread issue, well, what can ya do besides get a 3900X/3950X? Gonna be a niche market for anyone wanting more than 10 cores, so I guess I could see why Intel capped at 10 cores for now. Well, that and the inferno that would result from 14nm++++. Is Intel dropping more than 10 cores with Rocket or no? Or would this be in the realm of unknown information at this point?
10 cores running at 4.9GHz ACO should consume as much as or more than 12 cores running at 4.6-4.7GHz ACO, because of the larger energy-efficiency penalty at those higher clocks. The ring bus was probably the main barrier here. Rocket Lake is too fat and would, as a result, consume too much to fit 12 cores or more on 14nm for the client desktop.
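A rough way to sanity-check the claim above: to first order, dynamic CPU power scales with core count x frequency x voltage squared. The voltages below are assumptions for illustration only (not figures from this thread), but they show how 10 cores at 4.9GHz end up in the same ballpark as 12 cores at ~4.65GHz:

```python
# First-order dynamic power model: P ~ cores * f * V^2 (switching capacitance folded
# into a constant). The voltages are assumed values for illustration only.
def relative_power(cores: int, freq_ghz: float, vcore: float) -> float:
    return cores * freq_ghz * vcore ** 2

ten_core = relative_power(10, 4.90, 1.30)     # assumed ~1.30 V for a 4.9 GHz all-core clock
twelve_core = relative_power(12, 4.65, 1.20)  # assumed ~1.20 V for a 4.6-4.7 GHz all-core clock

print(f"10c @ 4.90 GHz: {ten_core:.1f} (arbitrary units)")
print(f"12c @ 4.65 GHz: {twelve_core:.1f} (arbitrary units)")
print(f"ratio: {ten_core / twelve_core:.2f}")  # ~1.03, i.e. roughly the same package power
```

Under those assumptions the two configurations draw essentially the same package power, which is the point being made: the extra clock eats whatever headroom the two missing cores would have freed up.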
I will be curious to see some 10 core Intel vs 12 core AMD action. Given they clocked these things to the moon, MAYBE there's a chance it could hold up to or beat the 3900X in some select stuff?
My expectation is it will match the AMD CPU but will need to use 2x the power to do it.
It's still a downgrade from the last-gen cooler (no more copper core, louder fan). @VirtualLarry's 3600 would in no way, shape, or form ever work properly with that cooler. Granted, he does crazy stuff, but whatever.
Assuming the i9-10900 comes with the same HSF Intel's other 65W chips have come with, it's going to be sitting at 2.8GHz on any kind of sustained all-core load.
Yes. But is that so bad? 10 cores @ 2.8GHz for 65W?
That depends how you compare.
It's not great considering the 16-core 3950X in 65W eco mode will deliver those kinds of all-core clocks with better IPC.
Sure, it won't go anywhere near Ryzen 3000 based on the N7 node. They can't magically cover the density gap.
Then again, considering that in the same power budget AMD gives you 8 cores @ 3.6GHz (3700X) the gap isn't as huge as one could think.
Intel is slower, but it isn't slow.
Really, not bad.
And the 5.2GHz single-core boost will make it perfectly smooth in everyday computing (browsing web etc).
I'm not entirely sure why you bring 3950X into this.
... so someone is going to buy a 10c part for general computing and web browsing? The whole point of high-core-count parts is to be able to utilize them. With a 65W power budget, there are many better options from AMD.
People are using 8-core i9-9900 for general computing (be it home or business use), so why wouldn't they buy a 10-core successor (for the same money)?
People might use it for that at times, but you don't pony up that kind of money unless you want real multi-core power for encoding or premium gaming. If ALL you did was general computing, a 9400 or 1600 or something low end would be fine.
Anyway, I can't find any consistent and credible eco mode results for 3950X, so I can't comment on that. It would be great if you can provide something.
The reality is that we have the 12-core 3900 PRO and - limited to 65W - it runs at 3.1GHz (all-core).
So yeah, I guess 3950X should be able to do 2.6-2.8GHz with that power budget. But it's a low-supply, halo product costing nearly twice as much as a 10900...
And I really doubt people buy a 3950X with a plan to run it at 65W, whereas the 10-core 65W i9 will probably be relatively common.
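For what it's worth, here is a back-of-the-envelope version of that 2.6-2.8GHz guess. The ~15W uncore figure and the roughly cubic power-vs-frequency assumption are mine, not numbers from the thread, so treat it as a sketch:

```python
# Rough scaling sketch: start from the 12-core 3900 PRO data point (65 W package,
# ~3.1 GHz all-core), reserve an assumed ~15 W for uncore/IO, split the rest per core,
# and assume per-core power scales roughly with f^3 (f * V^2, with V roughly tracking f).
PACKAGE_W = 65.0
UNCORE_W = 15.0              # assumption, not a measured figure
REF_CORES, REF_CLOCK_GHZ = 12, 3.1

per_core_ref = (PACKAGE_W - UNCORE_W) / REF_CORES    # ~4.2 W per core at 3.1 GHz

for cores in (12, 16):
    per_core = (PACKAGE_W - UNCORE_W) / cores
    clock = REF_CLOCK_GHZ * (per_core / per_core_ref) ** (1 / 3)
    print(f"{cores} cores @ 65 W -> ~{clock:.1f} GHz all-core")
```

With those assumptions a 16-core part lands around 2.8GHz, which is consistent with the guess above; real eco-mode results would of course depend on binning and on what the uncore actually draws.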
The saying "a fool and his money is soon parted" applies to anyone who buys a 9900K just to browse; it's such an epic waste of energy and resources. By the time a desktop demands 16 threads for a basic browser box, the processors of that era will make the 9900K look like a Pentium D. I can't fathom why anyone would pick a 9900K for a browser box.
I'm talking about the 9900, not the 9900K. And it's a common option in OEM PCs.
And for how long will that 9400 or 1600 last? How slow will it be in 2024?
It is a waste of money, but power, not so much. Granted, you don't need a 9900K for browsing, but in light use it will not use that much power either. If you are talking about the resources for manufacturing the CPU, you could say the same for any AMD CPU above 6 cores as well.
Anyone can buy whatever they want, it's free will, but I have found an i5 2500K at stock is still more than enough with an SSD for basic usage. My friend has been happy with that thing since I gave it to her in 2012. I had to almost beg her to accept the 7700K when my 3990X arrives. I guess with her gamer box the 1050 Ti is more of a bottleneck, but not for long when she gets my 1080 Ti once I get Ampere. The 2500K is going to replace an i3 2100 HTPC.
Your argument was that the i9-10900 managing 10c at 2.8GHz "wasn't that bad". It is that bad. The competition has 16c products able to run within the same power budget at the same clock speeds with better IPC, and 12c products that can run at significantly higher speeds with better IPC. Even look at the 8c parts. Intel's 65W 8c parts have a 2.9GHz base clock. AMD's 8c 65W part has a 3.6GHz base clock. Stock-for-stock, the 3700X will deliver significantly higher sustained throughput with the factory HSF.
I never said you CAN'T use those chips for general computing, but they are a waste. Most people don't buy 8-core i9 CPUs solely for web browsing. They'll get mainstream i5 and i7 parts.
Single-core performance is getting less useful every day. I was trying to give an example, but if someone wanted a PC that lasted longer, a 3700X (middle of the road) should last more than 4 years and stay responsive. An NVMe SSD with its PCIe 4.0 performance is WAY faster than a regular SSD. A middle-of-the-road CPU and a good NVMe drive will last a very long time. I can't foresee much past 5 years or so, but you don't need a 9900 to last 5 years for the tasks you mention.
Most people use their OEM desktop PC for as long as it's responsive (e.g. not irritating in daily tasks). This experience is driven by the single-core performance, not by number of cores.
So the goal is not to buy something that's barely sufficient today, but to buy something that has a chance of matching mainstream CPUs a few years later.
And spending $100-200 more on a CPU and replacing it every 5 years is cheaper than replacing every 3 years and buying the whole desktop (again: I'm talking about people buying from OEMs, not DIY upgrading). And even after 5 years you can still give that CPU to someone with smaller needs.
And of course the same is true for business PCs (where the vast majority of these non-K i7/i9 parts are headed). A desktop will be used for as long as it can be utilized in a company.
So today you buy that 8-core i7 for a developer, a financial analyst, a photo editor etc. They may need an upgrade 3 years from now, but that PC will still be perfectly fine for trainees, for someone in accounting or legal - for less CPU intensive tasks.
By that standard we could call the 2080 Ti a bad card because some of them had the space invaders minigame built in.
I have 5 of those, and not one has gone space invaders on me.
The failure rate on those cards is still a secret. Not a great comparison.
I agree with your general sentiment, but no one is going to benchmark these Intel CPUs at PL1. They will be limited to around that for OEM machines because of lackluster cooling. People will see these benchmarks and buy Intel-equipped PCs from Dell and so on thinking they'll be getting much better performance than is really possible. An R9 3900(X) would do much better in the same thermal situation but won't look as great in benchmarks.
You can always limit an Intel CPU to PL1 (TDP). The CPU will hover 1-2W under the TDP (just like server CPUs do).
Intel made some mistakes and they deserve criticism. They also lag in node density, which means the power consumption figures aren't great compared to TSMC/AMD.
But it's really weird that people attack the boost mechanism. It's a great idea and was such a huge improvement, basically making OC obsolete for most PC users.
Yes, it's hugely inefficient. It should be. Otherwise the "boosted" state would be the default one.
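On the "you can always limit an Intel CPU to PL1" point: one way to do this on Linux, without touching the BIOS, is through the intel_rapl powercap interface. The sysfs path below is the typical layout but is an assumption about the platform, so treat this as a minimal sketch (needs root):

```python
# Minimal sketch: cap the CPU package at PL1 = TDP via the Linux intel_rapl powercap driver.
# constraint_0 is the long-term (PL1) limit and its value is in microwatts. Run as root.
# The path layout is assumed; check /sys/class/powercap on the actual machine.
RAPL_PKG = "/sys/class/powercap/intel-rapl:0"   # package-0 RAPL domain

def set_pl1(watts: float) -> None:
    with open(f"{RAPL_PKG}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(int(watts * 1_000_000)))

def get_pl1() -> float:
    with open(f"{RAPL_PKG}/constraint_0_power_limit_uw") as f:
        return int(f.read()) / 1_000_000

set_pl1(65.0)                           # hold a 65 W part at its rated TDP
print(f"PL1 is now {get_pl1():.0f} W")  # sustained package power should then sit just under this
```

Whether anyone actually benchmarks that way is a separate question, as noted above.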
Disagree. Single core performance will always matter, even in servers. There are always serial tasks that cannot be decomposed to run in more than one thread and will bottleneck the system. I haven't seen a game yet that doesn't peg out one core at 100% (rendering task?).
I think I either said it wrong, or you misunderstood. Yes, single-core performance matters, and it's always getting better. But multi-core is becoming MORE useful at a faster rate, and both the need for it and the actual hardware are stepping up multi-core performance quicker than single-core.