Power virus? Please.
That's . . . great. I think?
Yawn.
so where have you been when the 1800X and 2700X eat more than their spec....

FP is not bound by RAM speed, at least not in the tests used by AT or Computerbase; the 9900K/KS do quite well in this area.
Now on integer it could be different, like in 7-Zip where the 3700X displays 30% better perf/clock than CFL at Computerbase.de.
The discrepancy between the 9900K and the KS at AT is due either to the security mitigations, if effectively implemented, or more simply to the 9900K having been tested at the time with 3200MHz RAM, as in their Ryzen 3000 review; this latter possibility shouldn't have affected the FP results, though.
That being said, it's up to Intel to validate their IMC at higher frequencies. PL1/PL2 tinkering is not technically overclocking since it's still within Intel's official specs and settings, and on this count the 9900KS uses 172W in Cinebench to clock at 5GHz; according to the reviewer, 159W is not enough to do so.
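To make the PL1/PL2 point concrete, here's a minimal sketch of how the sustained and short-term limits interact, assuming a simplified exponentially weighted power budget. The 127W/172W/28s figures and the averaging scheme are illustrative assumptions, not Intel's exact implementation:

```python
# Simplified model of PL1 (sustained) / PL2 (short-term) power limiting.
# All numbers are assumptions for demonstration, not measured 9900KS behavior.
def simulate_power_limits(demand_w, pl1=127.0, pl2=172.0, tau=28.0, dt=1.0):
    """Return the power granted each step: PL2 caps instantaneous draw,
    and once the running average exceeds PL1 the chip is clamped to PL1."""
    avg = 0.0                 # exponentially weighted average of granted power
    alpha = dt / tau
    granted = []
    for p in demand_w:
        p = min(p, pl2)       # PL2: hard instantaneous cap
        if avg > pl1:         # turbo budget spent: fall back to PL1
            p = min(p, pl1)
        avg = (1 - alpha) * avg + alpha * p
        granted.append(p)
    return granted

# A 60 s all-core load asking for 172 W: it bursts at PL2 for a while,
# then settles to the PL1 sustained limit.
trace = simulate_power_limits([172.0] * 60)
print(trace[0], trace[-1])
```

This is why "172W in Cinebench" and "127W TDP" can both be true: the short all-core burst runs at PL2, and the sustained figure depends on how the board vendor sets PL1 and the time window.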
And if you wonder about DC, go in there and read a few threads. People talk about having to turn their machines off when it's hot, and firing a bunch of them up when it's cold.
I don't blame you. But after I got cancer, I kind of didn't care about the cost anymore.

I did F@H for something like 15 years; I gave it up because electricity costs in New England suck.
It's a fluid discussion. Mid-range Zen2 parts available now are, like it or not, the 3900X since the EPYC chips are the top end. Once TR3 is released, 3900X will be low-middle. I moved the goalposts further away for my purposes, making it a tough game, by instead admitting that perhaps people thought I meant mid-range Ryzen chip. But even moving it down a little to the 3700X for the sake of fun, the 3700X is an incredible value proposition against the 9900K for multithreaded applications, rendering, encoding, etc. It doesn't match up as well against the KS, or the 9990XE.

I hope this is not another semantic mishap on your part. You've gone from a performance argument to a value one; and the 9900KS is curiously missing from your argument, all of a sudden.
I don't blame you. But after I got cancer, I kind of didn't care about the cost anymore.
I don't want to derail this thread any more; this will be my last reply.

First, I wish you well in remission. May you win the battle.
Second, this is peculiar to me, for two reasons.
1) according to the general public, the greatest threat to humanity is climate change. While cancer is awful, and should be eradicated, should it be at the expense of the planet? Genuine question, and that can be absolutely rhetorical.
2) How confident are we, that folding proteins holds the cure for cancer? I am skeptical. That also, can be rhetorical.
So, I'd be happy seeing that power used for video encoding benchmarks, Blender benchmarks, or Maxon Cinema 4D.
Anyway, these sorts of benchmarks only matter for content creators, and pro content makers aren't using X570 or Z390 boards and associated hardware.
...run power viruses like Prime. This has gotten freaking ridiculous.
so where have you been when the 1800X and 2700X eat more than their spec....
another post The Stilt left
Well I guess it depends on how high the reviewer is... the 9900KS might be a good bit faster than the 50%-more-cores 3900X, or it might be a good bit slower, on the same workload.

The Intel Core i9-9900KS Review: The 5 GHz Consumer Special (www.anandtech.com)
180W of power, and not faster than a 65W 3700X...
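That quip is really a perf-per-watt argument. A quick back-of-envelope sketch; the 5000-point score below is a made-up placeholder, not a benchmark result, so only the power ratio matters:

```python
# Hypothetical numbers purely for illustration -- not measured results.
def perf_per_watt(score, watts):
    """Points per watt for a multithreaded benchmark run."""
    return score / watts

# If two chips posted roughly the same score at 180 W vs 65 W package power,
# the efficiency gap would be about 180/65, i.e. close to 2.8x:
ratio = perf_per_watt(5000, 65) / perf_per_watt(5000, 180)
print(round(ratio, 2))
```

Whether that gap matters depends, as the thread keeps saying, on whether you run sustained all-core loads or just game.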
And if you never use any DC apps, you will also never notice the 3900X being any faster than even a Celeron, depending on what you do with your PC.

It is the fastest gaming CPU, but realistically you will probably not notice, even when comparing to CPUs half the cost.
Show us one professional that uses any overclocked CPU as a render/DC box; that's why.

You do realize that for over a decade, PC users on this forum (and elsewhere) have sworn by Prime95 as a stability tester and a harbinger of relative power consumption between platforms, right? Why is it now a "power virus"?
And you'll get 4.7GHz ACT at 127W. An equivalent 8-core AMD Zen 2 CPU with 105W can barely do 4.6GHz sustained on one core (3800X). That's on 14nm which, I believe, is two nodes removed from 7nm, technically speaking. That's rather impressive.

The 9900KS is not made for people who are concerned about power usage.
And if you are, you can still lock it to a 127W TDP and it will only consume 127W.
Gamers Nexus showed that from day one.
But then just go and get yourself a 9900 non-K, or a Ryzen, or anything else.
Well I guess it depends on how high the reviewer is... the 9900KS might be a good bit faster than the 50%-more-cores 3900X, or it might be a good bit slower, on the same workload.
Yes, it can go up to 180W; it can even go higher than that. Do you know why? Because it can hit 5GHz and even more.
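The reason the last few hundred MHz cost so much is that dynamic power scales roughly with C·V²·f, and higher clocks need higher voltage, so power grows much faster than frequency. A back-of-envelope sketch; every number below is an illustrative assumption, not a measured 9900KS figure:

```python
# Dynamic power scales roughly with C * V^2 * f. Scaling a made-up baseline
# operating point shows why 5 GHz draws disproportionate power.
def dynamic_power(freq_ghz, volts, base_freq=4.0, base_volts=1.00, base_watts=85.0):
    """Scale a known operating point (base_*) to a new frequency/voltage."""
    return base_watts * (freq_ghz / base_freq) * (volts / base_volts) ** 2

# 25% more clock, but ~69% more V^2, compounds to roughly double the power:
print(round(dynamic_power(5.0, 1.30), 1))
```

So "180W because it hits 5GHz" is exactly what the physics predicts, independent of which vendor's chip it is.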
And you'll get 4.7GHz ACT at 127W. An equivalent 8-core AMD Zen 2 CPU with 105W can barely do 4.6GHz sustained on one core (3800X). That's on 14nm which, I believe, is two nodes removed from 7nm, technically speaking. That's rather impressive.
Whaaaaaaaat
You have got to be kidding.
You do realize that for over a decade, PC users on this forum (and elsewhere) have sworn by Prime95 as a stability tester and a harbinger of relative power consumption between platforms, right? Why is it now a "power virus"?
Show us one professional that uses any overclocked CPU as a render/DC box; that's why.
The 9900KS is not made for people who are concerned about power usage.
I've used OCCT for many years, and I don't have a CPU with AVX (yes, it's old).
OCCT uses Linpack, if I'm not mistaken. That's just as heavy as Prime95 Small FFTs, if not more so.
I state this to you non-rhetorically: using power for computing has nothing to do with climate change.

First, I wish you well in remission. May you win the battle.
Second, this is peculiar to me, for two reasons.
1) according to the general public, the greatest threat to humanity is climate change. While cancer is awful, and should be eradicated, should it be at the expense of the planet? Genuine question, and that can be absolutely rhetorical.
2) How confident are we, that folding proteins holds the cure for cancer? I am skeptical. That also, can be rhetorical.
That's definitely not true for a workload that ummm... I don't know, maybe actually uses the processor?

And you'll get 4.7GHz ACT at 127W. An equivalent 8-core AMD Zen 2 CPU with 105W can barely do 4.6GHz sustained on one core (3800X). That's on 14nm which, I believe, is two nodes removed from 7nm, technically speaking. That's rather impressive.
That's definitely not true for a workload that ummm... I don't know, maybe actually uses the processor?
Holy crap, the old P4 argument again. Don't you guys ever get tired of digging up that red herring? Intel has at least equal IPC in gaming to AMD, plenty of cores, so clockspeed certainly does matter.

I don't get why some people obsess over clock speed. You could have a high-clocking P4 or BD, but who cares? The 3700X is very much competitive with the 9900K. And it uses far less power. A 3800X would beat it even more in most tests. About the only thing the 9900K has going for it is gaming. If all you are doing is gaming, you'd be better off with a 9700K. The 3700X/3800X handles most tasks better. As it should, it is much newer than Skylake. But that's all we have until we see Ice Lake on desktop.
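The reason the P4 comparison is a red herring: clock speed alone is meaningless, but throughput is roughly IPC × clock, so at equal IPC the higher clock simply wins. A quick sketch with made-up IPC values purely for illustration:

```python
# Single-thread throughput ~ IPC * clock. The IPC values here are
# illustrative placeholders, not measured figures for any real chip.
def relative_perf(ipc, clock_ghz):
    """Throughput in (instructions * 10^9) per second."""
    return ipc * clock_ghz

# At equal IPC, a 5.0 GHz part outruns a 4.6 GHz part by ~9%:
gap = relative_perf(1.0, 5.0) / relative_perf(1.0, 4.6)
print(round(gap, 2))
```

The P4 lost despite its clocks because its IPC was far lower; that's not the situation when both sides' IPC is roughly equal.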
So what? Clockspeed "is what it is" however it was obtained. Do you think a game has some code in it that says "Oh, that is 14nm, an old process, I am going to downclock it to match Ryzen"?

Quite obviously, you're missing an important fact: TSMC started volume production of 7nm in the second half of April 2018, whereas for Intel's 14nm that happened over 5 years ago.
They had plenty of time to tweak this process to the maximum.
It wins by a few %, like 3. But for the same money you can have 12 cores, not 8, and for some things that does make a noticeable difference. And there is no way it will do what you said using 127 watts.

So what? Clockspeed "is what it is" however it was obtained. Do you think a game has some code in it that says "Oh, that is 14nm, an old process, I am going to downclock it to match Ryzen"?
Holy crap, the old P4 argument again. Don't you guys ever get tired of digging up that red herring? Intel has at least equal IPC in gaming to AMD, plenty of cores, so clockspeed certainly does matter.
...So don't think I'm giving Intel a hard time because I'm biased.