MrSquished
Are you using the AVX offset? It seems like that should keep your temps out of the 90s with AVX loads.

I have it set to '3' as per a couple of guides I read.
How hot does yours run at 4.7?
Mine is running quite toasty: during intense gaming sessions, temps go into the 80s. I went back and re-seated the heatsink with a tried and true method, and still the same temps. I can't even run Prime95 without getting too hot for my comfort.
Damn, that sounds like a really bad TIM application from Intel then; I can't think of another reason, apart from inaccurate probes, why the temps would be so high, especially for gaming.

Just for context: the only way I can get such temps is by stress testing my 8700K at 5GHz or running AVX code (with a -2 offset). Gaming temps for me are much lower, around the 60s, and that's at 5GHz on air.
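For readers unfamiliar with the setting, the AVX offset just subtracts from the core multiplier whenever AVX code runs, so the chip clocks lower under those heavy loads. A minimal arithmetic sketch, assuming the usual 100MHz base clock (the clocks are the ones mentioned in the thread; the function name is just for illustration):

```python
# AVX offset sketch: under AVX loads, the CPU drops its multiplier by the
# configured offset. Assumes a 100 MHz base clock (BCLK), the Intel default.
BCLK_MHZ = 100

def avx_clock(multiplier, avx_offset):
    """Effective core clock (MHz) while AVX code is running."""
    return (multiplier - avx_offset) * BCLK_MHZ

print(avx_clock(47, 3))  # 4.7 GHz with offset 3 -> 4400 MHz under AVX
print(avx_clock(50, 2))  # 5.0 GHz with offset 2 -> 4800 MHz under AVX
```

This is why an offset keeps temps in check: the worst-case loads run a few hundred MHz below the all-core overclock.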
Honestly, upgrading your RAM to a higher speed will likely make a bigger difference to gaming than the i9 . . .
Diablo 3's higher-rift issues are mostly server-side; players have been complaining about it for ages.

The game runs fine even on lower-end systems as long as your network connection is good/excellent. I had no issue running D3 on a sub-3GHz Haswell chip, with high-rift delays/freezes being the only problem, but those were always experienced by all members of the party; the AoE effects slow server response down to a crawl. Even the hour of the day makes a difference sometimes.
Did you switch the TR modes between gaming/creation and UMA vs NUMA memory, or turn off SMT?

TR is not the most ideal pure gaming platform, but it has lots of room for tweaking, and the 1920X reviews did highlight which settings work best for gaming.
How did you spend $1000 on the CPU when its original retail was $799, and now it's $485?

I tried everything. I researched and then I tried again. I spent over a thousand dollars on the CPU alone; the custom loop was another five hundred. So I really didn't want to spend any more money. The gaming performance was terrible, but not in all games: it shredded Wolfenstein: The New Colossus maxed out, but that's not open world, and Vulkan doesn't seem very heavy on the hardware.

All I can tell you is what I have observed actually using a 1920X for around 12 months. It's not good for gaming, and I was squeezing everything I could out of it. One difference I have noticed is that the 8700K gives more consistent fps, whereas the 1920X used to have large fps drops and spikes in games like Far Cry 5, which caused jagging and other issues.
... plays bad on Ryzen. Wonder why that is.

I have been playing the game since it came out, and yes, there are server-side issues; however, the game does play badly on Ryzen. Overclocking all cores to 4.1GHz and disabling SMT fixes 90 percent of my issues in the game.
There are very few niche cases where an 8-core Ryzen will beat an 8700K by enough of a margin to justify sacrificing all of that single-threaded performance.

It's great that an 1800X might slightly edge out an 8700K in those niche tasks, but how many users really want that benefit at the expense of hobbling every other task they do? I was considering a Ryzen 1800X for my home workstation (vSphere/Hyper-V) and was almost willing to put up with all of Ryzen's issues, but then the 8700K came out.

AMD has had 8-core processors on the market for years, under $100 no less, and no one wanted them. Ryzen in its current form is just a repeat of this.

I wanted TR and was ready to buy one now that all of the stability issues have been worked out in ESXi and CAD, but Intel released SKX and CFL, and there is literally no reason to choose Ryzen anymore unless you are on some sort of budget.

Mark my words: if AMD releases a multicore CPU that's within a few percentage points of Intel, I would spend money on it in a heartbeat. But 15-20 percent less performance per core? No way.

The only AMD product that's interesting is EPYC. I purchase 20-30 ESXi hosts a year, and the thought of 32-core single-socket 2U servers with enough lanes for 6-8 PCIe cards makes me giddy.
So after openly stating that Ryzen makes no sense against Coffee Lake 6c/12t, you somehow chose to buy the 2700X anyway: the 8-core CPU that, according to you, nobody should want. Would you like to share the reasoning behind this major change in your purchase decision?
I never stated that the Ryzen 2700X made no sense over an 8700K. I was talking about the Ryzen 1800X, which was absolute trash and still is. I have had a 2700X for a couple of months, and while it handles games well, there are instances where the architecture performs poorly. I purchased a 2700X because my gaming rig is mITX, and I figured the 2700X, being 12nm, would run very cool in the case.
It won't do that on a mITX board, since all SFF Ryzen boards that I know of have modest VRMs and, more importantly, weak VRM cooling. I doubt they would come with aggressive stock settings. FWIW, you may be surprised to see that a 2700X actually draws more power than an 1800X...

AFAIK you can still adequately run 8c/16t SKUs on SFF mobos, but they'll probably stick close to TDP. The new B450/X470 mobos may be better revisions of the first models, but when I did my research on the first-gen models, all of them had some issue with VRMs: not in the sense that something was really wrong with them, but rather that VRM thermals would dictate mild overclocks at best. It's all highly dependent on loads and case thermals/airflow, of course, but if I were running an 8c/16t Ryzen on a mITX board, I would be more interested in VRM temps than CPU temps.

So is turbo boost essentially gimped on the SFF mobos? Is a 2700X forced to run at base clocks rather than max turbo? If so, wouldn't an 1800X also be limited in this regard?
Now that the 9th generation is around the corner, I'm looking to snap up an 8600/8700K cheap second-hand once people start to offer them. Since I have an SFF system with a power supply of only 150W, I'd like to bring the TDP down from 95W to the ~65-70W range for PSU stability, better thermals, and lower fan noise.

However, I'd like to keep the max single-core performance through turbo boost, so my plan is to disable one core, disable SMT, and lower the base clock. Now, with Turbo Boost 2.0 the turbo can go beyond the all-core base clock and even the TDP, so I'm afraid that reducing the base clock and disabling a core won't affect max TDP, as the other 5 cores will just clock higher until they reach their 5-core turbo speed.

Does anyone have experience underclocking/disabling cores and how that affects turbo speeds and TDP? The ASRock board I'm eyeing has a setting to cap the TDP, although it's unclear to me whether this is system- or CPU-specific, and whether it actually works.
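On Linux you can experiment with this kind of cap in software before committing to a BIOS setting, via the intel_rapl powercap driver. A minimal sketch, assuming a Linux system with that driver loaded; the sysfs path and domain index are assumptions to verify on your own machine, and a BIOS "power limit" option like the ASRock one does the same thing at firmware level:

```python
# Sketch: cap the package power limit (PL1) through Linux's RAPL powercap
# interface to emulate a lower TDP. Path/domain index are assumptions --
# intel-rapl:0 is usually the CPU package domain. Actually writing needs root.
RAPL = "/sys/class/powercap/intel-rapl/intel-rapl:0/constraint_0_power_limit_uw"

def watts_to_uw(watts):
    # The powercap interface expects microwatts.
    return int(watts * 1_000_000)

target = watts_to_uw(65)  # the ~65 W target from the post above
print(f"would write {target} to {RAPL}")

# To apply it for real (root required):
# with open(RAPL, "w") as f:
#     f.write(str(target))
```

Note that PL1 caps sustained power, which is the behavior the post is after: short turbo bursts above the cap are still allowed by PL2, but the 5-core turbo can no longer hold the package above the limit indefinitely.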
By the time the GTX 12xx cards (or whatever they're called) are out, I expect the gap between the 8700K and 2700X in gaming at 1440p/4K to grow larger, not smaller, in most cases. Exceptions could apply if there are really good advances in high-core-count optimizations, but all these years later it still seems that max IPC + clock on 4-8 cores beats more cores with lower IPC + clock. Of course, the final point here is that the gap between a stock 2700X and a stock 8700K is smaller than between a max-air-OC 2700X and 8700K; it just pushes more gaming advantage to the 8700K both short and long term.
decide, "well, gosh darn it, I am going to get a new card," find out their current game runs faster, but not as fast as it could if they had gotten another CPU, and feel sad.

Having spent time in various PC-gaming-related networks (Reddit, Discord, etc.), this actually happens way more often than I'd like to see.
What speed is your cache at, and what voltage/mobo are you on? Just curious.

Something the 9900K will have in relation to the 8700K that the 2700X doesn't have over the 2600X is a larger L3. The bump from 12MB to 16MB may help quite a lot. My 5960X quite often beats scores I see from people with 8700Ks, and it ain't the IPC or the quad-channel memory; it's the giant, fast 20MB L3 cache (overclocked like mad, of course).
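The reason a bigger L3 shows up in game benchmarks is that once the working set spills out of the cache, every dependent load pays main-memory latency instead. A rough, self-contained sketch of that effect using a random pointer chase, where each load depends on the previous one (sizes are illustrative; absolute timings depend entirely on your machine, and Python interpreter overhead blunts the effect compared to native code):

```python
# Pointer-chase sketch: compare a working set that fits comfortably in L3
# against one that spills far past a 16-20 MB cache. Each access depends on
# the previous one, so the CPU can't hide the miss latency with prefetching.
import random
import time

def chase(n_elems, steps=200_000):
    # Build a random single-cycle permutation: following arr from any index
    # visits all n_elems slots in random order.
    idx = list(range(n_elems))
    random.shuffle(idx)
    arr = [0] * n_elems
    for i in range(n_elems):
        arr[idx[i]] = idx[(i + 1) % n_elems]
    p = 0
    t0 = time.perf_counter()
    for _ in range(steps):
        p = arr[p]  # dependent load: next index comes from the last load
    return time.perf_counter() - t0

small = chase(1 << 12)  # working set well inside any modern L3
large = chase(1 << 21)  # working set far beyond a 16-20 MB L3
print(f"small working set: {small:.4f}s, large working set: {large:.4f}s")
```

On typical hardware the large working set takes noticeably longer per step, which is the same mechanism that lets a big, fast L3 (like the 5960X's 20MB) keep frame times consistent when game data doesn't fit in a smaller cache.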