And boy, pre-Polaris GCN looks bad with the features upped in the PCGH review. Fiji went down fast!
Do you have that backward? The 1060 3GB is the only card that shows a massive perf loss when upping quality. Fury gains a single fps.
Except GameGPU shows a 220W FX-9590 barely matching a 3-year-old 84W i7-4770K at stock, but I guess throwing in an expensive Intel chip plus a GPU-bound scenario and/or a scripted benchmark makes the comparison more interesting to some people.
OK so first off, could you show an example of CrossFire working in a modern title but SLI not working in that same title? Again, the point I was making was that there are a lot of games that simply can't use AFR. Period. You have twisted the original point.
If the GTX 1060 is overpriced, then what does that make the RX 480, which offers roughly the same perf/$ ratio?
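For anyone who wants to sanity-check that perf/$ point, the math is simple (the numbers below are hypothetical round figures, not taken from any review): perf/$ is just average fps divided by price, so 60 fps at $250 works out to 60 / 250 = 0.24 fps per dollar, while 58 fps at $240 gives 58 / 240 ≈ 0.24 fps per dollar. At gaps that small, the two ratios are effectively the same.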
No, NVIDIA would love to be able to sell people on more GPUs; that's just straight-up more money into their pockets. The reality is that games are increasingly using rendering techniques that make AFR very difficult, for example temporal effects that reuse data from the previous frame.
It's a good thing that people won't be "tempted" to buy dual GTX 1050 Tis or GTX 1060s when, on average, they could get a much better gaming experience (and in the case of SLI'd 1060s, for less money) by going with a GTX 1070.
When was the last time we saw 7 out of the top 10 slots in a review belong to a single video card maker? In this case, Nvidia.
Not bad performance. I expected it to be a gimp showcase.
Your wording is kinda strange; maybe it's a language barrier. Just to sum your post up: an AMD stock CPU that's almost 2 years older is equal to a younger Intel stock CPU, but is rated at a higher power consumption (probably because it's produced on a much older manufacturing process)?
That seems nice for all those who have cheap Vishera CPUs. It's almost as fast as Intel's latest and greatest.
Right, I forgot about that card. When you use two $1000+ cards from one maker and not the other? They could have thrown in a Fury Pro Duo.
Sorry, but the only one who didn't understand is you. A 2011 95W i7-2600K at stock comes very close to a 2013 220W FX-9590 - both are 32nm chips, and the Intel chip has huge overclocking headroom on top. So no, it doesn't look good for FX at all unless you manage to find a very light scene where any CPU would perform the same (hi Guru3D!).
Also, the 4-year-old 8350 is handily beating a newer and more expensive i5-6600.
This game uses multiple threads well at least.
Why do you keep using the 9590? It's nothing more than an OC'd 8350, and it also came out in June 2013. The 8350 is faster than the i5-6600.
Clearly not at 100%, so it wouldn't be using full power.
In this game, but it's slower in the majority of titles, and usually by larger margins than FX's advantage here. Not to mention an almost 6-year-old i7-2600K would destroy any FX when both are overclocked, so I'm sorry, but I don't see anything to brag about.
Wrong:
Back to the GPU discussion now, bye.
PS: Your Guru3D chart where they tested a very light scene (hence why there's zero CPU scaling) doesn't count.
No, driver updates are literally hacks to fix issues caused by the developers making calls incorrectly or not optimally.
We shouldn't need driver updates per game release.
Ultra shadows are WAY too sharp.
I highly doubt that's all driver updates are.
HFTS looks way better. The Computerbase review is up as well; they tested HFTS and PCSS too.
There's a 70% performance loss with HFTS over Ultra shadows on the 1060 (rough math below the link).
Ultra:
HFTS:
https://www.computerbase.de/2016-11/watch-dogs-2-benchmark/2/
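To put that 70% figure into concrete numbers (using a hypothetical baseline, not Computerbase's measured results): a card averaging 60 fps on Ultra shadows would drop to 60 × (1 − 0.70) = 18 fps with HFTS enabled.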
Funny you missed the FX-8350 at default being faster than the Core i5-6600 (14nm Skylake), which is 3 years younger and more expensive.
You mean the i5-6600, a 4-thread chip that was running 700MHz slower than the 8-thread FX-8350 and still pretty much tied it?