While the Anandtech numbers are being questioned, they are NOT out of line. They used what is currently the best-performing board for Ryzen, running the OS that extracts the most out of Ryzen, and applying all the patches that hurt non-Ryzen platforms the most... Their numbers match the trend you'd expect from that configuration, so it makes sense that they got the best Ryzen numbers.
So please, if you like a review because it showed your platform of choice performing the way you WANTED it to perform, consider whether its test setup truly reflects real life.
What is it about the AT review that's so good for Ryzen, or that shows the effect of the Meltdown/Spectre bugs? Yes, Ryzen takes a lead over the 8700K in a few of the gaming tests: winning slightly in Shadow of Mordor and solidly in GTA V (though Anandtech also showed the 8400 beating the 8700K by a similar margin in its launch review, which of course makes no sense), winning by a lot in Rise of the Tomb Raider (though the 8350K crushes them both there), and winning by a lot in Rocket League, which does look good for Ryzen. However, in Civilization VI higher CPU performance shows up not as framerate but as turn time, and by most accounts the 8700K is ahead there.
Starting with Civ VI, we used the AI benchmark to test the time required to compute AI turns, as FPS is useless here. The turn time is about the same at 1440p as it is at 1080p, though we did test both. AVG FPS actually goes up for worse CPUs, because the time spent sitting idle on the screen is longer, as it takes longer for the game to calculate a turn. This makes FPS an unusable metric for this particular AI benchmark.
https://www.gamersnexus.net/hwrevie...vs-ryzen-streaming-gaming-overclocking/page-5
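The quoted reasoning, that average FPS actually rises on a slower CPU in a turn-based AI benchmark, can be sketched with some made-up numbers (all figures below are hypothetical, purely to illustrate the arithmetic, not from any review):

```python
# Hypothetical illustration of why average FPS is a misleading metric for a
# turn-based AI benchmark: a slower CPU spends longer sitting on the idle
# end-of-turn screen, which renders at a high frame rate, so its *average*
# FPS comes out higher even though the experience is worse.

def avg_fps(turn_time_s, busy_fps=30.0, idle_fps=120.0, busy_s=2.0):
    """Average FPS over one AI turn.

    busy_s seconds of actual gameplay rendering at busy_fps, followed by
    turn_time_s seconds of waiting on the idle screen at idle_fps.
    All numbers are invented for illustration.
    """
    frames = busy_s * busy_fps + turn_time_s * idle_fps
    return frames / (busy_s + turn_time_s)

fast_cpu = avg_fps(turn_time_s=8.0)    # computes the AI turn in 8 s
slow_cpu = avg_fps(turn_time_s=16.0)   # takes twice as long

print(f"fast CPU: {fast_cpu:.1f} avg FPS")   # 102.0
print(f"slow CPU: {slow_cpu:.1f} avg FPS")   # 110.0
# The slower CPU reports the *higher* average FPS, which is exactly why
# turn time, not FPS, is the right metric for this benchmark.
```

The slower CPU "wins" on FPS while losing badly on the metric that matters, turn time.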
I'd consider both the GTA V and the Rise of the Tomb Raider results questionable, because in both cases slower Coffee Lake CPUs beat the 8700K, in RotTR's case by over 40%.
As far as most of the go-to CPU-heavy synthetics go (Cinebench, Geekbench, etc.), as well as every game AT tested, the 8700K's performance has not changed compared to launch. The Spectre/Meltdown patches have not affected gaming performance or performance in CPU synthetics. The Stilt also sees no performance degradation in Cinebench on his fully patched 8700K.
That's not to take any credit away from the 2700X; its performance, especially with tuned RAM, is really good (see the computerbase review), and it's a very solid refresh, way better than Haswell Refresh or Kaby Lake, regardless of what Linus says. It's more like Sandy->Ivy or better, and yeah, it's definitely setting up shop in the 8700K's territory. I just don't think the narrative about the 8700K losing a bunch of performance to the Spectre/Meltdown mitigations holds much water, at least for consumer workloads (gaming, synthetics, emulation, video encoding, etc.).