I think it's a little soon to be going full defensive mode.
As opposed to what, full on condescending with comments like this?
"I don't want to scare you away, perhaps I could intentionally handicap myself?"
I never said easily, I simply said faster. I believe the 780 is overall faster than the R290, possibly even the 290x at commonly achieved clocks.
Fair enough, you said faster (others said easily faster). You are entitled to that opinion, though my results posted here would tend to prove otherwise.
Is it faster by enough to matter? Probably not, in the same way the R290X isn't faster by enough to matter vs the R290.
On this we agree. The R9 290X is an overpriced waste of money compared to the R9 290, and the same applies to the 780Ti and, to a lesser extent, the GTX780 here in Europe. Even with the admittedly crap reference cooler I cannot fathom how the mainstream GTX780 is worth the extra £80. Classified GTX780, Lightning and 780GHz cards are all ~£130 more expensive.
The banter there was directly related to 3DMark11 and score submissions by AMD users. Upon further review it is clear the official 3DMark Hall of Fame doesn't accept user-modified settings, which is why the R290 series isn't doing well there - compared to places like hwbot, which fully accept settings modified via control panel options such as, but not limited to, tessellation.
Regardless of whether it was referring to 3DMark cheats or not, it can still be seen as an excuse to declare any AMD benchmark results suspicious.
I didn't choose TR. That was an R290 user who decided to actually post something. It's been over a month now and it's still rare to see any user results outside the few people with golden R290 chips that do 1200+.
I didn't mean to imply you chose that particular game, I meant you chose to respond to Gloomy's post with your own results to "prove" GTX780 was far superior.
Again with the goalpost moving, now I am accused of owning a golden sample R9 290X because it can run at 1200 core. Another attempt to justify claiming my results are invalid and don't count anyway?
I look at individual games when making such a comment, rather than overall performance differences. I don't really care about TW or GE.
Stock vs stock, I put the R290 about 6-7% ahead of the 780. So when a title such as Metro shows a 13% difference, I say that title favors AMD.
On the flip side when I look at say Crysis 3, BF3, Rome 2, Bioshock from the same review I say those titles slightly favor Nvidia.
It's not like we can run a single test and proclaim a winner. What I'm looking at is gains vs stock - adding each card's OC gain onto its stock result to figure out which card gains more from overclocking, and whether that keeps things roughly where they were at stock or tips the balance the other way.
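To put numbers on what I mean (the fps figures and OC gains below are made up purely to illustrate the arithmetic, not taken from any review):

```python
# Illustrative only - hypothetical fps figures and OC gains, not review data.

def pct_ahead(a, b):
    """Percentage by which result a leads result b."""
    return (a / b - 1) * 100

stock_780 = 60.0     # hypothetical stock GTX780 fps in some title
stock_290 = 64.0     # hypothetical stock R290 fps, ~6-7% ahead
oc_gain_780 = 0.15   # assume the 780 gains 15% from its overclock
oc_gain_290 = 0.10   # assume the R290 gains 10% from its overclock

oc_780 = stock_780 * (1 + oc_gain_780)
oc_290 = stock_290 * (1 + oc_gain_290)

print(f"Stock: R290 leads by {pct_ahead(stock_290, stock_780):.1f}%")  # ~6.7%
print(f"OC:    R290 leads by {pct_ahead(oc_290, oc_780):.1f}%")        # ~2.0%
# If the 780 gains more from overclocking, the stock lead shrinks or flips.
```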
How can you test for gains vs stock when you deliberately underclock your card well below stock in order to claim a 42% overclock?
Stock is what your card runs at out of the box. Stock clocks on a binned GTX780 GHz Edition or GTX780 Classified are going to be different from a reference GTX780 or a more mainstream custom-cooled GTX780, so comparing against them proves nothing as a general rule.
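Just to show why the baseline matters (the clocks below are hypothetical, not anyone's actual card):

```python
# Hypothetical clocks, only to illustrate how the quoted OC % depends on the baseline.

def oc_percent(baseline_mhz, achieved_mhz):
    return (achieved_mhz / baseline_mhz - 1) * 100

achieved = 1200          # hypothetical achieved core clock
reference_stock = 1050   # hypothetical out-of-the-box clock
underclocked = 845       # hypothetical deliberately lowered baseline

print(f"vs out-of-the-box clock:  {oc_percent(reference_stock, achieved):.0f}%")  # ~14%
print(f"vs underclocked baseline: {oc_percent(underclocked, achieved):.0f}%")     # ~42%
# Same card, same final clock - the headline "overclock" figure depends entirely
# on what you call stock.
```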
My point all along has been that the GTX780, 780GHz and R9 290/X are all going to be very close as a general rule once the silicon lottery is taken into consideration.