WhoBeDaPlaya
Diamond Member
This is a trend I've observed over the last 1 to 1.5 months in all the latest titles: the R9 290X beats the GTX 780 Ti consistently in recent releases, while the GTX 970 also beats the GTX 780 Ti. It's as if Nvidia forgot that they sold the GTX 780 Ti as the top GPU for a year. My guess is they are just trying to get GTX 780 / GTX 780 Ti owners to upgrade to the GTX 970 / GTX 980 using such questionable tactics.
In all likelihood it's a lot more mundane. I bet they just assign the best driver dev teams to the latest products and get to the second-latest products as soon as possible -- which may end up slipping. I'm sure they're working feverishly to get some of the early-generation midrange-Maxwell optimizations in place and have dedicated more of their driver workforce to it. I highly doubt there's any conspiracy.
If you are saying that we can't perceive more smoothness beyond 60 fps, then you should really find another subject that you may actually know something about. I can't believe we would even have such an ignorant debate at the end of 2014.
Were you dropped on your head as a baby? Were you deprived of the breast milk needed for the proper development of the brain? Sure seems like it.
Read my post again.
The human brain won't find it smoother over 60fps, as it literally finishes the images on its own. It doesn't need additional frames.
The human brain won't find it smoother... yet many humans do find it smoother?
So why is it that 120 FPS looks smoother to me than 60 FPS if my brain can't perceive it?
Why are people able to tell a difference?
It doesn't. You are just perceiving it that way for some reason. Thinking you do actually makes you believe you do.
Whether it's 60fps or 120fps or 400fps, the brain creates its own image, so anything over 60fps will just be thrown out by the brain and replaced with your own image.
The human brain won't find it smoother over 60fps, as it literally finishes the images on its own. It doesn't need additional frames.
The human brain won't find it smoother... yet many humans do find it smoother?
So why is it that 120 FPS looks smoother to me than 60 FPS if my brain can't perceive it?
Why are people able to tell a difference?
Input lag. 120 fps vs. 60 fps doesn't feel smoother but more "direct" in response to input. I would think if you were only watching 60 vs. 120 fps you wouldn't notice any difference at all. But if you are in control, it is a different matter altogether.
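To put some rough numbers on that (this is just frame-time arithmetic, not anything measured):

```python
# Frame-time arithmetic for common frame rates.
# The average input wait assumes your input lands at a random point
# within the current frame, so it averages half a frame time.

def frame_time_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

for fps in (60, 120):
    ft = frame_time_ms(fps)
    print(f"{fps} fps: {ft:.2f} ms per frame, ~{ft / 2:.2f} ms average input wait")

# 60 fps: 16.67 ms per frame, ~8.33 ms average input wait
# 120 fps: 8.33 ms per frame, ~4.17 ms average input wait
```

So at 120 fps your input shows up on screen roughly twice as fast on average, which is why it feels more "direct" even if you aren't consciously counting extra frames.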
Were you dropped on your head as a baby? Were you deprived of the breast milk needed for the proper development of the brain? Sure seems like it.
The ironic post of the day. And the rest of your comments in this thread show you truly have absolutely no clue what you are even talking about. But please, continue to make a fool of yourself.
Read my post again.
It doesn't. You are just perceiving it that way for some reason. Thinking you do actually makes you believe you do.
Whether it's 60fps or 120fps or 400fps, the brain creates its own image, so anything over 60fps will just be thrown out by the brain and replaced with your own image.
That's not true at all. Experienced players can easily tell the difference between 60Hz and 120Hz in a blind experiment.
See Linus's experiment on seeing above 60Hz.
You probably just have very little experience with 120Hz.
Feel =/= see.
Feeling is the brain; seeing is the eye (also kinda the brain, but whatever).
This is right and wrong at the same time. He can feel the difference; that is what they tested. It should be "Linus's experiment on feeling the difference above 60Hz."
Some can feel the difference between 60 and 120Hz, but like others said, humans can't see the difference.
Feel =/= see.
Feeling is the brain; seeing is the eye (also kinda the brain, but whatever).
Notice how he checks whether it's 60 or 120Hz: he makes quick, short sweeps. The sweep is probably so quick that the 60Hz display only starts showing the movement as the mouse is stopping, while on 120Hz the camera movement starts in the middle of the mouse movement. He's measuring the delay, and that's how he can tell the difference.
Put someone else behind the mouse and keyboard, and then have Linus guess what refresh rate it's running at.
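You can even quantify that sweep trick. A minimal sketch, assuming the display only shows new frames on its fixed refresh ticks and render time is zero (an idealization):

```python
import math

def onset_delay_ms(input_time_ms: float, refresh_hz: float) -> float:
    """Wait from an input event until the next refresh tick can show it."""
    period = 1000.0 / refresh_hz              # 16.67 ms at 60Hz, 8.33 ms at 120Hz
    next_tick = math.ceil(input_time_ms / period) * period
    return next_tick - input_time_ms

# A sweep starting 5 ms into the current refresh interval:
for hz in (60, 120):
    print(f"{hz}Hz: movement shows up ~{onset_delay_ms(5.0, hz):.2f} ms later")

# 60Hz: movement shows up ~11.67 ms later
# 120Hz: movement shows up ~3.33 ms later
```

On a quick sweep, that extra ~8 ms of onset delay is exactly the kind of lag a trained hand can pick up on, even if a passive viewer couldn't.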
Very disappointing performance from the NV cards. If they don't release a patch that improves GK110 performance by at least 15-20% in those games, I'll be very hesitant to buy an NV flagship card. I don't see a technical reason why Kepler is comparatively getting so much worse than Maxwell, other than an unwillingness to optimize drivers to save cost, and even Maxwell is very mediocre in those games (FC4, DA3). They want to save cost by letting the 7970, which cost less than half as much, catch up to a Titan? Fine by me, just count me out as a customer.
That has been my point for a while. Why pay so much more for flagship cards when they never prove to be more futureproof? It goes both ways for AMD and NV, but it's much worse for NV ($1000 Titan / $650 780 --> 1.5 years later, the 290 with similar performance was $350; $700 780Ti --> 11 months later, the 970 for $330). It's better to buy second from the top, overclock, and upgrade more often instead of spending $200-400 more for 15% more performance that amounts to nothing, because sooner or later games get so demanding that the flagship card's performance advantage shrinks to 1-2 fps (Titan/780Ti vs. 290/290X, for example).
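To put the flagship premium in perspective (using the rough figures above -- roughly $350 extra for ~15% more performance -- not benchmarks):

```python
# Extra dollars paid per 1% of extra performance when buying the flagship
# instead of the card one tier down. Figures are the rough numbers quoted
# in the post above, not measured results.

def premium_per_percent(flagship_usd: int, runner_up_usd: int, gain_pct: float) -> float:
    """Dollars spent per percentage point of performance gained."""
    return (flagship_usd - runner_up_usd) / gain_pct

print(premium_per_percent(1000, 650, 15.0))  # Titan vs. 780: ~$23 per 1%
```

Paying ~$23 for every single percent of extra performance, on a lead that evaporates within a year, is hard to justify.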
Remember the old days, when if you got a next-gen GeForce Ti 4600 there was no way a Radeon 8500 128MB would beat it, and it would have been almost unheard of for a GTX 480 to be slower than or only as fast as an HD4890/5830? But how many games have you seen where the Titan is barely faster than a 280X/7970Ghz? A lot, actually.
Shockingly, Kepler's 2GB VRAM and SLI weaknesses have now cropped up in the last DLC for BF4, Final Stand.
1. 980 SLI is mopping the floor with 780Ti SLI, which can only be explained by NV not caring about Kepler anymore, because nothing about BF4 changed in the last 2 years. Yet 980 SLI is up to 27% faster than 780Ti SLI in BF4 Final Stand.
2. See that ARES II card? That's actually just 2x 7970Ghz (1.05Ghz clock) and 6600mhz memory. Now look at the 690 vs. the ARES II, with the latter leading the 690 by 36% and even outperforming reference 780 SLI:
Even the stock 7990 is still faster than a 690. Anyone can tell you that a 1.05Ghz core and 6600mhz (1650mhz) memory is a ridiculously low overclock for a 7970, and yet at just those speeds it's wiping the floor with the 690.
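For anyone confused by the memory numbers: 6600mhz is the effective GDDR5 data rate at a 1650mhz memory clock, since GDDR5 moves four transfers per clock. A quick back-of-the-envelope sketch, using Tahiti's stock 384-bit bus:

```python
def gddr5_bandwidth_gb_s(memory_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth in GB/s: 4 transfers per clock times bus width in bytes."""
    effective_mt_s = memory_clock_mhz * 4      # GDDR5 is quad-pumped
    bytes_per_transfer = bus_width_bits / 8    # 384-bit bus -> 48 bytes
    return effective_mt_s * 1e6 * bytes_per_transfer / 1e9

# ARES II / 7970Ghz memory at 1650MHz on a 384-bit bus:
print(gddr5_bandwidth_gb_s(1650, 384))  # ~316.8 GB/s
```

That's a lot of bandwidth to throw at a 690, whose two GK104 GPUs each sit on a 256-bit bus.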
The last 6 months have proven that Tahiti was a far better chip/graphics card than GK104, but of course 680 2GB users have apparently upgraded, so they'll never own up to the fact that the 680 was a rip-off (it cost more, and its performance fell off a cliff in modern games). That's why I tend to wait until price wars erupt and refreshes happen to pick up "last gen" fast cards for 50% off when no one wants them anymore (e.g., the 480 at $175-200 while the 580 was selling for $450). I got my 7970s as an exception due to mining, so I didn't really care what they cost. I picked up 470s at $210 each barely 6 months after they launched at $349.
However, not all is great in the AMD camp either: CF doesn't work in some new games, there is stuttering in FC4, and in some games the 290X scales poorly vs. the 280X. Overall, it's just a sad state of PC gaming where we keep getting faster and faster cards, but games keep destroying them without even approaching Crysis 3 / Metro LL levels of graphics. If this continues, 4K is going to be unreachable for a while without spending a lot on GPUs.
We've known that for years now: Tahiti > GK104. nVidia marketing, though, managed to keep that a secret from most people all that time. Go figure.
Check any review that uses real games and warmed-up cards, and Hawaii has been as fast as or faster than the 780Ti from day one. Maybe nVidia has shifted their marketing dollars to Maxwell now, and the sites are free to give us real numbers without fear of reprisal?
I argued that Tahiti was more future proof than GK104 when it was released, for three reasons: stronger compute (if games use it for rendering, i.e., global illumination, particles, or physics, as in COH2), extra VRAM, and GCN being in the consoles. In most of the recent AAA titles, the 7970Ghz spanks the 680/770.
But I disagree with the claim that the R9 290X was ~= the 780Ti at launch; that never occurred outside highly cherry-picked reviews. Overall it was ~10% slower across many reviews, and slower still compared to custom 780Ti cards that boost to 1.25Ghz out of the box.
Some people think Kepler is tapped out, i.e., there are no more performance gains to be had via drivers, while Maxwell is just starting to be optimized. Fair enough, comparing NV architectures to each other. But that claim falls flat on its face compared to AMD hardware, unless the obvious conclusion is that GCN is just more future proof than Kepler. Which is probably true, too.
1250 boost out of the box? Most 780Ti cards can't even OC past 1250, so what 780Ti boosted to 1250 "out of the box"?