RussianSensation
Elite Member
AFAIK there isn't a cable standard out yet with the bandwidth necessary to support 4K @ 120+ Hz. So the issue is that no one can make such a monitor because the cable standards don't support it yet. It's going to be a really long while before we see it, at least a year or more, and that's assuming a 120Hz mode for 4K resolutions actually gets added to the DP 1.3 standard.
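Rough back-of-envelope math, as a minimal Python sketch (my own approximate figures, assuming standard 8-bit-per-channel RGB and ignoring blanking overhead): uncompressed 4K at 120Hz needs roughly 24 Gbit/s for the active pixels alone, which is more than DP 1.2's ~17.3 Gbit/s effective rate and only barely within DP 1.3's ~25.9 Gbit/s once blanking is added on top.

```python
# Back-of-envelope check: can a single DisplayPort cable carry 4K @ 120 Hz?
# Link rates below are the commonly quoted effective figures (4 lanes, after
# 8b/10b encoding); blanking overhead is ignored, so the real requirement is
# a bit higher than this estimate.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed bandwidth for the active pixels only, in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP_1_2_EFFECTIVE_GBPS = 17.28   # HBR2, 4 lanes
DP_1_3_EFFECTIVE_GBPS = 25.92   # HBR3, 4 lanes

needed = video_bandwidth_gbps(3840, 2160, 120)
print(f"4K @ 120 Hz needs roughly {needed:.1f} Gbit/s before blanking overhead")
print(f"DP 1.2: {DP_1_2_EFFECTIVE_GBPS} Gbit/s -> "
      f"{'enough' if needed <= DP_1_2_EFFECTIVE_GBPS else 'not enough'}")
print(f"DP 1.3: {DP_1_3_EFFECTIVE_GBPS} Gbit/s -> "
      f"{'enough, but very tight once blanking is included' if needed <= DP_1_3_EFFECTIVE_GBPS else 'not enough'}")
```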
Let's face it, 120Hz at 4K is not even a realistic gaming target, because there is no PC hardware that can drive such high fps in next gen games. Even with today's games like Watch Dogs, Metro, Crysis, Witcher, you probably can't hit that with 4 Titan Blacks. Once next generation games come out, they will be even more demanding; if that weren't true, we wouldn't need to keep upgrading GPUs. Honestly, even 144Hz @ 1440p sounds like a marketing gimmick for modern titles. Sure, if someone is playing some ancient UT2004 or COD game, or BF3, they will hit that, but good luck getting 140 fps in anything modern without turning down graphics settings significantly. And if I have to turn down graphical settings and AA, why am I spending $1000+ on GPUs? I guess only if playing competitively and kill/death ratio is the only thing that matters.
Then there are 2 more issues: CPU speed isn't sufficient to get 140 fps in games like BF4 or Arma 3 on max settings incl. max draw distance, and plenty of genres, from strategy to 3rd person action/adventure to sports titles, hardly benefit from anything above 60 fps since they are slower-paced games. For me, 120-144Hz monitors only make sense at 1080p, where you have at least some realistic shot at hitting those fps, or for a gamer who doesn't care about max graphics and is strictly playing competitively. That's also why GSync is most valuable at 60Hz and below, when frames drop under 60, because that's where most gamers will end up given their hardware budgets. Unless a gamer wants to keep buying 3-4 Titans every generation, I don't see the point of 120Hz 4K, or even 144Hz 1440p.
Finally, we haven't had any good online multiplayer shooters on the PC in a long time. BF4 was buggy as hell, and Hardline is delayed and looking meh. The most exciting FPS in years is on consoles: Destiny. And who needs 120Hz in games like Witcher 3 or Tomb Raider? If we were still in the era of Quake Arena or UT2004, where accuracy and reflexes are paramount, I could understand the value of 120-144Hz gaming. The focus now is on a cinematic, realistic gaming experience, and that already demands an insane amount of GPU horsepower before you even think about 120Hz at 4K.
I don't doubt for a second that 95% of next gen PC games will look miles crisper on a 4K IPS 60Hz display than on a 1440p TN 144Hz display with color shifts and inferior black levels. Sooner or later, 4K will become affordable at 32-42 inch sizes. Once you experience the immersion of PC gaming on a larger 30-37" monitor, it is very hard to go back to smaller sizes. I think hands down 4K is the future, not 1440p @ 120-144Hz, which appeals to a small niche of PC gamers. 4K will become the 1080p of PC gaming since it will drop in price drastically over the next 3-5 years as it slowly becomes the new industry standard, while 1440/1600p hasn't taken off in 10 years. As soon as 28 inch 4K IPS monitors show up for $600-700, 1440p/1600p gaming is on death row unless 1440/1600p monitors drop to $350-450.