They may be interested in the technology, but not interested in the implementation right now. Hopefully the technology expands into better quality monitors.
I just knew, from the moment they claimed a hardware component in the GPU was required for this to work, that they had zero intention of letting anyone with anything but a supported NVIDIA GPU use it.
Your cult could make more money from this in the long run simply by putting it forward as the industry standard and getting paid a royalty for every monitor sold. More money than they're going to make between now and the time an open alternative arises.
You mean as in not TN? Petersen said the monitor type doesn't matter as long as it uses LVDS. So I take that to mean any panel type (TN, IPS, or other) using the LVDS standard. Between ASUS, ViewSonic, BenQ, and the others he couldn't name yet, I think the bases are pretty well covered.
They'd be lying. There isn't ANY reason NOT to like it. There isn't anything negative about it. It virtually eliminates the stuttering, tearing, and lag issues that come with using V-Sync. There isn't anything not to like, other than not having a capable NVIDIA card. At the moment. Them being IQ junkies is even more of a reason they would love this.
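To put a number on the stutter being eliminated here, a minimal sketch of the standard fixed-refresh quantization arithmetic (frame times are made up for illustration; this is not anything NVIDIA has published):

```c
/* Why fixed-refresh V-Sync stutters: a frame that misses the refresh
 * boundary is held for an entire extra interval. Illustrative numbers
 * only; not NVIDIA's implementation. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double vsync_interval = 1000.0 / 60.0;  /* 16.67 ms at 60 Hz */
    const double render_ms[] = { 14.0, 17.0, 15.0, 18.0, 16.0 };
    const int n = sizeof render_ms / sizeof render_ms[0];

    for (int i = 0; i < n; i++) {
        /* V-Sync: round the render time up to the next refresh boundary. */
        double vsync_shown = ceil(render_ms[i] / vsync_interval) * vsync_interval;
        /* Variable refresh: the monitor scans out when the frame is ready. */
        double vrr_shown = render_ms[i];
        printf("render %4.1f ms -> V-Sync shows %4.1f ms, variable refresh shows %4.1f ms\n",
               render_ms[i], vsync_shown, vrr_shown);
    }
    return 0;
}
```

A frame that takes 17 ms instead of 16 ms doubles its on-screen time to 33.3 ms under V-Sync; with variable refresh the display simply waits the extra millisecond.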
Actually, it doesn't work correctly with some games, so it's not 100% perfect. But nothing ever is.
Which games would those be? Why would software have any influence on this at all?
Watch the PCPer video with Tom Petersen for an explanation, though he doesn't say which games. They'll be locked out of the profiles, so you won't have to worry about finding out the hard way.
So 2560x1600/1440 IPS should be possible?
Well then...
Well, that's very arrogant of you. Since you're an NV Focus member, tell me, and I will pass it on to them, that this tech will work with their Pioneer Kuro Elites and Panasonic VT60s.
If you're going to tell me this is limited to desktop monitors, I already know their response: "I don't want to be hunched over a desk playing games."
But sure, they must be lying. Gotcha.
We do not yet know what else these G-Sync boards are capable of. Did you see Tom Petersen absolutely NOT answer Scott Wasson's question about what the 768MB of memory (at least what we can see on one side) is for? I wonder what else they have in store.
I think I might have a niggling suspicion why 768MB might be used: complicated color processing algorithms to keep colors stable through a varying refresh rate.
-- 6-bit FRC during variable refresh rates
-- Color stability during variable refresh rates (60Hz vs 120Hz vs 144Hz requires different calibration)
-- LCD inversion during variable refresh rates or strobed mode (e.g. www.testufo.com/inversion )
-- Less 3D crosstalk and less strobe-backlight double-ghost effect
-- Overdrive algorithms compatible with variable refresh rates
So the 768MB could be full of LUTs and processing memory, to keep the pixel color values stable during G-SYNC or during strobe mode. There might be other reasons, but who knows?
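If the overdrive theory in that list is right, the board would have to swap correction tables on the fly as the refresh interval changes. A minimal sketch of that idea; the bin frequencies, table layout, and names here are pure guesses, since NVIDIA hasn't said what the 768MB actually holds:

```c
/* Sketch: a scaler picking an overdrive LUT based on the current
 * (variable) refresh interval. Everything here is hypothetical. */
#include <math.h>
#include <stdio.h>
#include <stdint.h>

#define LEVELS 256  /* simplified: a real overdrive table is indexed by
                       (previous level, target level), i.e. 256x256 per bin */

/* Pretend calibration bins at 60, 100, and 144 Hz. */
typedef struct {
    double hz;
    uint8_t overdrive[LEVELS];  /* drive values calibrated for this rate */
} od_lut_t;

static od_lut_t luts[3] = { { 60.0 }, { 100.0 }, { 144.0 } };

/* Choose the calibration bin closest to the instantaneous refresh rate. */
static const od_lut_t *pick_lut(double frame_interval_ms) {
    double hz = 1000.0 / frame_interval_ms;
    const od_lut_t *best = &luts[0];
    for (int i = 1; i < 3; i++)
        if (fabs(luts[i].hz - hz) < fabs(best->hz - hz))
            best = &luts[i];
    return best;
}

int main(void) {
    printf("frame arrived after 9.2 ms -> using %g Hz overdrive table\n",
           pick_lut(9.2)->hz);   /* ~109 Hz -> 100 Hz bin */
    printf("frame arrived after 25 ms  -> using %g Hz overdrive table\n",
           pick_lut(25.0)->hz);  /* 40 Hz -> 60 Hz bin */
    return 0;
}
```

A real controller would presumably interpolate between calibration bins rather than snapping to the nearest one, and it's those full per-bin tables that could plausibly eat a lot of memory.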
Not all ghosting is crosstalk when dealing with 3D Vision. Much of the ghosting you see is light bleeding through the darkened lenses. They don't black them out completely, and light gets through when there are bright spots on the screen. There is still room to improve the glasses; I think that is the biggest problem at the moment in regards to ghosting.
I find it most interesting that it can give you 30FPS gaming without issues.
lol, most of that is way over my head.
Some of us are using higher-end monitors with more motion clarity than yours.
Was I the only one who wasn't dying for a replacement for, or improved version of, V-Sync? If my games are running at 60FPS and I have V-Sync on, I'm usually pretty damn happy. Needing to buy a new monitor, or have a "professional" modder install a chip into your existing monitor, doesn't seem like the most practical way to sell a new feature either. Don't get me wrong: from what I've read, it looks great in person. But I guess, as Anand wrote, it does seem like a complicated solution to the "problem".
Complicated for NVIDIA and the monitor makers, not for us. So why should we care how complicated it is? It does what nothing else has been able to do. Something that has always been an annoyance, a problem, is about to go away.
Tell you what: give me complicated.
I don't think you understood what I was saying. I have a 120Hz monitor. Even if I am not running at 120FPS, capping at 60FPS still keeps things smooth.
I am not talking about frame interpolation, but rather, instead of using a low refresh rate, having the monitor sync at a higher multiple of the frame rate. If I were running at 25FPS, a 25Hz refresh rate could cause eye strain, but a 50Hz refresh rate wouldn't; it just wouldn't update the screen with a new frame on every refresh.
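That repeated-refresh idea is easy to make concrete. A minimal sketch, assuming a hypothetical 50Hz comfort floor just to match the 25FPS/50Hz example above (real panels have their own minimum refresh limits):

```c
/* Refresh at the smallest integer multiple of the frame rate that
 * clears a flicker floor, repeating each frame as needed.
 * The 50 Hz floor is an assumption for illustration. */
#include <stdio.h>

int main(void) {
    const double min_comfort_hz = 50.0;  /* hypothetical flicker threshold */
    const double fps[] = { 25.0, 30.0, 45.0, 60.0 };
    const int n = sizeof fps / sizeof fps[0];

    for (int i = 0; i < n; i++) {
        int repeats = 1;
        while (fps[i] * repeats < min_comfort_hz)
            repeats++;   /* show the same frame one more time */
        printf("%5.1f FPS -> refresh at %5.1f Hz (each frame shown %d times)\n",
               fps[i], fps[i] * repeats, repeats);
    }
    return 0;
}
```

Nothing is interpolated: at 25FPS the panel simply scans out each rendered frame twice, so the refresh rate stays comfortable while new frames still arrive exactly when the game produces them.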