[PCPER] NVidia G-sync


Gryz

Golden Member
Aug 28, 2010
1,551
204
106
I understand that g-sync is better than triple-buffering.

But triple-buffering is better than regular v-sync or no-v-sync (which both have only 2 framebuffers).

Comparing the difference between g-sync and triple-buffering would have made more sense than comparing g-sync with the two old rendering techniques that use only 2 framebuffers.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
I understand that g-sync is better than triple-buffering.

But triple-buffering is better than regular v-sync or no-v-sync (which both have only 2 framebuffers).

Comparing the difference between g-sync and triple-buffering would have made more sense than comparing g-sync with a 2-framebuffer-rendering-technique.
I thought they were using triple buffering
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I understand that g-sync is better than triple-buffering.

But triple-buffering is better than regular v-sync or no-v-sync (which both have only 2 framebuffers).

Comparing the difference between g-sync and triple-buffering would have made more sense than comparing g-sync with a 2-framebuffer-rendering-technique.

You do realize that triple buffering is only useful while v-sync is active, and it still does not stop juddering at FPS below 60. It does, however, prevent a lot of lost FPS when v-sync is on.

If both were getting around the same FPS, then it is fairly safe to assume triple buffering was used in the demo.

If you are looking for a way to force it on in DX, I don't know about that. That option has only ever existed for OpenGL; there may be a limitation with DX.
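
To make the difference concrete, here's a rough simulation of delivered FPS on a 60 Hz panel (my own sketch with a made-up 20 ms render time, not numbers from the demo): double-buffered v-sync stalls the GPU until the next vblank, so a GPU that just misses 60 FPS collapses to 30, while a third buffer lets the GPU keep rendering. Frames are still displayed on the fixed 60 Hz grid, though, which is why the judder below 60 FPS remains.

```python
# Rough sketch: frame delivery on a 60 Hz panel when the GPU needs
# 20 ms per frame (50 FPS raw). Numbers are illustrative only.
REFRESH = 1.0 / 60   # fixed scanout interval of a 60 Hz monitor
RENDER = 1.0 / 50    # assumed GPU render time per frame

def vsync_fps(render_time, buffers, frames=1000):
    """Simulate v-sync with 2 (double) or 3 (triple) buffers."""
    t = 0.0                  # GPU clock
    next_vblank = REFRESH
    for _ in range(frames):
        t += render_time     # finish rendering one frame
        if buffers == 2:
            # Double buffering: the GPU must wait for the next vblank
            # to free the back buffer, so a 20 ms frame costs 33 ms.
            while next_vblank < t:
                next_vblank += REFRESH
            t = next_vblank
        # With a third buffer the GPU never stalls; the newest finished
        # frame flips at each vblank (some frames repeat, some are
        # skipped -- that uneven cadence is the judder below 60 FPS).
    return frames / t

print(f"double buffering + v-sync: {vsync_fps(RENDER, 2):.0f} FPS")  # ~30
print(f"triple buffering + v-sync: {vsync_fps(RENDER, 3):.0f} FPS")  # ~50
```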
 

nOOky

Diamond Member
Aug 17, 2004
3,205
2,251
136
I am so sick of a handful of people saying that.

No doubt. Straight from nvidia's website under the GeForce Titan description:

GeForce GTX TITAN
"With the DNA of the world’s fastest supercomputer and the soul of NVIDIA® Kepler™ architecture, GeForce® GTX TITAN GPU is a revolution in PC gaming performance. GeForce GTX TITAN graphics card combines extraordinary power, advanced control features, and game-changing thermal and acoustic capability to provide an entirely new class of super-performance graphics card."

Anyway, I am interested in this technology and can't wait to see official reviews running compatible hardware.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I didn't realize that 65% of the discrete market was utilizing dual Nvidia cards in their PCs. Are you sure about your facts?

I'm sorry you didn't realize something that NOBODY insinuated.

You don't need dual Nvidia cards. Same card can do both (if powerful enough).

EDIT: "I learned something new today. "

Saw this after. I didn't realize you didn't know.
 
Last edited:

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
That's my point. If G-sync needs Kepler to work, then fine. If it doesn't and it's simply being artificially locked out of other hardware, not fine. It seems like nVidia can only compete if they stack the deck against the competition, and when they do gain a competitive advantage, they use it to bend their supporters over as far as possible. Why you would be in favor of that defies logic.

While I can find it frustrating that such tactics can seem anti-competitive at first glance, I then think to myself: where is AMD in pushing innovations like LightBoost or Gsync? Where are their alternatives for me to put my money towards? So far it's been pretty bleak, at least until we see something with Mantle, which is pretty much just as bad as anything nVidia has ever done in terms of trying to gain an "unfair competitive advantage"...

If Gsync is as awesome as it sounds, then we're pretty much guaranteed to see an alternative to compete with it, as there isn't much stopping someone else from supplying an alternative technology; nVidia just happens to be paving the way.

Until that time I'll happily give nVidia my money if the product delivers, because it's certainly something I want and it's definitely something no one else has yet provided a solution for.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
While I can find it frustrating that such tactics can seem anti-competitive at first glance, I then think to myself: where is AMD in pushing innovations like LightBoost or Gsync? Where are their alternatives for me to put my money towards? So far it's been pretty bleak, at least until we see something with Mantle, which is pretty much just as bad as anything nVidia has ever done in terms of trying to gain an "unfair competitive advantage"...

If Gsync is as awesome as it sounds, then we're pretty much guaranteed to see an alternative to compete with it, as there isn't much stopping someone else from supplying an alternative technology; nVidia just happens to be paving the way.

Until that time I'll happily give nVidia my money if the product delivers, because it's certainly something I want and it's definitely something no one else has yet provided a solution for.

And when you do give them the money and buy the new monitor (or modify your old one), keep in mind that you won't be able to change to AMD without changing your monitor. Even if you wanted an AMD card, you'd have to figure in the additional cost of changing your monitor. That will almost always lock you into staying with nVidia, stifling competition. IF it's unnecessary, I think it stinks and is purely anti-competitive.

What's worse is that people won't care and will go with the flow. Imagine if your car maker made it so that if you installed a different brand of oil filter or battery, your car wouldn't run, and they did it for no other reason but to force you to buy their proprietary products.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
keep in mind that you won't be able to change to AMD without changing your monitor.
Nowhere has it been suggested that monitors with g-sync cannot interoperate with AMD cards (or Intel iGPUs) the old-fashioned way anymore.

You can switch from nVidia to AMD without having to change your monitor. You'll just lose the benefit of g-sync, that's all.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Nowhere has it been suggested that monitors with g-sync cannot interoperate with AMD cards (or Intel iGPUs) the old-fashioned way anymore.

You can switch from nVidia to AMD without having to change your monitor. You'll just lose the benefit of g-sync, that's all.

So you would upgrade to get GSync and then buy a card that didn't support it? I don't think so.

Besides, you are avoiding the whole point. Vendor lockout for no reason.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Nowhere has it been suggested that monitors with g-sync cannot interoperate with AMD cards (or Intel iGPUs) the old-fashioned way anymore.

You can switch from nVidia to AMD without having to change your monitor. You'll just lose the benefit of g-sync, that's all.

It has, actually. The module that replaces the one in the monitor is DisplayPort-only, to keep licensing costs down.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So you would upgrade to get GSync and then buy a card that didn't support it? I don't think so.

Besides, you are avoiding the whole point. Vendor lockout for no reason.

We do not know of any such lockout currently. You are only speculating. However, this has been posted here already, which suggests there is no such lockout, so can we drop it until we know?

http://mygaming.co.za/news/hardware...ch-will-boost-gaming-monitor-performance.html

A boon to the adoption of GSync is that the translation is all done at the driver level. GSync is merely an extension of the Displayport protocol and does not need approval from a standards body, nor is the method patented, leaving AMD and Intel to work on their own solutions or help the industry to shift to this new display method.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Has anyone heard anything solid in terms of a RELEASE date? I haven't been this excited about monitor developments since overclockable 1440p IPS was discovered
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Has anyone heard anything solid in terms of a RELEASE date? I haven't been this excited about monitor developments since overclockable 1440p IPS was discovered

We are expecting upgrade modules for the Asus monitors before the end of the year, and products from the monitor vendors in Q1 2014. So it's only a few months away.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
He also strongly suggested the gsync module does a lot more than just supporting gsync. With 768MB onboard, it certainly doesn't need all that RAM purely for controlling the LCD, but he wasn't willing to say more about what it could do.
It's probably used for various reasons:
-- 3D LUTs for response-time acceleration, perhaps with 2 or 3 frame histories
-- Advanced variable-refresh-rate LCD overdrive algorithms
-- Maintaining correct colors/gamma through variable refresh rates (remember: 60Hz vs 120Hz vs 144Hz looks different and requires different calibration)
-- Processing memory for the G-SYNC FPGA
Etc.

I'm just speculating, though, based on my knowledge of LCD panel behavior.
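
For anyone curious, here's a toy illustration of what LUT-based response-time acceleration (overdrive) looks like; the grid size, overshoot factor, and layout below are pure guesses on my part, not anything NVIDIA has disclosed about the module:

```python
import numpy as np

LEVELS = 17  # assumed 17x17 grid of (previous, target) gray levels

# overdrive_lut[prev, target] -> value actually driven to the panel.
# Real tables are measured per panel; this fake one simply overshoots
# in the direction of the transition to speed up the liquid crystal.
grid = np.linspace(0, 255, LEVELS)
prev_g, target_g = np.meshgrid(grid, grid, indexing="ij")
overdrive_lut = np.clip(target_g + 0.35 * (target_g - prev_g), 0, 255)

def overdrive(prev_frame: np.ndarray, new_frame: np.ndarray) -> np.ndarray:
    """Drive each pixel past its target based on the previous frame."""
    pi = np.clip(prev_frame // 16, 0, LEVELS - 1)  # quantize to the grid
    ti = np.clip(new_frame // 16, 0, LEVELS - 1)
    return overdrive_lut[pi, ti].astype(np.uint8)

prev = np.zeros((2, 2), np.uint8)        # panel was black
new = np.full((2, 2), 128, np.uint8)     # target is mid-gray
print(overdrive(prev, new))              # driven above 128 to hit gray faster
```

With variable refresh the time between frames changes every frame, so a single table tuned for 144Hz would over- or under-drive at other intervals; keeping several tables plus a couple frames of history is one plausible way to spend that 768MB.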
 

Majic 7

Senior member
Mar 27, 2008
668
0
0
My disappointment with the stream is that it seems pretty certain that modules to update older monitors aren't going to happen. They will sell some for the ASUS monitor that was already announced and sold without the new module, but I think that will be it. If they sold them for everything else, what incentive would the monitor makers have to adopt the tech? They would be sitting on inventory for years waiting for displays to wear out. Sucks for me, since I bought my monitor a little over a year ago.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
My disappointment with the stream is that it seems pretty certain that modules to update older monitors aren't going to happen. They will sell some for the ASUS monitor that was already announced and sold without the new module, but I think that will be it. If they sold them for everything else, what incentive would the monitor makers have to adopt the tech? They would be sitting on inventory for years waiting for displays to wear out. Sucks for me, since I bought my monitor a little over a year ago.

Well, they sell monitors all of the time. For those who are in the market this is added incentive to buy one of theirs.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
And when you do give them the money and buy the new monitor (or modify your old one), keep in mind that you won't be able to change to AMD without changing your monitor. Even if you wanted an AMD card, you'd have to figure in the additional cost of changing your monitor. That will almost always lock you into staying with nVidia, stifling competition. IF it's unnecessary, I think it stinks and is purely anti-competitive.

What's worse is that people won't care and will go with the flow. Imagine if your car maker made it so that if you installed a different brand of oil filter or battery, your car wouldn't run, and they did it for no other reason but to force you to buy their proprietary products.

And you think I don't understand that? If Gsync is as good as it sounds, why on earth would I not want to run it? Why would I want to go to AMD if they don't have an alternative? From my perspective I'm not losing out, AMD is...


And for what I would consider a more accurate car analogy: this is more like if I bought a car that got twice the HP at half the fuel consumption of any of the competitors, and the catch was that I had to buy proprietary replacement parts from only one source. Would I be upset about it? The answer would be no, because the product is that much better. Would I be happier if I wasn't tied down? Sure, but that doesn't mean I would be upset or unhappy; I'd be grateful to have such a product at all.

Granted, I have yet to see Gsync for myself, but it definitely sounds like a holy-grail type of product for a motion clarity / input lag junkie like me.

The minute AMD (or any company, really) announces an R-sync that I could conclude is a superior alternative to G-sync (i.e., just as good but more flexible and/or cheaper), I'll be all over it and anxious to give them my money instead. Until then, they'll be missing out.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
And you think I don't understand that? If Gsync is as good as it sounds, why on earth would I not want to run it? Why would I want to go to AMD if they don't have an alternative? From my perspective I'm not losing out, AMD is...


And for what I would consider a more accurate car analogy: this is more like if I bought a car that got twice the HP at half the fuel consumption of any of the competitors, and the catch was that I had to buy proprietary replacement parts from only one source. Would I be upset about it? The answer would be no, because the product is that much better. Would I be happier if I wasn't tied down? Sure, but that doesn't mean I would be upset or unhappy; I'd be grateful to have such a product at all.

Granted, I have yet to see Gsync for myself, but it definitely sounds like a holy-grail type of product for a motion clarity / input lag junkie like me.

The minute AMD (or any company, really) announces an R-sync that I could conclude is a superior alternative to G-sync (i.e., just as good but more flexible and/or cheaper), I'll be all over it and anxious to give them my money instead. Until then, they'll be missing out.

Sorry, but your logic escapes me. You must just have a thing for companies taking advantage of a situation.

Unless said car company's parts are the only ones that can properly operate in your car, you are being unnecessarily limited in your rights to spend your money where you want to, and on what you want to. It's called manipulating the market. It's not a good thing if you are a consumer.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
A company taking advantage of their own invention, innovation, and risk so that there may be rewards for their brand and improved gaming experiences for their potential customers. How dare nVidia innovate and allow the marketplace and partners to decide! Some impressive display partners.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
My concern is not where we are today, with gsync being NVidia-only on special monitors with currently zero customers. My concern is when it takes off and everyone wants it: will AMD and Intel (and anyone else, for that matter) be allowed to implement it for themselves? Will the extension to DisplayPort be freely implementable by all?

Even further out, there comes the question of royalties for monitor manufacturers, and whether they will be able to design their own ASICs to support the protocol.

But these aren't concerns today, because the reality is AMD hasn't yet said "hey, we would like to do this but NVidia won't let us." So far they haven't said anything at all on the topic. The monitor manufacturers seem to be going with the module to begin with, limiting the types of monitors and taking the associated cost of doing so. The current state of this thing is still prototype and market testing, not a full-blown release product for millions of customers. I can't judge NVidia yet as it's too early, but if they "PhysX" this technology I will be annoyed.
 

desprado

Golden Member
Jul 16, 2013
1,645
0
0
My concern is not where we are today, with gsync being NVidia-only on special monitors with currently zero customers. My concern is when it takes off and everyone wants it: will AMD and Intel (and anyone else, for that matter) be allowed to implement it for themselves? Will the extension to DisplayPort be freely implementable by all?

Even further out, there comes the question of royalties for monitor manufacturers, and whether they will be able to design their own ASICs to support the protocol.

But these aren't concerns today, because the reality is AMD hasn't yet said "hey, we would like to do this but NVidia won't let us." So far they haven't said anything at all on the topic. The monitor manufacturers seem to be going with the module to begin with, limiting the types of monitors and taking the associated cost of doing so. The current state of this thing is still prototype and market testing, not a full-blown release product for millions of customers. I can't judge NVidia yet as it's too early, but if they "PhysX" this technology I will be annoyed.
First thing: why would Nvidia allow AMD to use their tech?
Same with AMD: would they allow Mantle to be used on Nvidia cards? The answer is no.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
First thing: why would Nvidia allow AMD to use their tech?
Same with AMD: would they allow Mantle to be used on Nvidia cards? The answer is no.

The evidence is the opposite. There are statements out there that this extension is not covered by any form of patent or otherwise; it's just that Nvidia is being a bit elusive on the details when directly asked by the press, which bothers me. But actually, so far it looks like AMD/Intel/whoever can implement it freely. It sounds like all it really is is a bit of EDID information showing it's supported, and an extended vblank signal. That seems to be about it.

As to Mantle, it really has nothing to do with g-sync or nvidia. If AMD wants to create an API tied to a particular release of its hardware, it's free to do so, and it obviously can't be supported on NVidia's hardware, because by definition it's a low-level driver for a particular design of AMD cards. It's not a question of AMD not wanting Nvidia to support it; by definition it's never going to be possible, because it's designed to be very hardware-specific to save overhead. That likely means future AMD hardware won't be supported either. Apples and oranges: Mantle can never be a standard, whereas G-sync could.
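
If that description is accurate (still unconfirmed, so take this as a guess at the mechanism, with made-up timing limits), the scheduling is simple enough to sketch: the panel scans out whenever the GPU delivers a frame, bounded below by the panel's fastest refresh and above by the point where it must redraw the old frame on its own:

```python
import random

# Assumed panel limits -- not published G-SYNC specs.
MIN_INTERVAL = 1 / 144   # fastest the panel can scan out (~6.9 ms)
MAX_INTERVAL = 1 / 30    # panel must self-refresh by this point

def scanout_times(frame_ready_times):
    """When each frame is drawn on a variable-vblank panel."""
    draws, last = [], 0.0
    for ready in frame_ready_times:
        # GPU too slow? The panel redraws the old frame on its own.
        while ready > last + MAX_INTERVAL:
            last += MAX_INTERVAL
        # A new scanout can't start before the previous one finishes.
        start = max(ready, last + MIN_INTERVAL)
        draws.append(start)
        last = start
    return draws

# GPU delivering frames at an uneven 40-90 FPS: each frame is shown
# the moment it's ready -- no tearing, no waiting on a fixed vblank.
t, ready = 0.0, []
for _ in range(8):
    t += random.uniform(1 / 90, 1 / 40)
    ready.append(t)
print([f"{d * 1000:.1f} ms" for d in scanout_times(ready)])
```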
 