[PCPER] NVidia G-sync


VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The point is they have all that hard-won performance for the consoles, so why not spend a little money to use those chunks of code on the PC? They aren't starting over from nothing. They are optimizing the PC version.

Honestly, and I say this in the nicest, non-troll way: Carmack and Sweeney are so far up Nvidia's rear end that I can't take anything they say seriously.

At least Repi from DICE has the balls to appear at Nvidia and AMD events to talk about the future of PC gaming, and he still has great things to say about Mantle and G-Sync. He's truly excited about new technology for PC gamers. Sweeney and Carmack haven't released a decent PC game in 10 years (and those good games 10 years ago were meh at best). Johan is PC gaming at the moment.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
If Overlord gets out a 120Hz 1440p IPS with G-Sync I will drop my U3011 for one. Heck, if anyone gets out a 60Hz 1440p or 1600p IPS I will go for it.

Hopefully this tech is modular enough that people with the know-how can fashion daughterboards for any monitor that supports DisplayPort.

Ideal situation right there.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
If Overlord gets out a 120Hz 1440p IPS with G-Sync I will drop my U3011 for one. Heck, if anyone gets out a 60Hz 1440p or 1600p IPS I will go for it.

Hopefully this tech is modular enough that people with the know-how can fashion daughterboards for any monitor that supports DisplayPort.

Ideal situation right there.

I think that for this to gain any major adoption, AMD and Nvidia will need to push for it together. It's a really great value-added option for Nvidia, and one that will likely sway me towards Nvidia with my next GPU. Unfortunately I don't think they are interested in doing something awesome like teaming up to fight Intel. Instead they will continue to fight each other.

Imagine how the fanboys' worlds would be turned upside down if Nvidia and AMD combined to fight Intel and Qualcomm.

It may need to happen one day for them both to survive.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
The point is they have all that hard-won performance for the consoles, so why not spend a little money to use those chunks of code on the PC? They aren't starting over from nothing. They are optimizing the PC version.

Honestly, and I say this in the nicest, non-troll way: Carmack and Sweeney are so far up Nvidia's rear end that I can't take anything they say seriously.

At least Repi from DICE has the balls to appear at Nvidia and AMD events to talk about the future of PC gaming, and he still has great things to say about Mantle and G-Sync. He's truly excited about new technology for PC gamers. Sweeney and Carmack haven't released a decent PC game in 10 years (and those good games 10 years ago were meh at best). Johan is PC gaming at the moment.

Epic's engines are still extremely important to PC gaming, more so than Frostbite. Carmack could easily build a new engine that would eclipse Frostbite, since it's not exactly an impressive-looking or well-functioning engine. Plus he's working on VR, which is the real future of the PC gaming landscape.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Epic's engines are still extremely important to PC gaming, more so than Frostbite. Carmack could easily build a new engine that would eclipse Frostbite, since it's not exactly an impressive-looking or well-functioning engine. Plus he's working on VR, which is the real future of the PC gaming landscape.

Frostbite not impressive? What?

ROFL...

There is absolutely no engine that looks as good as Frostbite at the frame rates it achieves on a variety of hardware types.

The only thing unreal about UE3 (last gen, I know) in comparison to Frostbite is the hideous texture pop-in. I haven't personally seen id Tech 5, but all reports point to Frostbite being far superior.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Epic's engines are still extremely important to PC gaming, more so than Frostbite. Carmack could easily build a new engine that would eclipse Frostbite, since it's not exactly an impressive-looking or well-functioning engine. Plus he's working on VR, which is the real future of the PC gaming landscape.

Is this a joke? Frostbite is the best thing on PC, and not just for visuals alone, although it has those on lock. Where is Epic? That's right, still churning that UE3 turd forever, and Carmack has done nothing since Doom 3 worth mentioning.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Frostbite not impressive? What?

ROFL...

There is absolutely no engine that looks as good as Frostbite at the frame rates it achieves on a variety of hardware types.

The only thing unreal about UE3 (last gen, I know) in comparison to Frostbite is the hideous texture pop-in. I haven't personally seen id Tech 5, but all reports point to Frostbite being far superior.


Yes, UE3 is so old, yet it's being used by next-generation consoles and the PC. That is pretty telling of how well Epic built their engine. FB3 powers what, 5-6 games? I guess in your view of the PC world, an engine that has a marginal number of games using it is all-important.

Is this a joke? Frostbite is the best thing on PC, and not just for visuals alone, although it has those on lock. Where is Epic? That's right, still churning that UE3 turd forever, and Carmack has done nothing since Doom 3 worth mentioning.

Here's where Epic is: http://www.giantbomb.com/unreal-engine-3/3015-86/games/ and with UE4 coming, FB3 is easily outclassed in visuals (although some UE3 games can already do that): http://www.youtube.com/watch?v=dO2rM-l-vdQ Then there's EA controlling Frostbite Engine 3 so no other devs can use it, which makes it even less important for PC gaming overall. Unless you equate EA with PC gaming.

If you want to talk about an engine that looks and runs great with a huge open world, FB gets eclipsed by Rockstar's RAGE engine. Yes, yes, GTA V is console-first, but eventually it'll come to the PC, just like most games do these days.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
FB looks great in videos, but in the actual game it's not impressive-looking at all to me. There were plenty of times in BF3 when I just looked at things thinking, wow, this looks bad for a modern game. I guess I just had higher expectations based on all the hype about the graphics. BF4 looks only a little better than BF3 from what I have seen, and some of the trees still look so out of place compared to the rest of the graphics.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
FB looks great in videos, but in the actual game it's not impressive-looking at all to me. There were plenty of times in BF3 when I just looked at things thinking, wow, this looks bad for a modern game. I guess I just had higher expectations. BF4 looks only a little better than BF3 from what I have seen, and some of the trees still look so out of place compared to the rest of the graphics.

FB took the Photoshop-filter nature of deferred rendering way too far.

CryEngine 3 has the best balance at the moment, in my opinion.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I've been seeing those worthless Unreal tech demos for literally two years now. Wake me up when there is a released game on it. I've been playing Battlefield 3 for two years now, and BF4 is stepping it up in two weeks.

DICE is dominating, pushing PC gaming forward with actual games you can play, not another recording of a tech demo on YouTube every six months. I don't get this infatuation with id and Epic. The days of Doom and Unreal are long over; those guys are dinosaurs who have done nothing groundbreaking in about a decade.

It's devs like DICE, 4A, CDPR, Crytek etc. who are putting out actual groundbreaking games you can play. At this point I think Epic is just in the tech demo business, lol.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
I've been seeing those worthless Unreal tech demos for literally two years now. Wake me up when there is a released game on it. I've been playing Battlefield 3 for two years now, and BF4 is stepping it up in two weeks.

DICE is dominating, pushing PC gaming forward with actual games you can play, not another recording of a tech demo on YouTube every six months. I don't get this infatuation with id and Epic. The days of Doom and Unreal are long over; those guys are dinosaurs who have done nothing groundbreaking in about a decade.

It's devs like DICE, 4A, CDPR, Crytek etc. who are putting out actual groundbreaking games you can play. At this point I think Epic is just in the tech demo business, lol.

The developer tools the Unreal Engine packs are unmatched.

The reason there are so few games made on the engines you listed as "awesome" is that they just aren't very artist-friendly.

You forget that artists do the production-value part of gaming, and that it is the largest part of the ballooning budgets of AAA games.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The point is they have all that hard-won performance for the consoles, so why not spend a little money to use those chunks of code on the PC? They aren't starting over from nothing. They are optimizing the PC version.

Honestly, and I say this in the nicest, non-troll way: Carmack and Sweeney are so far up Nvidia's rear end that I can't take anything they say seriously.

At least Repi from DICE has the balls to appear at Nvidia and AMD events to talk about the future of PC gaming, and he still has great things to say about Mantle and G-Sync. He's truly excited about new technology for PC gamers. Sweeney and Carmack haven't released a decent PC game in 10 years (and those good games 10 years ago were meh at best). Johan is PC gaming at the moment.

Good or bad, Mantle does cause extra fragmentation. That aspect is a bad thing, but the upside could be better performance.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
FB looks great in videos, but in the actual game it's not impressive-looking at all to me. There were plenty of times in BF3 when I just looked at things thinking, wow, this looks bad for a modern game. I guess I just had higher expectations based on all the hype about the graphics. BF4 looks only a little better than BF3 from what I have seen, and some of the trees still look so out of place compared to the rest of the graphics.

I also don't understand the infatuation with BF3/BF4 visuals; they're marginal at best compared to other games out there. The textures up close are not impressive at all. BF4's legacy is bringing back old features from BF2 and tossing in scripted events.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
29th Apr 2013, 16:16
Tearing is a bad experience to me personally, but it seems the majority accept varying amounts of it. If they manage to solve it without V-sync, then that will be the new thing on the block, with what came before being put forward as unplayable and a flat-out bad experience, and with reviews asked to focus on it depending on who gets there first.

http://forums.overclockers.co.uk/showpost.php?p=24199463&postcount=100

Getting rid of the tearing without V-sync is a good thing and is something that is needed. I don't remember anyone saying that they don't use V-sync because they don't like how it looks; most people like how it looks, but how it feels is a real problem for many people. Personally I have got accustomed to it, so I notice it, but my gaming skills overcome it, just like when I used to play QuakeWorld at 12 fps @ 320x200 on my Amiga against PC users who had hundreds of fps and still came in the top 3.
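The "how it feels" point above is really a frame-timing question. Here's a minimal sketch (plain Python, illustrative numbers only, not tied to any actual driver API) of why V-sync adds latency and judder when a frame just misses the refresh deadline, while a G-Sync-style variable refresh does not:

```python
import math

REFRESH_MS = 1000.0 / 60  # fixed 60 Hz refresh interval (~16.7 ms)

def vsync_display_time(render_ms):
    """With V-sync, a finished frame must wait for the next fixed refresh tick,
    so a frame that takes slightly longer than one interval is held for two."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def vrr_display_time(render_ms):
    """With variable refresh, the panel scans out as soon as the frame is ready."""
    return render_ms

for render_ms in (10.0, 16.0, 18.0, 33.0):
    print(f"render {render_ms:5.1f} ms -> v-sync {vsync_display_time(render_ms):5.1f} ms, "
          f"variable refresh {vrr_display_time(render_ms):5.1f} ms")
```

The 18 ms case is the interesting one: under V-sync it snaps to 33.3 ms (an effective 30 fps plus extra input lag), which is the "feel" problem being described; with variable refresh it simply displays at 18 ms.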
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Have they explained why this is an nVidia only feature? Is there something about AMD cards that they aren't able to run it, or is it simply vendor lockout?
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Have they explained why this is an nVidia only feature? Is there something about AMD cards that they aren't able to run it, or is it simply vendor lockout?

Simple vendor lockout, or if you prefer: The Way It's Meant To Be Played. But in the eyes of many here, it seems not to be market fragmentation. Bringing more performance through software by AMD is somehow market fragmentation, PhysX is Nvidia added value, Mantle is foul play and a fake Function_Cripple_Competition_For_Lots_of_Funds, 3 TWIMTBP games are better than 3 out of 8(?) GE titles, and frame rates don't matter anymore (for now), but the "quality" of frames does.

What I find interesting about this whole thing is people's perception. From a usability standpoint, I'm not even remotely interested in it, since I use a less-than-FHD display.

BTW, why are GTX 650 Ti and lower cards not compatible with G-Sync?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
FB looks great in videos, but in the actual game it's not impressive-looking at all to me. There were plenty of times in BF3 when I just looked at things thinking, wow, this looks bad for a modern game. I guess I just had higher expectations based on all the hype about the graphics. BF4 looks only a little better than BF3 from what I have seen, and some of the trees still look so out of place compared to the rest of the graphics.

I feel the same way. I follow Battlefield pretty closely because I like the game, but the engine doesn't impress me, and it doesn't seem all that modular. When they used Frostbite for a Need for Speed game, I thought it looked worse than the previous Need for Speed game that did not use that engine at all.

I've been seeing those worthless Unreal tech demos for literally two years now. Wake me up when there is a released game on it. I've been playing Battlefield 3 for two years now, and BF4 is stepping it up in two weeks.

DICE is dominating, pushing PC gaming forward with actual games you can play, not another recording of a tech demo on YouTube every six months. I don't get this infatuation with id and Epic. The days of Doom and Unreal are long over; those guys are dinosaurs who have done nothing groundbreaking in about a decade.

It's devs like DICE, 4A, CDPR, Crytek etc. who are putting out actual groundbreaking games you can play. At this point I think Epic is just in the tech demo business, lol.

Out of the games you listed, only CD Projekt is doing anything remotely groundbreaking. While Crytek is putting tessellation on hidden frogs in their games and DICE is developing a cheap way to port their games to PC from consoles, CD Projekt is actually making a top-tier game out of it. Battlefield is cool, but it's nowhere near the quality level of CDPR's stuff. Unreal Engine has always been very developer-friendly and is only matched by id Tech engines in terms of a developer's ability to modify it for their desired art style. Look at what they have done with it to create games like BioShock Infinite, Dishonored, and Borderlands. Some of the scenes in these games look like they could be out of a painting, which is a pretty neat visual style IMO.

Anyway, G-Sync has the ability to make every game feel and look more fluid. It won't be limited to specific titles or specific engines. This is what we really need as games become more demanding at higher resolutions and the GPUs that can handle those resolutions become more expensive. It will be possible to enjoy a game without stutter and input lag, without spending thousands on GPUs to do it. This is provided it comes to resolutions above 1080p, which it seems it may, given Overlord's interest in the technology (or perhaps Nvidia's interest in Overlord).
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Have they explained why this is an nVidia only feature? Is there something about AMD cards that they aren't able to run it, or is it simply vendor lockout?

It's neither. The hardware and software didn't exist for the new feature; Nvidia is having both made for its cards.

Now, there may be a patent on it, so I don't know how easy it will be for AMD to follow suit.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I couldn't care less about how developer-friendly the ancient and low-tech UE3 is. I play the games, I don't make them.

DICE, Crytek and 4A in particular all have games out that I can play that kick the backside of everything out there visually and gameplay-wise. I'm sure The Witcher 3 is going to be stellar as well; again, wake me up when it's here and I can play it.

It's worth noting all these dev houses are using their own in-house engines. There is nothing on UE3 that comes near the quality of Frostbite 3/4, CryEngine 3, or the 4A Engine. Just look at the scope of a Battlefield game and the size and detail of the game world. It's impeccable. I'd argue that as a whole CryEngine 3 looks better in Crysis 3 than Battlefield 3 or 4 do, but you'd never get a game of the scope of Battlefield to run at acceptable frame rates using CE3 to the extent it's used in Crysis 3.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
http://forums.overclockers.co.uk/showpost.php?p=24199463&postcount=100

Getting rid of the tearing without V-sync is a good thing and is something that is needed. I don't remember anyone saying that they don't use V-sync because they don't like how it looks; most people like how it looks, but how it feels is a real problem for many people. Personally I have got accustomed to it, so I notice it, but my gaming skills overcome it, just like when I used to play QuakeWorld at 12 fps @ 320x200 on my Amiga against PC users who had hundreds of fps and still came in the top 3.

I remember a few months back, someone on this forum proposed the idea here, though with more detail, or at least said that the display should refresh when the GPU asks it to. Even then, everyone agreed it was a great idea. I don't see how anyone could see a downside other than cost or fanboyism.

It is a good idea for sure.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I remember a few months back, someone on this forum proposed the idea here, though with more detail, or at least said that the display should refresh when the GPU asks it to. Even then, everyone agreed it was a great idea. I don't see how anyone could see a downside other than cost or fanboyism.

It is a good idea for sure.

I suspect I was first with the idea, although I have been mentioning it for years in various circles. It is an obvious invention based on the way LCDs work, however.

Based on the way it's implemented, it's perfectly possible for AMD/Intel to implement this in their cards in the future. It might require licensing, but I suspect Nvidia is actually more than happy to share the details of how to make it work; they themselves said they were interested in pushing it out to other vendors. This isn't like PhysX: everyone should want it, every vendor should move to it, and that means monitors and GPUs need to change. But you have to start somewhere, with a single GPU and a few monitor models to show that it's better; then you can start to talk about standards and licensing and all that good stuff. If you try to do that beforehand (as has been done in the past), it will fail.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Since not much has been disclosed about the inner workings behind what is probably one of the most interesting concepts in gaming as of late, we can do nothing but speculate about what REALLY is needed for this new way of frame syncing to work.

From what Nvidia said, we know the following is required:

- Kepler uArch, specifically the 650 Ti and above.
- A supported monitor. From the Asus one's specs, we can infer 120/144Hz capability. We still don't know if it's tied to a panel technology, but right now only TN is discussed, as it is the most fit for the task.
- A DP output. I think it's no coincidence that the modded monitor loses its other outputs; DisplayPort is key to allowing the dynamic vertical refresh to happen.
- The mod in question, which is a PCB that replaces one of the two boards behind the panel when you mod your monitor. It is currently priced at 170 bucks, but they hope to get it lower.

So, what exactly does this PCB do that the one it replaces can't? Does it add the dynamic refresh rate functionality or just enable it? CRTs really could only work with fixed refresh rates because of the tech behind those monitors.

Now, in the era of LCDs, a fixed rate isn't really needed, because there is persistence: the panel can hold an image without refreshing at a fixed rate if the image didn't change. So we could have had this technology earlier if it weren't for the need to keep CRT compatibility and the fixed refresh rates monitor manufacturers had grown accustomed to.

My assumption in the end is that monitor manufacturers could have incorporated this feature long ago, but following the motto "don't fix what isn't broken" they stayed with fixed refresh rates even when they weren't needed. The mod PCB targets this point specifically and enables the feature, while the DP output may be the only one capable of carrying refresh commands the other way around (that is, from the GPU to the monitor) and thus the only one up for the task. Last but not least, there might in fact be a requirement for some kind of hardware frame metering, giving validity to the Kepler-only requirement.
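The behaviour speculated about above, a panel that holds its image until the GPU pushes a new frame but still gets an occasional forced refresh so it never goes unrefreshed for too long, can be written as a toy scheduler. This is pure speculation rendered as Python: the 33 ms hold limit and the function name are made up for illustration, nothing here is a confirmed G-Sync detail.

```python
def schedule_refreshes(frame_ready_times, max_hold_ms=33.3):
    """Toy model of GPU-driven refresh: the panel redraws whenever a new frame
    arrives, and inserts a self-refresh if the GPU stays quiet longer than
    max_hold_ms (so the LCD never holds the same charge indefinitely)."""
    refreshes = []
    last = 0.0
    for t in sorted(frame_ready_times):
        # Bridge long gaps with forced self-refreshes of the previous image.
        while t - last > max_hold_ms:
            last += max_hold_ms
            refreshes.append((last, "self-refresh"))
        refreshes.append((t, "new frame"))
        last = t
    return refreshes

# A burst of frames followed by a stall: the stall is bridged by self-refreshes.
for when, kind in schedule_refreshes([10.0, 30.0, 100.0]):
    print(f"{when:6.1f} ms  {kind}")
```

In this model the link only ever carries "here is a new frame" in the GPU-to-monitor direction, which is exactly the role the post above assigns to DisplayPort.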
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
120/144Hz has nothing to do with it, and that's probably why they demonstrated at 60Hz. It will just start out in the 144Hz Asus because that is probably the type of screen most gamers are interested in.

http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming


Beginning later this year, NVIDIA G-SYNC will be available as monitor module you can install yourself, or buy pre-installed in one of the best monitors currently available. Next year, G-SYNC monitors will be available on the shelves of your favorite e-tailers and retailers, in a variety of screen sizes and resolutions, eventually scaling all the way up to 3840x2160 ("4K").


EDIT: It would be great if BrightCandle or a mod could add that Nvidia link, since it explains a lot.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Which adds to my point: what is really needed for this to work? My bet is it's less than what NV says it needs, but well.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Have they explained why this is an nVidia only feature? Is there something about AMD cards that they aren't able to run it, or is it simply vendor lockout?

No reason other than lockout, as the clever stuff in the hardware is on the monitor side, just like 4K display VESA Display ID v1.3, which AMD came up with but anyone can use.
 