Info HWUB: Big Native vs DLSS vs FSR comparison

Geegeeoh

Member
Oct 16, 2011
145
126
116
Is DLSS good or is TAA a shitshow?
TAA in Death Stranding makes eyes bleed!

I personally dropped AA when I jumped to 4K, since at that resolution it was less useful, and especially since the "new" AA methods kept getting worse and worse... in a way they were the progenitors of DLSS.

Gimme back SSAA!
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
A follow-up to the Big DLSS vs FSR comparison. This time seeking to answer the "is it better than Native?" question.

The answer is sometimes...

You could argue the newest DLSS is better than native on average. It's only by including old versions in older games that it becomes a tie. Even then there is a fix: you can update the DLSS version in that game if you really care, and there are programs that make this easy. Pretty impressive considering the amount of nerd rage that still exists on this forum against using it.
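For anyone curious what that update actually involves: most DLSS games ship the upscaler as a single nvngx_dlss.dll near the game executable, and swapping in a newer copy is essentially all the dedicated tools automate. A minimal sketch of that swap, assuming hypothetical folder and DLL paths (back up the original first; a few titles with file verification or anti-cheat may reject a replaced DLL):

```python
# Minimal sketch of a manual DLSS DLL swap (what dedicated swapper tools automate).
# The paths below are hypothetical examples; point them at your own install.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")           # folder containing the game's nvngx_dlss.dll
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you want to drop in

old_dll = game_dir / "nvngx_dlss.dll"
backup = old_dll.with_name("nvngx_dlss.dll.bak")

if old_dll.exists():
    shutil.copy2(old_dll, backup)   # keep the original so you can roll back
    shutil.copy2(new_dll, old_dll)  # overwrite with the newer version
    print(f"Replaced {old_dll} (backup at {backup})")
else:
    print("No nvngx_dlss.dll found here; the game may keep it in a subfolder.")
```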
 

leoneazzurro

Senior member
Jul 26, 2016
951
1,514
136
If DLSS looks better than "native+TAA" then this means that the game is really badly optimized, particularly in the TAA algorithm. Why? Because the game engine has ALL the information that is available to DLSS, PLUS a higher rendering resolution. Better than "native without TAA" should almost always be the case, for the simple fact that DLSS includes antialiasing.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
You could argue the newest DLSS is better than native on average. It's only by including old versions in older games that it becomes a tie. Even then there is a fix: you can update the DLSS version in that game if you really care, and there are programs that make this easy. Pretty impressive considering the amount of nerd rage that still exists on this forum against using it

I think a lot of the "nerd" rage is really NVidia rage, which is very common on this forum.
 
Reactions: DooKey

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
If DLSS looks better than "native+TAA" then this means that the game is really badly optimized, particularly in the TAA algorithm.

Does the reason matter? The overall takeaway is that DLSS is quite decent, and given the performance gain, Tim concludes there is seldom a reason not to use it.

The claims of DLSS image degradation made on this forum don't really seem to hold up when put to the test.
 

leoneazzurro

Senior member
Jul 26, 2016
951
1,514
136
Yes, the reason matters, because DLSS is not always "better than native", as even the test you linked points out. And no, it is not that there is "seldom reason to not use it"; it's more that "there are several cases where it should be used", depending on the game and how badly it is programmed. Image degradation (especially in dynamic scenes) can range from none/better than Native+TAA to significant, as the same test you are linking shows in several games: Native+TAA and DLSS split 50/50 at 4K Quality, and Native+TAA absolutely dominates DLSS Performance in image quality.
Conclusion: there is no reason not to use it where it provides a better image than Native+TAA and/or when the performance hit of Native+TAA is too big, but there is no reason to use it when the image quality takes a hit (e.g. the very first game they test, Spider-Man: Miles Morales) and the performance is enough to run the game at native+TAA settings.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,722
3,909
136
I think a lot of the "nerd" rage is really NVidia rage, which is very common on this forum.

A lot of NVIDIA's wounds lately have been self-inflicted; it's like they're trying to piss people off. You can't blame people for getting upset with them. Even brand loyalty only lasts so long. Call it NVIDIA rage if you want, but NVIDIA only has to look in the mirror to find out why.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
A lot of NVIDIA's wounds lately have been self-inflicted; it's like they're trying to piss people off. You can't blame people for getting upset with them. Even brand loyalty only lasts so long. Call it NVIDIA rage if you want, but NVIDIA only has to look in the mirror to find out why.

It seems almost entirely because NVidia charges more than AMD.

But the reality is that everyone in the lead charges more. When the AMD Ryzen 5600X beat Intel at gaming for the first time, they basically killed the non-X 5600 model until Intel came back with a competitive product.

That effectively meant the price of a 6-core AMD CPU went from $200 to $300 in one generation (a 50% increase), and not a peep from the same crowd that wants to lynch NVidia when a GPU goes from $500 to $600 (a 20% increase).

None of these companies are your buddy. They will ALL price to maximize their profits, and without effective competition, they will ALL take fatter margins. ALL of them.

It's fine to grumble about pricing, but when you just view everything through sour grapes, then your opinions are questionable.

DLSS is very good technology. Often simultaneously delivering improved performance and visuals.
 
Reactions: DooKey and psolord

Thunder 57

Platinum Member
Aug 19, 2007
2,722
3,909
136
It seems almost entirely because NVidia charges more than AMD.

But the reality is that everyone in the lead charges more. When the AMD Ryzen 5600X beat Intel at gaming for the first time, they basically killed the non-X 5600 model until Intel came back with a competitive product.

That effectively meant the price of a 6-core AMD CPU went from $200 to $300 in one generation (a 50% increase), and not a peep from the same crowd that wants to lynch NVidia when a GPU goes from $500 to $600 (a 20% increase).

None of these companies are your buddy. They will ALL price to maximize their profits, and without effective competition, they will ALL take fatter margins. ALL of them.

It's fine to grumble about pricing, but when you just view everything through sour grapes, then your opinions are questionable.

DLSS is very good technology. Often simultaneously delivering improved performance and visuals.

That's not true and you know it. RDNA 3 is hardly a value proposition. People are unhappy with NVIDIA largely because of their VRAM shenanigans. The naming nonsense is just the cherry on top. In an odd way, if NVIDIA hadn't been forced to "unlaunch" the 4080 12GB, that wouldn't have been a thing.

4070 Ti -> 4080 12GB
4070 -> 4070 Ti

If the 4070 was the 4070 Ti at $600 there would probably be less noise. At least then the names would make sense, even if the performance didn't.

As for the 5600, I'm not sure where you're getting that. Zen 3 was selling so well that the non-X models didn't come out until, it looks like, eight months after the launch of Zen 3. By then there was ample supply. I think you are remembering wrong.

And I know AMD is not my buddy. But I don't think you were serious.

I agree DLSS is actually an intriguing technology. Like G-Sync, it has the advantage of being first to market and is the best because of that.
 
Last edited:

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
That's not true and you know it. RDNA 3 is hardly a value proposition. People are unhappy with NVIDIA largely because of their VRAM shenanigans. The naming nonsense is just the cherry on top. In an odd way, if NVIDIA hadn't been forced to "unlaunch" the 4080 12GB, that wouldn't have been a thing.

4070 Ti -> 4080 12GB
4070 -> 4070 Ti

Seriously, the renaming of 4070 Ti isn't something anyone who doesn't already have a massive chip on their shoulder about NVidia would get mad about.

It makes Nvidia look kind of lame, something to laugh at, not to get mad about.

If the 4070 was the 4070 Ti at $600 there would probably be less noise. At least then the names would make sense, even if the performance didn't.

As for the 5600, I'm not sure where you're getting that. Zen 3 was selling so well that the non-X models didn't come out until, it looks like, eight months after the launch of Zen 3. By then there was ample supply. I think you are remembering wrong.

I'm not remembering wrong. AMD finally beat Intel, and they could finally charge more, so they did.

Intel was ahead in gaming when the 3600 and 3600X came out together at $200 and $250.
Zen 3 finally beat Intel at gaming, so the 5600X came out alone at $300. A massive increase for 6 cores.

It was only after Intel came back with 12th gen and AMD lost the upper hand that the better-value 5600 was released.

The 5600X was released in Nov 2020; the 5600 was released in April 2022. A lot more than eight months.

While there is nothing wrong with charging a premium when you have the top product, it's silly to pretend that only Intel and NVidia do this.

All companies want to be selling the most premium product, so they can charge the biggest margin.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,722
3,909
136
Seriously, the renaming of 4070 Ti isn't something anyone who doesn't already have a massive chip on their shoulder about NVidia would get mad about.

It makes Nvidia look kind of lame, something to laugh at, not to get mad about.


I'm not remembering wrong. AMD finally beat Intel, and they could finally charge more, so they did.

Intel was ahead in gaming when the 3600 and 3600X came out together at $200 and $250.
Zen 3 finally beat Intel at gaming, so the 5600X came out alone at $300. A massive increase for 6 cores.

It was only after Intel came back with 12th gen and AMD lost the upper hand that the better-value 5600 was released.

The 5600X was released in Nov 2020; the 5600 was released in April 2022. A lot more than eight months.

While there is nothing wrong with charging a premium when you have the top product, it's silly to pretend that only Intel and NVidia do this.

All companies want to be selling the most premium product, so they can charge the biggest margin.

You are assuming I am mad and have a chip on my shoulder. I'm part of the group that thought it was a bad idea, or lame, and so did so many others that NVIDIA had to "unlaunch" a GPU, which is unprecedented. Yet I'm the one who has a problem.

I misunderstood the 5600 remark. I took it to mean they launched it then took it away. But your argument doesn't make sense either. The 5000 series only launched with X models. They were impossible to get for months. It was a supply issue, not "let's rip people off since we have the best gaming CPU". You just said it yourself:

there is nothing wrong with charging a premium when you have the top product

I have no problem saying AMD will do it too. I don't know why you would assume otherwise. You said the forum has "NVIDIA rage". Do you think the same of them too?

I don't know where Intel came up; I've only mentioned NVIDIA. I own Intel and have owned NVIDIA in the past. I think the best GPU ever was the 8800GT. I ran that until it literally died. Only card I've ever done that with. Doesn't mean I can't be unhappy with what NVIDIA is currently doing.

Anyway, this is pointless and going nowhere. I see no reason to further derail this thread.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
I have no problem saying AMD will do it too. I don't know why you would assume otherwise. You said the forum has "NVIDIA rage". Do you think the same of them too?

Because the overwhelming thing I see people raging at NVidia about is pricing. People unironically call them "Ngreedia", which is right up there with calling Microsoft "M$".

People act as if the only reason GPUs cost more today than they did 5 or 10 years ago is NVidia charging ever greater margins.

Corporations all attempt to maximize profits. It's what they do. If AMD were leading in GPUs, they would be charging NVidia prices, and if NVidia were behind, they would be charging AMD prices.

It's just that NVidia has been in front for so long that some people seem to develop a complex about them, which colors their view of everything NVidia does.

So I think DLSS gets dumped on more than it deserves because of that.

That's not to say NVidia doesn't do some scummy things, but I just hate those things without letting it color my view of everything they do. They bring a lot of interesting technology to market that should be viewed on its own merits.

I think the best GPU ever was the 8800GT. I ran that until it literally died. Only card I've ever done that with. Doesn't mean I can't be unhappy with what NVIDIA is currently doing.

I used an ATI 9700 Pro until it died (which made me very unhappy - great card), then I bought an 8800GT.

My 8800 GT is still the only dGPU in the house, and it still runs with tens of thousands of hours on it. Recently built a new PC after ~15 years, and I'm finally looking for a new GPU.
 
Reactions: GodisanAtheist

Thunder 57

Platinum Member
Aug 19, 2007
2,722
3,909
136
Because the overwhelming thing I see people raging at NVidia about is pricing. People unironically call them "Ngreedia", which is right up there with calling Microsoft "M$".

People act as if the only reason GPUs cost more today than they did 5 or 10 years ago is NVidia charging ever greater margins.

Corporations all attempt to maximize profits. It's what they do. If AMD were leading in GPUs, they would be charging NVidia prices, and if NVidia were behind, they would be charging AMD prices.

It's just that NVidia has been in front for so long that some people seem to develop a complex about them, which colors their view of everything NVidia does.

So I think DLSS gets dumped on more than it deserves because of that.

That's not to say NVidia doesn't do some scummy things, but I just hate those things without letting it color my view of everything they do. They bring a lot of interesting technology to market that should be viewed on its own merits.



I used an ATI 9700 Pro until it died (which made me very unhappy - great card), then I bought an 8800GT.

My 8800 GT is still the only dGPU in the house, and it still runs with tens of thousands of hours on it. Recently built a new PC after ~15 years, and I'm finally looking for a new GPU.

The 9700 Pro is pretty much tied with the 8800GT for the best ever IMHO. I give the 8800GT the edge for how long I used it. I had a 9800 Pro and loved it. The only reason I had to get rid of it was that I was moving to PCIe.

I'm surprised you still have an 8800GT. With how active you are on this forum, I figured you were some (modern) hardcore gamer. I get it; for years now I haven't cared for the games that have been coming out.
 
Reactions: NTMBK

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
The 9700 Pro is pretty much tied with the 8800GT for the best ever IMHO. I give the 8800GT the edge for how long I used it. I had a 9800 Pro and loved it. The only reason I had to get rid of it was that I was moving to PCIe.

I'm surprised you still have an 8800GT. With how active you are on this forum, I figured you were some (modern) hardcore gamer. I get it; for years now I haven't cared for the games that have been coming out.

Financial challenges. I've been wanting to upgrade for a long time. I just keep up on all the advances in the meantime, and play old games from GOG.
 

AdamK47

Lifer
Oct 9, 1999
15,262
2,879
126
The games where they point out that native 4K looks worse than the upscaling methods come down to postprocessing. The DLSS and FSR techniques rely heavily on postprocessing in their upscaling algorithms. This can reduce artifacting such as shimmering.

Injecting postprocessing at native 4K can produce phenomenal results. A tweaked mixture of SMAA, quality FXAA, and LumaSharpen through ReShade can make scenes look much better. There are also plenty of user-created shaders that can make native the standout winner in every scenario.

I have my own ReShade preset I use in just about every game. Native 4K always. No upscaling for me.
 

coercitiv

Diamond Member
Jan 24, 2014
6,256
12,189
136
Sure, let's just dismiss this under... "nerd rage".
Don't take the bait. There's a far more productive discussion to be had here, and it starts by acknowledging both the strengths and weaknesses of upscalers. Personally I'm inclined to say that DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes compensating for a poor AA implementation, saving power when playing with a frame cap, or locking in a fluid FPS while keeping desired IQ settings at a higher level. I think I said this before, but DLSS suffered greatly at launch not only due to technical limitations and weak IQ results, but also because it was used as a crutch for RT. It was pushed out half-baked to absorb the RT performance hit, and it took a while for the tech to stand on its own feet as people and media began to use it independently of RT effects. Nowadays it is a good tool to prolong the life of a card, or simply to enable higher graphical settings on a system that is very close to fluid performance but still needs a bit of a push to lock in the desired FPS.

I still wish we had more freedom over deciding input/output resolution though. One neat trick with upscalers is to enable them while also increasing the output resolution beyond native monitor specs. For example, enabling FSR 2.0 Quality with a 3840x2160 output on a 1440p monitor results in FSR being applied to a 1440p render (so a "native" render), which it upscales to 4K and then downsamples to 1440p again. Some games support higher than 100% render scaling; for others you need driver features (and some games don't like the driver tricks).
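To make the numbers behind that trick concrete, here's a quick sketch of the arithmetic. It assumes the commonly quoted FSR 2 / DLSS 2 per-axis scale factors (Quality ≈ 1/1.5, Balanced ≈ 1/1.7, Performance = 1/2, Ultra Performance = 1/3); individual games can deviate from these, so treat it as illustrative:

```python
# Rough illustration of internal render resolution for common upscaler presets.
# Scale factors are the widely quoted FSR 2 / DLSS 2 per-axis ratios; some games differ.
SCALE = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.7,
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def render_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# The trick described above: ask for a 4K output on a 1440p monitor.
print(render_resolution(3840, 2160, "Quality"))  # -> (2560, 1440): native-res render,
                                                 #    upscaled to 4K, downsampled to 1440p
print(render_resolution(2560, 1440, "Quality"))  # -> (1707, 960): plain 1440p output
```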

One of the games where developers understood perfectly how higher than native rendering could be harnessed is Titanfall 2. The game can dynamically increase rendering resolution beyond native as long as performance stays within a certain (configurable) threshold. The result is just great as it melts certain aliasing artifacts, and I wish more games made use of this technique.
 

coercitiv

Diamond Member
Jan 24, 2014
6,256
12,189
136
I'm not on board as taking "a poor AA implementation" as an acceptable baseline...
I think you read too much into my comment. We still judge games based on IQ, and AA implementation is a big part of that.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
Don't take the bait. There's a far more productive discussion to be had here, and it starts by acknowledging both the strengths and weaknesses of upscalers. Personally I'm inclined to say that DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes compensating for a poor AA implementation, saving power when playing with a frame cap, or locking in a fluid FPS while keeping desired IQ settings at a higher level.

It has such a wide array of uses that it's hard to imagine not wanting it as an option today.

I think I said this before, but DLSS suffered greatly at launch not only due to technical limitations and weak IQ results, but also because it was used as a crutch for RT. It was pushed out half-baked to absorb the RT performance hit, and it took a while for the tech to stand on its own feet as people and media began to use it independently of RT effects. Nowadays it is a good tool to prolong the life of a card, or simply to enable higher graphical settings on a system that is very close to fluid performance but still needs a bit of a push to lock in the desired FPS.

DLSS 1.0 was massively oversold and massively underdelivered, and on top of that it had the ridiculous requirement of being trained individually for each game.

The first half-decent version was a one-off shipped as an update to the game Control: a custom DLSS 1.x that didn't actually use any Deep Learning. It was just a shader program; perhaps ironically, it was essentially FSR 2.0.

From there, they applied the same logic used in that custom Control version and enhanced it with Deep Learning.

I expect that next-generation AMD cards (RDNA 4) will have dedicated Deep Learning units, and that FSR 4.0 will use Deep Learning. FSR 3.0 will add Fake Frame Generation...
 

Kocicak

Senior member
Jan 17, 2019
982
973
136
... DLSS is worth enabling when there's a benefit to be had (for most people there is), and this includes ... saving power when playing with a frame cap,

Does anybody have experience with frame generation turned on with limited frame rate?

In my limited experience, I found DLSS with frame generation works properly only if the frame rate isn't capped by the monitor. When the cap is lower than what DLSS with frame generation could deliver, it does not work.

Has anybody else experienced this or am I doing something wrong?
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,237
5,244
136
Does anybody have experience with frame generation turned on with limited frame rate?

In my limited experience, I found DLSS with frame generation works properly only if the frame rate isn't capped by the monitor. When the cap is lower than what DLSS with frame generation could deliver, it does not work.

Has anybody else experienced this or am I doing something wrong?

HWUB in their extensive testing said Frame Generation was a bad idea if you are capped by your monitor.

The useful range of frame generation is kind of narrow. You need a decent starting frame rate so the latency penalty isn't too large (or a title where latency doesn't matter), and then you need headroom on your monitor so the extra frames aren't limited.
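A rough back-of-the-envelope check of that narrow range, under the simplifying assumptions that frame generation roughly doubles the presented frame rate and adds on the order of one extra base frame of latency (real behaviour varies by game, GPU load, and Reflex settings, so treat the numbers as illustrative):

```python
# Rough sketch: is frame generation worth it for a given base FPS and monitor refresh?
# Assumes FG ~doubles presented FPS and adds roughly one base frame of latency;
# actual behaviour depends on the game, GPU load, and Reflex settings.
def frame_gen_check(base_fps, monitor_hz, min_base_fps=60):
    presented = min(2 * base_fps, monitor_hz)   # capped by the display
    extra_latency_ms = 1000 / base_fps          # ~one additional base frame held back
    headroom_left = presented > base_fps * 1.5  # did we actually gain much on screen?
    ok = base_fps >= min_base_fps and headroom_left
    return presented, extra_latency_ms, ok

for base, hz in [(40, 144), (80, 144), (80, 60), (100, 240)]:
    presented, lat, ok = frame_gen_check(base, hz)
    print(f"base {base} fps on {hz} Hz -> ~{presented} fps shown, "
          f"+~{lat:.0f} ms latency, worthwhile: {ok}")
```

Under these assumptions, a low base frame rate fails on latency, and a display-capped setup fails on headroom, which is basically HWUB's conclusion.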
 
Reactions: Mopetar

Kocicak

Senior member
Jan 17, 2019
982
973
136
It is a pity if it really does not work, because, for example, in Hogwarts Legacy, with my graphics card and settings the FPS outdoors can drop to 80, but indoors it can go over 130. I do not need that; I would be happy with 100 and save energy instead. I need frame generation ON, because without it the FPS would drop to 40 outside, which does not look very good, and it is also outside the range of my monitor.
 