Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 53

Heartbreaker

Diamond Member
Apr 3, 2006
4,238
5,244
136
Yeah, I think my initial assessment of DLSS3 was correct. I see absolutely no reason to use it over DLSS2.x or native. Also, with the launch of Ada/DLSS3, Digital Foundry has confirmed beyond all doubt that they're, uhm, rather partial to Nvidia. Can't take them seriously at all.

This DF analysis clearly felt like it was bending over backwards to keep NVidia happy.

Even without a Quid Pro Quo, there is a tendency to not want to bite the hand that feeds you an early exclusive.
 

Saylick

Diamond Member
Sep 10, 2012
3,217
6,585
136
This DF analysis clearly felt like it was bending over backwards to keep NVidia happy.

Even without a Quid Pro Quo, there is a tendency to not want to bite the hand that feeds you an early exclusive.
Nvidia has enough influence and clout that there rarely needs to be a formal "I help you, so you help me" agreement for them to exert it, e.g. that whole Hardware Unboxed debacle.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,238
5,244
136
Since Samsung has been doing this specifically for gaming (Game Motion Plus) on their TVs for years, NVidia's claim that you need a 4000 series card to do this seems unlikely.

If NVidia makes much marketing headway with this, I bet AMD will code up an equivalent that works with current cards.
 

linkgoron

Platinum Member
Mar 9, 2005
2,317
833
136
Since Samsung has been doing this specifically for gaming (Game Motion Plus) on their TVs for years, NVidia's claim that you need a 4000 series card to do this seems unlikely.

If NVidia makes much marketing headway with this, I bet AMD will code up an equivalent that works with current cards.
I assume that Game Motion Plus waits for the next frame and puts something in-between while Nvidia tries to create a new image before the next frame is available.
 

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,197
136
I assume that Game Motion Plus waits for the next frame and puts something in-between while Nvidia tries to create a new image before the next frame is available.
Not according to the Digital Foundry video. They state the method involves rendering 2 "traditional" frames and then generating one intermediary frame to go between them.
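
To make the ordering concrete, here's a toy Python sketch of that interpolation reading (frame times and pacing are my own assumptions for illustration, not anything Nvidia or DF have published):

```python
# Toy timeline contrasting the two readings above: extrapolation would guess
# a new frame before the next real one exists, while interpolation (the DF
# description) waits for the next real frame and slots a generated frame
# between the two. All timings are illustrative assumptions, not measured
# DLSS 3 behaviour.

frame_ms = 16.7  # real render interval at ~60 fps (assumed)

def interpolation_events(n_real):
    """Yield (display_time_ms, label) for interpolated output."""
    for i in range(n_real - 1):
        b_done = (i + 2) * frame_ms  # real frame i+1 finished rendering
        # Real frame i can't be shown the moment it's done: the generated
        # frame between i and i+1 needs both, so display shifts later by
        # roughly half a real frame interval.
        yield b_done - frame_ms / 2, f"real frame {i}"
        yield b_done, f"generated frame {i}.5"

for t, label in interpolation_events(4):
    print(f"{t:6.1f} ms  {label}")
```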
 
Reactions: Mopetar

CakeMonster

Golden Member
Nov 22, 2012
1,394
503
136
Even if DLSS3 is not acceptable for extreme enthusiasts, I hope it will contribute to driving high refresh 4K monitors to market.
 

exquisitechar

Senior member
Apr 18, 2017
657
872
136
IMO the only fair comparison is DLSS 3 with Frame generation enabled vs disabled on the same card.

Since the whole point of the comparison is to see how much latency frame generation adds.
I agree. I see people in other places saying: "Sure, it adds latency, but with Nvidia Reflex on, it isn't bad at all".

The thing is, every DLSS3 game supports Reflex and DLSS2. DLSS3 is essentially DLSS2 + frame generation. You can use the rest without using frame generation. So, as you say, the real comparison is between frame generation vs no frame generation, or simply put, DLSS3 vs DLSS2. And I don't see DLSS3 holding up in that comparison, since it has worse latency and degrades image quality by adding artifacts. Is the visual fluidity worth it? Maybe I'd change my mind if I used it myself.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
The latency increase was a given, but they're not even using forward projection, just inserting an "AI" frame between 2 real frames.

So I can either buy an overpriced 40x0 furnace monstrosity where it works in exactly three games, or an el-cheapo "smoothing" TV where it works everywhere.

Hmmm, decisions, decisions...
 
Reactions: Tlh97 and Stuka87

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
The more we learn, the less confusion there is.

It works just like Samsung's "Game Motion Plus" and other TV smoothing.

The initial confusion was from NVidia's misleading overhype.

To be fair, it should work much better than the more basic extant algorithms for doing this.

They're doing a lot of computational work of a sort that NNs are brilliant at. With the images on both sides to work with, the results should presumably be very good in terms of image quality. They probably run at least part of the intermediate frame generation at the same time as working out the next 'real' frame.

As to whether the results are worth it, I've no idea.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
Priorities matter.
Of course.

They were pushing G-Sync with 4K monitors because real 4K was a big deal, but then DLSS arrived and suddenly real 4K didn't matter, only imaginary "AI" resolutions.

Also, they were more than happy with the 3080 10GB vs the 2080 8GB in Doom Eternal Ultra textures because 8GB was being squeezed. But then the 3070/3070 Ti came out and suddenly texture quality didn't matter and it was quite "reasonable" to dial back the setting.

And now they were pushing ultra-low latency... until DLSS 3.0 arrived.
 
Last edited:

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
The latency increase was a given, but they're not even using forward projection, just inserting an "AI" frame between 2 real frames.

So I can either buy an overpriced 40x0 furnace monstrosity where it works in exactly three games, or an el-cheapo "smoothing" TV where it works everywhere.

Hmmm, decisions, decisions...

I am 100% certain that, now that Nvidia has done it:

-AMD will do something similar: an Adrenalin toggle that will work in everything, but with slightly lower image quality (FSR 3.0?).
-Intel will do the same in XeSS 2.0 or something.
-Some third-party hacker/whatever dev will do the same with a third-party app, like the devs behind the "Magpie" and/or "Lossless Scaling" apps.

I did turn on motion smoothing on one of my old TVs and the lag was... bad... Maybe newer models are better.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,238
5,244
136
To be fair, it should work much better than the more basic extant algorithms for doing this.

They're doing a lot of computational work of a sort that NNs are brilliant at. With the images on both sides to work with, the results should presumably be very good in terms of image quality. They probably run at least part of the intermediate frame generation at the same time as working out the next 'real' frame.

As to whether the results are worth it, I've no idea.

Some TVs do have a lot of lag doing this, as they have a lot of lag doing everything.

I haven't researched it extensively (because I never cared), but at least with Samsung's "Game Motion Plus", it only adds about the same half frame of lag that NVidia does.

So timing at least seems to be a non-issue for little ARM SoCs in TVs.

Now quality I haven't examined, but as defenders of the NVidia fake frames have been pointing out, these frames are only on screen for a short period, so that covers for the artifacts...

I will note that if I search on YouTube for "Samsung Game Motion Plus", I only seem to find positive reviews.

Bottom line here is that this seems to be easily done on a low-powered ARM SoC, with no more lag than the RTX 4000 series, at acceptable quality. Which means it can be done on a potato in the PC CPU/GPU world.

NVidia's claim that only RTX 4000 can do this seems to be BS.
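
To put rough numbers on that half frame of lag (my own back-of-envelope, taking the half-frame figure at face value and applying it to the real, pre-generation frame rate):

```python
# Back-of-envelope only: what "about half a frame of lag" works out to in
# milliseconds at a few base (pre-generation) frame rates. Assumes the
# half-frame figure applies to the real render interval.

for base_fps in (30, 60, 120):
    frame_ms = 1000 / base_fps
    print(f"{base_fps:>3} fps real -> ~{0.5 * frame_ms:.1f} ms added latency")
```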
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
Some TVs do have a lot of lag doing this, as they have a lot of lag doing everything.

I haven't researched it extensively (because I never cared), but at least with Samsung's "Game Motion Plus", it only adds about the same half frame of lag that NVidia does.

So timing at least seems to be a non-issue for little ARM SoCs in TVs.

Now quality I haven't examined, but as defenders of the NVidia fake frames have been pointing out, these frames are only on screen for a short period, so that covers for the artifacts...

I will note that if I search on YouTube for "Samsung Game Motion Plus", I only seem to find positive reviews.

Bottom line here is that this seems to be easily done on a low-powered ARM SoC, with no more lag than the RTX 4000 series, at acceptable quality. Which means it can be done on a potato in the PC CPU/GPU world.

NVidia's claim that only RTX 4000 can do this seems to be BS.
But but Nvidia magic sauce, mumbo jumbo, AI, machine learning, algorithmic computing, DLSS, did I mention AI...
 

yottabit

Golden Member
Jun 5, 2008
1,369
229
116
Back to raw raster performance for a second

Is it unreasonable to think the 4080 16 GB would have over 2x the 4K gaming performance of a 3070? I realize I'm comparing across tiers here, but I bet others will be too, since some of us had to pay scalper prices for the 3xxx cards.

Checked some benches from Tom's, and the 3090 Ti is already 1.7x 3070 performance in composite fps at 4K, which is what I care about.

Compared to the 3070, the 4080 16 GB has 1.65x the cores and 1.42x the boost clocks.
That's 2.3x multiplicative raw performance, excluding any architectural improvements but also assuming perfect scaling.

I'd like to see 2x raster perf at 4K and in VR, and I'm not sure whether I should be budgeting for the 4090 or the 4080 16 GB. I like the lower TDP of the 4080 as well, and would be undervolting either card. The only thing worrying me is the 256-bit memory bus on the 4080 vs 384-bit on the 4090. Is there any evidence of that having an effect on the 3xxx cards?
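
For what it's worth, here's the rough multiplier math I'm doing (it assumes perfect scaling and no per-core architectural gains, so treat it as a ceiling, not a prediction):

```python
# Upper-bound scaling estimate for the 4080 16 GB vs the 3070, using the
# spec-sheet ratios above (1.65x cores, 1.42x boost clock). Perfect scaling
# assumed, so this is a ceiling, not an expected result.

core_ratio = 1.65
clock_ratio = 1.42
print(f"theoretical ceiling: {core_ratio * clock_ratio:.2f}x a 3070")  # ~2.34x

# Given the ~1.7x a 3090 Ti already shows over a 3070 at 4K, hitting
# "over 2x a 3070" means beating the 3090 Ti by roughly:
print(f"needed over a 3090 Ti: ~{2.0 / 1.7:.2f}x")  # ~1.18x
```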
 

jpiniero

Lifer
Oct 1, 2010
14,688
5,318
136
Back to raw raster performance for a second

Is it unreasonable to think the 4080 16 GB would have over 2x the 4K gaming performance of a 3070? I realize I'm comparing across tiers here, but I bet others will be too, since some of us had to pay scalper prices for the 3xxx cards.

Checked some benches from Tom's, and the 3090 Ti is already 1.7x 3070 performance in composite fps at 4K, which is what I care about.

Based upon what nVidia said, it's about 20-25% faster than the 3090 Ti. Might be able to get just over double if you OC the memory but seems unlikely to be much more than that.
 

yottabit

Golden Member
Jun 5, 2008
1,369
229
116
Doesn't really matter how fast it is. 2x performance at 2.4x price is a regression on the performance/price metric; this is worse than Turing.

Do people really keep track of this? I never even heard of the term until Intel's Arc presentation lol

It wouldn't be fair to compare the 4080 against a 3070 in perf/$, since you are always paying a premium once you get past midrange.

If the 4080 12 GB has 1% better performance than a 3080 Ti, or 28+% better performance than a 3080 10 GB, then there was no perf/$ regression. And I really doubt there will be one, although admittedly it is not a super impressive uplift and I do think it's reasonable they might lower the MSRP in the future.
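
Quick sanity check on where that 28+% figure comes from, assuming launch MSRPs of $699 for the 3080 10 GB and $899 for the 4080 12 GB (street prices obviously move around):

```python
# Perf/$ parity arithmetic, assuming $699 (3080 10 GB) and $899 (4080 12 GB)
# launch MSRPs. Anything above this uplift is a perf/$ improvement, anything
# below it is a regression.

price_3080 = 699
price_4080_12gb = 899
parity_uplift = price_4080_12gb / price_3080 - 1
print(f"uplift needed over the 3080 10 GB for perf/$ parity: {parity_uplift:.1%}")
# -> ~28.6%
```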

Maybe I'm giving Nvidia too much leeway but I feel like armchair warriors are getting their pitchforks out without seeing any benchmarks. The "4070" right now is probably the 3080 10 GB for people that are shopping on a budget. Up until very recently you couldn't get 3xxx series at MSRP, and now there's a glut of them marked down.

Gamers, if they were being objective/pragmatic, should be happy, especially if there aren't any 4xxx features you care about and you prefer Nvidia or rely on their various technologies. And with inflation the way it is, if you expected cheaper cards, I have a bridge to sell you.

To me the performance uplift is what matters for buying a premium card, not so much the price, as long as it fits into my budget. But I'd prefer for it to not be a furnace in terms of TDP. I'm happy that all the cards offer substantially more perf/watt than their predecessors. I'll be curious whether any launch-day reviews explore undervolting the 4090, but it seems like the 4080 16 GB would be the card for me.

I think one of Nvidia's biggest mistakes is getting rid of the Titan brand and convincing gamers they need an xx90 card. The xx90 should really be marketed as a workstation card, with a similar market to Threadripper HEDT but for GPU compute tasks. If they called the 4080 (16 GB) a 4080 Ti or Super, I doubt people would be as upset.
 
Reactions: igor_kavinski

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Doesn't really matter how fast it is. 2x performance at 2.4x price is a regression on the performance/price metric; this is worse than Turing.

There has literally never been a 1:1 ratio for performance to price as you move up tiers. The higher the tier, the lower the performance per dollar.
 
Reactions: Mopetar