It could be AMD simply backtracking on their policy as a result of the criticism.
It would not be the first time they changed their tune after criticism, e.g. Ryzen 5000 support on X470 motherboards. What makes it more likely this time around is that AMD could have easily said they do not block features when this whole drama came about, but they did not. The same thing happened with the RTX 4080 12GB naming.
Generally, criticism of companies leads to positive outcomes. The fact that Jedi Survivor is finally getting DLSS support after AMD's statement about not blocking features suggests AMD changed its policy. This is also positive for AMD owners who use FSR, as competition forces AMD to make the technology better.
I'm surprised we haven't seen anyone make a connection between that situation and the Zen 4/AM5 stuff, as I think there's likely a direct correlation: AMD knew mobo makers were playing too fast and loose, which was exactly why they were worried about Ryzen 5000 support. If I remember correctly, they said the power states/power gating stuff was the reason they were going to restrict it, and didn't that turn out to be the issue with AM5? Now we see why AMD was apprehensive about offering that support.
No, it’s unambiguously a good thing. Let’s hope this signals a change and that we never have to argue over it again.
If only, but we'll find some other molehill to make into a mountain.
I find it hilarious that we are already at page 23 arguing about missing upscaling techniques in games.
Soon we won't even know what it means to play at native resolution, and well-optimized games will be only a myth.
I think that ship has already sailed. While I don't like how much of this has been done, I don't totally hate it either, and I think there are legitimate uses for it.
They finally added DLSS and frame generation to Jedi Survivor.
Native:
View attachment 85421
FSR:
View attachment 85422
DLSS quality:
View attachment 85420
Looks really good and stable. FSR is over-sharpened (which causes aliasing), lacks fine detail, and there's flickering. DLSS and native do not use any sharpening, which is good, since over-sharpening can't be undone after the fact. In any case, good thing I waited.
I like how the Nvidia one has around 50 more antennae than even native, which is far more noticeable than the relative sharpening/softness. I'm sure it's probably just a minor difference in proximity/LOD or something, but it shows how often such comparisons run into issues.
Yeah, I'm not going to enter into another DLSS good/DLSS bad debate; it's been done a million times and everyone is firmly dug in. My statement should read that they did 40%+ (and always increasing) of PC players a favor, so props to them.
That's a weird dismissal of their point while you explicitly agree with it by trying to twist it into a positive. Then again, saying you're not entering a debate and then going "yeah, that thing everyone hates about this, Nvidia deserves props for it!" is a bit... odd.
So true. The more people embrace upscalers, the lazier developers will get about optimizing their games. Users should be pushing back on developers by demanding native performance. Upscalers are a step backwards, not forwards.
I think upscaling has its place (like on the Steam Deck and similar devices), but it's absurd that it's basically being marketed as a headline feature of the newest (and often highest-end) hardware. What I also find weird is that it's being done in software instead of leveraging, say, the video processing blocks that are already on the cards.
I think developing it as a separate capability has other benefits. The streaming aspect is one, but imagine leveraging it to improve games themselves, not just to make them look more realistic. As we move toward Augmented Reality, we'll be overlaying rendered imagery on real-world imagery. And think of integrating people into games: Star Trek bridge-style games, D&D-style games, or interactive games like music rhythm games where one person plays a DJ and controls the game while others play. Games like Rock Band really change how streaming and game interactivity work together.
At this point I'm leaning towards this being typical marketing game-playing and FUD-spreading by team green. GN and HUB buying into this nonsense was just sad. I guess the whole consumer-watchdog ideal has clouded their analytical skills.
Edit to add, lending more weight to the theory that it's a Bethesda game and thus a wonky, sub-optimal bug fest: Starfield apparently has some I/O issues.
Yeah, the real question is why.
There's a difference between now and then: the initial question was legitimate, but this radicalization stage of "AMD is lying because I say so" is a different beast. Keep in mind both GN and HUB gave AMD ample time to respond; they didn't just take the bait.
I don't agree. I think they took the bait by even treating it as legitimate and choosing to make videos on it when they still had literally no evidence for the claim. They essentially got goaded into being Nvidia's developer outreach.
Hey, how about I make up some completely baseless speculation? Nvidia made a fuss about DLSS, which is why Starfield plays worse on AMD CPUs! Bethesda felt they had to sabotage AMD in some way so the dastardly duo's complicit agreement not to check a box enabling DLSS didn't get outed!