igor_kavinski
Lifer
- Jul 27, 2020
Except for the people who are overwhelmed by their gag reflex upon seeing the words AMD or Radeon.

I think the 6700xt is a better choice than the 3060.
The AMD cards I like are the 6800xt, 7900xt and 7900xtx. I would take a standard 6800 over a 6700xt any day. The 6800xt undervolted would have been nice if the prices went down to $350-400.
All this drama over a $300 computer part, lol.
Are these going to be available at retail, tomorrow the 29th?
5600xt 192bit bus (6GB) --> 6600 Xt 128bit bus (8GB)
5700xt 256bit bus (8GB) --> 6700xt 192bit bus (12GB)
6900xt 256bit bus (16GB) on a $1000 card, where AMD's previous high end had a 512bit bus (8GB) or a 384bit bus (3GB).
Ironically the reason it works is the same, but the response from the media is completely different.
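The bus-width comparison above is really a raw-bandwidth comparison; a minimal sketch of the arithmetic (the per-pin data rates are illustrative assumptions, not verified board specs):

```python
# Rough peak-bandwidth math behind the bus-width list above.
# Per-pin GDDR6 data rates below are illustrative assumptions.

def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

old = bandwidth_gbps(256, 14.0)  # e.g. a 256-bit card at ~14 Gbps
new = bandwidth_gbps(192, 16.0)  # e.g. a 192-bit card at ~16 Gbps

print(f"256-bit @ 14 Gbps: {old:.0f} GB/s")  # 448 GB/s
print(f"192-bit @ 16 Gbps: {new:.0f} GB/s")  # 384 GB/s
# The remaining deficit is what the big on-die cache (Infinity Cache
# on RDNA2, the enlarged L2 on Ada) is supposed to hide.
```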
I don't think that he is a shill, but that he simply stopped caring. He has become the kind of guy who simply buys the best GPU and CPU on the market, no matter the price or value, since he has way more money than time to even bother researching. And he doesn't like doing the research as a hobby either. As far as I can tell his biggest passion is the Blinkenlights anyway. What excites him is pretty things, not actual engineering.

Got around to checking in on Jay's review. His channel has developed a pattern. Shill. Get rekt by the community for shilling. The very next video is a mea culpa, with a bunch of woulda coulda shoulda, including an unbelievable claim that what you witnessed wasn't shilling, followed by excuses for everything that led to the accusations of shilling. Some of the excuses are "the dog ate my homework" bad.
He gives a TLDW = buy the 6700XT. He throws praise at ARC a couple of times too. It comes off like some serial cheater douche trying to give their S.O. flowers and candy, while begging for forgiveness on their knees, after getting caught.
If they dropped the price to $249, and still called it the 4060, reception would be MUCH more positive than it is now.
The price is MUCH more important than the name.
It's both; otherwise they could have called this a 4090 Ti and charged $350 for it. An xx90 card at only $350! Never before could you get such a deal on current generation parts.
But as we've been over dozens of times already this really is an xx50 tier part, but NVidia has tried to cash in on their own branding reputation by naming a lower-tier die as though it were something historically better. They deserve the scorn and even NVidia fans should expect better from the company they love.
So have there been a dozen discussions about the RX 7600 really being an RX 7500, then?
Ultimately all that matters is perf/$ and features. You could call the new card: "The Shirley", or you could call it the RX 4030 and sell it for over $400, and as long as there was a big jump in perf/$, people would cheer.
It is the ultimate reflection of the enthusiast market! It is a card that is outdated before it is even released. For $199 it would be decent, but at $300 it's garbage. It can't even do 1080p in The Last of Us. 2023: the year of 720p @ $300, yo!
No, because the 7600 isn't on Navi 34 and has more shaders than the 6600.
The 4060 is on the AD107 die, when the 3060 was on GA106, the 2060 on TU106, etc.
The first time I saw a screenshot from somebody on social media I thought it was a joke, but then I stumbled on the tweet from their official account:
View attachment 82401
Fun times, marketing go Brrrrrrrr... !
Inb4 DLSS4, where they allocate even more die area to the Optical Flow Accelerator and Frame Generation can do >1 interpolated frame between key frames. If it could do 2 in-between frames, that nets you 50% more fps than DLSS3. 3 in-between frames doubles the fps over DLSS3.

NVidia marketing really annoys me. They have flooded YT with Frame Generation comparisons for practically every game that has DLSS3. So I guess I shouldn't be surprised they stoop to a 0 FPS claim...
Oh, you're thinking small. DLSS4 will include game flow prediction. The NVidia data-center-on-a-chip will generate future frames using AI to predict your inputs based on other players, and display the results before you even press the button. Negative latency! Available for only $10/mo.
I'm going to really love the moment we see people enjoy 100 fps that has the latency of 25 native fps. /s
Precog games, here they come.
"Our AI neural network is trained on a bank of professional gamers! Trust that you will have the biggest competitive advantage with Nvidia Game Flow!"Oh you're thinking small. DLSS4 will include game flow prediction. The NVidia data-center-on-a-chip will generate future frames using AI to predict your inputs based on other players and display the results before you even pressed the button. Negative latency! Available for only $10/mo.