Originally posted by: LTC8K6
ATI didn't need a dual-slot cooler until now, so why would they have ever "pioneered" it?
Being the first to make a GPU that needed a dual-slot cooler could only be viewed as good by an NVidia supporter.
I guess Intel is better than AMD for making hotter CPUs first.
I don't think NV pioneered the dual-slot cooler anyway, or SLI either.
You're missing my point. ATi (and of course, the community spankers) were quick to jump on the bash-wagon when NV30+ cards started using a dual-slot solution. They went on about how terrible it was, etc.
And of course, now they're doing it.
Same deal with SLI. When Nvidia announced it, ATi came out all over the place saying "it's cost-prohibitive, there's no market for it, power consumption, diddly diddly da" - and now, as we all know, they have their own version in the works.
Don't misunderstand - Nvidia's PR machine is just as bad. They talk about the R300's "low precision" (24-bit vs. 32-bit FP) and its "lack of features" (no SM3.0, etc.).
The only real difference I've noted is that I have yet to see Nvidia turn around and replicate something they've chided ATi over. Nvidia, I imagine, could very easily have worked 24-bit FP precision into NV40 alongside the 16- and 32-bit options to put themselves on equal performance footing, but they didn't. I'm sure they could've bolted 12 extra pipes onto a previous core, but they didn't.
Both companies have skeletons - both have cheated on benchmarks, both FUD each other all over the place - but when you get down to the direction each company has chosen thus far, I just think Nvidia has made consistently good design decisions, excluding the major misstep that was the NV30 core.
But then, if we want to talk about major missteps, we can talk about just about anything pre-R300.
Really, it's a taste thing. Go with the features you want, at the price you want, from the company you want. There really aren't any wrong answers this generation - both chips put out good numbers with good IQ.