Anand R420 review analysis


Genx87

Lifer
Apr 8, 2002
41,091
513
126
Perhaps nVidia has the advantage at >1600x1200? Given that no review seems to go above that setting, it's going to be pretty tough to get anything concrete out of them in that regard.


Actually, some review of the 6800 did go up to 2048 in some game and was getting like 30 FPS. I can't remember which game it was, but I was surprised the game would be playable at that rez.
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Originally posted by: Acanthus
Originally posted by: Ackmed
Originally posted by: Edge3D

Well. That Ackmed fellow seems to want to paint me in fanboy colors, but I'm not. The proof is in the pudding: ATI just needs to deliver at 2048. Otherwise NV offers comparable performance with more of the features I want.
Is that fanboyism? And hilarious that Ackmed says "are the MASSES ignorant?" LOL, need I say more?
Owned by his own words.

Anyway, yes, I am "biased" towards having SM3.0 in my next $500 card. Who in their right mind, deep down, WOULDN'T want this feature?
If it's so meaningless, then why would ATI ever implement it? They should just forget about SM3.0, and DX9C should be erased from the earth because it is a useless feature. I mean, seriously. You have to be a fanboy to subscribe to that kind of ideology.
It's like 3dfx and 16-bit. Or Intel and 32-bit. Why NOT have more features for less or equal money? Silly. Just silly.

Unless, like I said, it blows it away at some res that was previously unplayable at all in a newer game like UT2K4 (2048x1536). THEN I'd forget about my SM3.0 that I hold so dear.

I'm not exactly sure who around here is an ATI fanboy, as you say, but it does appear that there are some. And it is funny watching them squirm under the SM3.0 gun. It just doesn't make sense, stepping out of the fanboyistic "NV vs ATI" circle, to see people actually trying to downplay an upcoming technology that one company doesn't have.

I didn't paint you as a fanboy; you have acted that way since you registered. I have one PC with a P4, one with an Athlon, one with a 9800XT, one with a 5900NU, one with a Creative sound card, one with a Philips one, one with Kingston ram, one with Corsair ram. The only thing that is the same in both PCs is WD hard drives. Oh no, I'm a WD fanboy!!!

So everyone who voted that ATi has the better card is ignorant now? Haha. Saying someone got "owned" on a forum is childish at best.

ATi didn't add it this generation because they said they don't think it's needed right now. I think they know better than you or I when it will be needed. Sure, it will be in future generations of their cards, but they don't think it's needed now. While personally I think it was a bad decision, it's what they did.

You seem to keep forgetting the added features that NV doesn't have, that ATi does. I'm not downplaying PS 3.0, but how can I get excited about it when we have not seen any benefit from it? If it was so great, why hasn't there been some showing of it? It's far from 16-bit vs. 32-bit, I think. If there were some screenshots or benchmarks of PS 3.0 being better than PS 2.0, it would be a lot easier to get excited.

For the 70th time, you learning-disabled troll: PS3.0 DOES NOT LOOK BETTER, IT IS FASTER.

Now, now, play nice kids!
 
Apr 14, 2004
1,599
0
0
If I am going to spend $400+ on a new card, I will want it to be fast in the games "I" play, which right now is Far Cry and BF: Vietnam.
Bingo. Arguing about the future (hint: SM3.0) is more than a waste of time. I'd prefer to buy cards for the games I play here and now. The effectiveness of SM3.0 is up in the air, and it's anyone's guess whether it's worthwhile or not.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: SickBeast
You are telling me you are running Far Cry at 1280x1024 4xAA/8xAF on a 9700 Pro at 50FPS?

YEAH RIGHT... keep dreaming. The 9700 Pro ONLY gets 49FPS at 800x600.

I was actually referring to UT2004...sorry...I should have been more clear. I guess you haven't read my other posts, I've basically said that Far Cry is my only reason for upgrading my graphics card at this point in time.

Sorry SickBeast, I didn't read your other posts and didn't figure that you posted about the wrong game by accident. Far Cry is an excellent game and I tend to agree with you, especially since I can't really play it well with an 8500, so I am in need of an upgrade ASAP. What Far Cry shows me, though, is that once a manufacturer releases a new engine that neither card has been optimized for, the card that has more raw power should run fastest. So I can't dismiss the huge advantage ATI cards hold over Nvidia in a completely new, stunning engine... What about all the new engines to come? Will Nvidia continue to release driver updates for all these games that it is slower in, just to catch up? What if it isn't a major game someone likes to play and Nvidia won't optimize for it? Hmm...
 

NoVo

Senior member
May 16, 2001
463
0
0
A vote for the 6800GT. For some reason this card appeals to me, even after happily owning a 9700PRO and 9800XT. I'm itching to jump back on the nV bandwagon.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Eagle17
Question : is doom3 openGL?

Yes. It's apparently a mostly "DX8.1"-ish engine (ie, uses some simple shaders) with "DX9"-ish lighting effects and stencil shadows. That's what I've gathered from statements that Carmack has made, anyway.

*Supposedly* the GeForceFX cards were built around this type of game engine, and if this is true, I would expect them to perform very well in Doom3. In particular, they can make full use of their 4x2 pipeline architecture when doing stencil shadow calculations. However, they tend to suck at anything resembling DX9/SM2.0 shader code, and Carmack himself said they had to program in mixed-precision (FP16/FP32, as opposed to pure FP32) code paths just for these cards so that they could run acceptably (as opposed to the 9700Pro/9800/9800Pro, which ran FP24 shaders just fine).

Both the R420 and NV40 should have plenty of horsepower to run Doom3. The NV40 may be *better*, but I doubt it will be a crushing victory.
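(For anyone wondering what the stencil shadow work actually looks like on the app side: below is a rough sketch of the classic depth-fail stencil pass in OpenGL. It's my own illustration of the general technique, not id's code; drawShadowVolumes() is a hypothetical stand-in for rendering the extruded volume geometry.)

[code]
// Sketch of a depth-fail stencil shadow pass ("Carmack's reverse").
// Assumes the scene was already rendered once with ambient light only,
// so the depth buffer is filled. drawShadowVolumes() is hypothetical.
#include <GL/gl.h>

void drawShadowVolumes();  // app-specific: draws extruded volume geometry

void stencilShadowPass() {
    glClear(GL_STENCIL_BUFFER_BIT);
    glEnable(GL_STENCIL_TEST);
    glStencilFunc(GL_ALWAYS, 0, ~0u);

    // Only the stencil buffer is updated during the volume passes.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glEnable(GL_CULL_FACE);

    // Back faces: increment stencil where the depth test *fails*.
    glCullFace(GL_FRONT);
    glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);  // (sfail, dpfail, dppass)
    drawShadowVolumes();

    // Front faces: decrement stencil where the depth test fails.
    glCullFace(GL_BACK);
    glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);
    drawShadowVolumes();

    // Re-enable writes; pixels with stencil == 0 are outside all volumes
    // and receive the full lighting pass.
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);
    glStencilFunc(GL_EQUAL, 0, ~0u);
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
}
[/code]

The volume passes are pure z/stencil fillrate, no shading at all, which is why the 4x2 arrangement matters so much for this style of renderer.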
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Ackmed
Originally posted by: Acanthus
For the 70th time, you learning-disabled troll: PS3.0 DOES NOT LOOK BETTER, IT IS FASTER.

First off, name-calling is childish.

Second, it's Edge3D that thinks it does; I've said many times it doesn't.

If you actually read posts, you would know this. From the first post on this page:

Originally posted by: Ackmed


Because quite simply, we don't know how PS 3.0 will affect games, if it indeed does. If it's going to give them such a huge boost in speed, why haven't we seen any tests with it? They seemed to want to brag about it so much; why didn't they release a benchmark to show how much faster it makes the NV4x? The quality is going to be the same, so it's all about speed.

No, my spin-doctoring friend. I never said PS3.0 increased IQ. Vertex Shader 3.0 compliance DOES.
Quit spinning this.
To my knowledge, only the Parhelia has vertex displacement mapping as the NV40 does. It's just way too slow anyway, as we all know.

For your information, PS3.0 and SM3.0 are NOT interchangeable.
Education: SM3.0 equals Vertex AND Pixel Shader 3.0.

Have you read about PS/VS3.0 at all? Do you know what you're talking about? It doesn't sound like it.

READ THIS OR QUIT POSTING ABOUT PS3.0 AS IF IT'S SM3.0. THEY ARE NOT INTERCHANGEABLE.
Clicky
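To make the distinction concrete: Direct3D 9 reports the two shader versions as separate caps, so a part can expose one without the other. A quick sketch in C++ (standard d3d9 caps query; error handling trimmed, and the function name is my own):

[code]
// Sketch: VS and PS versions are independent caps in D3D9.
// "SM3.0" means BOTH report 3.0; PS3.0 alone is not SM3.0.
#include <d3d9.h>

bool SupportsShaderModel3(IDirect3D9* d3d) {
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    const bool vs30 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
    const bool ps30 = caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
    return vs30 && ps30;  // full Shader Model 3.0 requires both
}
[/code]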

It's amusing, actually. I'm being branded as a fanboy when it seems to me that many are taking this stance for no apparent reason... fanboyism, maybe?
Not pointing fingers, but maybe a few need to do some soul-searching and untie themselves from a corporation that wants nothing but your dollars. I'm not saying anyone is; maybe you just haven't read enough about SM3.0 to realize that when I say SM3.0, it does NOT equal PS3.0 alone.

VERY comparable performance, and not having this feature? I don't understand that.
Now maybe if you bought a lower-end part I would understand... but since ATI doesn't have a dominant part other than the X800XT, that'd be a telling reason why it's being chosen for a purchase, eh?
Regardless, at $500 and not to have SM3.0... well, not for ME. I'm just prodding as to exactly WHY someone would make that trade-off for such a large amount of money and a few FPS that don't matter in today's, yesterday's (which it sadly gets schooled in, even by the GT), and tomorrow's games.
Sorry guys, but a card like that getting owned in those 3 old games is not cool. Even with beta drivers.
Note: Expect D3-engine-based games, which will likely be as relevant as Q3A has been for as long as it has been, to run MUCH better on the NV40 hardware. "Crushing?" I dunno. But certainly, noticeably better. Looking at that game and merely uttering the word "UltraShadow" is enough to come to that educated conclusion.

It's a price premium that's a waste, unless you want the only ATI card worth purchasing, IMO. Because a GT surely could fit everyone's needs, or an Ultra, OR an Ultra Extreme (ALL with SM3.0)... UNLESS the X800XT plays at 2048... then I'd understand!
But I highly doubt it has the power to do so in an older-tech game like UT2K4, even without AA and/or AF.

No one has answered this for me:

1. If Vertex and Pixel Shader 3.0 is so useless and 2.0 is fine and dandy, then WHY is ATI planning to implement it in future R420 revisions if there is no need? Please, I'm all ears.
To Ackmed's trusted "masses": bring it on.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Matthias99
Originally posted by: Eagle17
Question : is doom3 openGL?

Yes. It's apparently a mostly "DX8.1"-ish engine (ie, uses some simple shaders) with "DX9"-ish lighting effects and stencil shadows. That's what I've gathered from statements that Carmack has made, anyway.

*Supposedly* the GeForceFX cards were built around this type of game engine, and if this is true, I would expect them to perform very well in Doom3. In particular, they can make full use of their 4x2 pipeline architecture when doing stencil shadow calculations. However, they tend to suck at anything resembling DX9/SM2.0 shader code, and Carmack himself said they had to program in mixed-precision (FP16/FP32, as opposed to pure FP32) code paths just for these cards so that they could run acceptably (as opposed to the 9700Pro/9800/9800Pro, which ran FP24 shaders just fine).

Both the R420 and NV40 should have plenty of horsepower to run Doom3. The NV40 may be *better*, but I doubt it will be a crushing victory.

Doom 3 is the only game engine with full per-pixel dynamic shadowing and lighting. Even the much-touted HL2 still uses light and shadow maps. For everyone's information... that was the technology used in Quake 2!
It's not really comparable.


/sigh
And you are incorrect. They are now running all cards on the standard ARB2 path. Do you guys read the "news"? Link
NV has improved their drivers to the point where it's now just as fast in ARB2 as the NV30-specific mixed-mode path.

It sucks seeing OGL compared to DX versions. Anything that can be done in DX9 can be done in OGL, and it's portable to Linux/Mac/whatever, not stuck on the Windows platform.
Like I said earlier, I'm not a fanboy of ANYTHING, but interoperability IS very cool.

So I guess I'm branded an "advanced technology fanboy" (i.e., SM3.0), and an interoperability fanboy who desires Mac/Linux/Wintel/whatever-else-might-become-incredibly-popular, play-friendly goodness?
Shame on me.
Must be a character flaw.
 

MemberSince97

Senior member
Jun 20, 2003
527
0
0
Originally posted by: Matthias99
Both the R420 and NV40 should have plenty of horsepower to run Doom3. The NV40 may be *better*, but I doubt it will be a crushing victory.

Originally posted by: Edge3D
So I guess I'm branded an "advanced technology fanboy" (i.e., SM3.0), and an interoperability fanboy?
Shame on me.
Must be a character flaw.

I don't know, but dammit, I'm tired of waiting on these games.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Edge3D

No, my spin-doctoring friend. I never said PS3.0 increased IQ. Vertex Shader 3.0 compliance DOES.

VS3.0 compliance *could*. Displacement mapping and other complex vertex shader programs might also tank performance on the NV40; I haven't seen any numbers.

Note: Expect D3-engine-based games, which will likely be as relevant as Q3A has been for as long as it has been, to run MUCH better on the NV40 hardware. "Crushing?" I dunno. But certainly, noticeably better. Looking at that game and merely uttering the word "UltraShadow" is enough to come to that educated conclusion.

Based on... what? The 9800Pro and R420 can do stencil shadows, too, ya know. And I have yet to hear of a major game that will be based on the D3 engine, whereas several companies have licensed Valve's Source engine already. Of course, NEITHER GAME IS OUT YET.

Doom 3 is the only game engine with full per-pixel dynamic shadowing and lighting. Even the much-touted HL2 still uses light and shadow maps. For everyone's information... that was the technology used in Quake 2!
It's not really comparable.

Different lighting models are not necessarily better. HL2 uses a very different rendering system, with lots of little shaders to handle its dynamic lighting needs.

/sigh
And you are incorrect; they have dropped the mixed-mode precision. They are now running all cards on the standard ARB2 path. Do you guys read the "news"? Link
NV has improved their drivers to the point where it's now just as fast in ARB2 as the NV30-specific mixed-mode path.

No, I hadn't heard that. You'd think that NVIDIA would have maybe mentioned this and/or made a big deal out of it? This was also FOUR DAYS AGO; it's not exactly old news.

In any case, here are more specific comments from Carmack in the discussion thread linked to off of Beyond3D:

discussion

There are two things nvidia's drivers have done:
a) Take advantage of the ARB_precision_hint_fastest hint in fragment programs. IIRC, until the 50-series drivers this was ignored and the nv3x computed everything at full 32-bit precision. Now they take advantage of this hint and run as fast as possible.
b) Added a compiler directly into their drivers to extract better parallelism from fragment programs.

You can verify those changes yourself. These have eliminated the need for writing a specific nv3x path manually specifying low precision for speed. You can now use a standard shader and make it fastest if you want speed, or nicest if you want full 32-bit image quality. Those with old drivers will of course suffer when using the ARB path where they wouldn't with the nv3x path, but that's what you get for not using modern drivers.

nvidia has two parts to their compiler. First is a generic optimizing compiler that takes any shader and makes it run as well as possible on the nv3x or nv4x. This is very important; the r3x0, nv3x, r4x0, and nv4x all have different optimizations, just as code can be optimized for a Pentium 4, Athlon, or Pentium III. Writing code that performs well on four different architectures (plus S3, XGI, Intel, etc.) is going to be a trade-off.

The 2nd part is a bit dirty. They can detect a specific set of instructions and replace it with another set. This is so that if a benchmark company threatens to write a shader that is really difficult for them to optimize in a generic way unless nvidia pays big $ to join their beta program, they can just replace it with a more sensible set of instructions that presumably produces the same result.

I.e., they didn't really improve their speed; they just made it so it runs at lower precision automatically, instead of Carmack having to tell them to do it explicitly via a different codepath. No big changes here; it's still running mixed-mode FP16/FP32, just doing it in the driver instead of in the game code itself.

There's actually quite a bit of good discussion in that thread about how Doom3's shading works, and how it compares to HL2. Perhaps you should take a look.
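For reference, the hint itself is just one OPTION line in an ARB fragment program; the driver decides what precision to actually run at. A minimal sketch (the shader is a made-up one-instruction example, and the extension entry points are assumed to be resolved already):

[code]
// Sketch: an ARB fragment program carrying the "fastest" precision hint.
// Old NV drivers ignored the hint and ran everything at FP32 on NV3x;
// the 50-series drivers use it to drop to partial precision.
#include <GL/gl.h>
#include <GL/glext.h>   // GL_FRAGMENT_PROGRAM_ARB, etc.
#include <cstring>

static const char* kFragProg =
    "!!ARBfp1.0\n"
    "OPTION ARB_precision_hint_fastest;\n"  // "run as fast as possible"
    "TEX result.color, fragment.texcoord[0], texture[0], 2D;\n"
    "END\n";

void loadFragmentProgram(GLuint prog) {
    // glBindProgramARB/glProgramStringARB must be fetched through the
    // platform's extension mechanism (e.g. wglGetProcAddress) first.
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(kFragProg), kFragProg);
}
[/code]

Swapping that line for ARB_precision_hint_nicest requests full precision instead; that's the "fastest if you want speed, nicest if you want full 32-bit image quality" trade-off described above.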

It sucks seeing OGL compared to DX versions.

Then what the hell *should* I compare it to? It uses shader code that has features comparable to the ones found in SM1.4 and SM2.0. Sorry if I've offended you by mentioning DirectX in the same sentence as OpenGL.

So I guess I'm branded an "advanced technology fanboy" (i.e., SM3.0).

It's starting to look that way.

1. If Vertex and Pixel Shader 3.0 is so useless and 2.0 is fine and dandy, then WHY is ATI planning to implement it in future R420 revisions if there is no need? Please, I'm all ears.

It *won't* be useless in the future, when a) there are actually games that have meaningful support for it, and b) the cards are fast enough to actually use its nifty new features (super-long shaders with dynamic branching, hardware displacement mapping, etc.). For now (until proven otherwise), it's a paper feature, much like the SM2.0 support in the FX5200.
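To illustrate what those features buy you: ps_3_0 supports real per-pixel dynamic branching, where a ps_2_0 compiler generally has to flatten an "if" and execute both sides. A hedged C++ sketch using the stock D3DX compiler (the HLSL is a toy example of mine, not from any game):

[code]
// Sketch: compiling a ps_3_0 shader whose dynamic branch can actually
// skip work per pixel. Targeting ps_2_0 instead would typically force
// the compiler to flatten the branch (or reject the loop entirely).
#include <d3dx9.h>
#include <cstring>

static const char* kToyShader =
    "sampler s0 : register(s0);\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR {\n"
    "    float4 c = tex2D(s0, uv);\n"
    "    if (c.a > 0.5) {\n"   // dynamic branch on texture data
    // tex2Dlod is used inside the branch because gradient-based
    // sampling isn't allowed under dynamic flow control:
    "        for (int i = 0; i < 16; ++i)\n"
    "            c.rgb += tex2Dlod(s0, float4(uv + i * 0.01, 0, 0)).rgb * 0.05;\n"
    "    }\n"
    "    return c;\n"
    "}\n";

HRESULT CompileToyPS30(LPD3DXBUFFER* bytecode) {
    LPD3DXBUFFER errors = NULL;
    HRESULT hr = D3DXCompileShader(kToyShader, (UINT)strlen(kToyShader),
                                   NULL, NULL, "main", "ps_3_0",
                                   0, bytecode, &errors, NULL);
    if (errors) errors->Release();
    return hr;
}
[/code]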

Plus, it might actually shut up all the people going "Ooh, look, NVIDIA has SM3.0 and ATI only has 2.0! ATI sucks!"
 

ChkSix

Member
May 5, 2004
192
0
0
Why is he ignorant for stating his own points of view? I don't get that. I hope this can be an open discussion where we all win by learning more about both cards. Constructive criticism, even with different points of view, is how any open forum should be conducted. I just came from a place where fanboyism was the rule, and I hope that doesn't prevail here.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Matthias99
Originally posted by: Edge3D

No, my spin-doctoring friend. I never said PS3.0 increased IQ. Vertex Shader 3.0 compliance DOES.

VS3.0 compliance *could*. Displacement mapping and other complex vertex shader programs might also tank performance on the NV40; I haven't seen any numbers.

Right. But that isn't what I said.
The whole point of my existence here has been to tell some of you that VS3.0 DOES bring HUGE IQ gains when used.
That is all. It's nice to see at least you agree with the facts of the matter... /phew!

Maybe Ackmed will listen to you, now that you agree.
What's with the defensive nature over SM3.0? I don't get it. It's not about ATI vs. NV. It's about the consumer.
It just seems like a political argument around these parts, not a technical one.


Note: Expect D3-engine-based games, which will likely be as relevant as Q3A has been for as long as it has been, to run MUCH better on the NV40 hardware. "Crushing?" I dunno. But certainly, noticeably better. Looking at that game and merely uttering the word "UltraShadow" is enough to come to that educated conclusion.

Based on... what? The 9800Pro and R420 can do stencil shadows, too, ya know. And I have yet to hear of a major game that will be based on the D3 engine, whereas several companies have licensed Valve's Source engine already. Of course, NEITHER GAME IS OUT YET.

UltraShadow2 is not "just" stencil shadows. Play Splinter Cell on an NV3X and then an R3XX and you'll see the difference.
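(On the OpenGL side, the UltraShadow hardware is exposed through the EXT_depth_bounds_test extension: the app tells the driver the depth range a light can possibly affect, and fragments outside it are rejected before any stencil work. A rough sketch of mine, assuming the entry point has been resolved via the extension mechanism:)

[code]
// Sketch: bounding stencil-volume fill with the depth bounds test
// (GL_EXT_depth_bounds_test -- the mechanism behind "UltraShadow").
#include <GL/gl.h>
#include <GL/glext.h>

void drawVolumesWithDepthBounds(double zMin, double zMax) {
    // zMin/zMax: window-space depth range the light's volume can touch,
    // computed by the app; fragments outside it are discarded early.
    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
    glDepthBoundsEXT(zMin, zMax);

    // ...draw the shadow volumes here, as in a normal stencil pass...

    glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}
[/code]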

Doom 3 is the only game engine with full per-pixel dynamic shadowing and lighting. Even the much-touted HL2 still uses light and shadow maps. For everyone's information... that was the technology used in Quake 2!
It's not really comparable.

Different lighting models are not necessarily better. HL2 uses a very different rendering system, with lots of little shaders to handle its dynamic lighting needs.

Hehe. "different" lighting models?? Interesting way of putting it.
Thats like saying SM2 is merely "different" than SM3.. not inferior as it is in reality.

/sigh
And you are incorrect; they have dropped the mixed-mode precision. They are now running all cards on the standard ARB2 path. Do you guys read the "news"? Link
NV has improved their drivers to the point where it's now just as fast in ARB2 as the NV30-specific mixed-mode path.

No, I hadn't heard that. You'd think that NVIDIA would have maybe mentioned this and/or made a big deal out of it? This was also FOUR DAYS AGO; it's not exactly old news.

In any case, here are more specific comments from Carmack in the discussion thread linked to off of Beyond3D:

discussion

There are two things nvidia's drivers have done:
a) Take advantage of the ARB_precision_hint_fastest hint in fragment programs. IIRC, until the 50-series drivers this was ignored and the nv3x computed everything at full 32-bit precision. Now they take advantage of this hint and run as fast as possible.
b) Added a compiler directly into their drivers to extract better parallelism from fragment programs.

You can verify those changes yourself. These have eliminated the need for writing a specific nv3x path manually specifying low precision for speed. You can now use a standard shader and make it fastest if you want speed, or nicest if you want full 32-bit image quality. Those with old drivers will of course suffer when using the ARB path where they wouldn't with the nv3x path, but that's what you get for not using modern drivers.

nvidia has two parts to their compiler. First is a generic optimizing compiler that takes any shader and makes it run as well as possible on the nv3x or nv4x. This is very important; the r3x0, nv3x, r4x0, and nv4x all have different optimizations, just as code can be optimized for a Pentium 4, Athlon, or Pentium III. Writing code that performs well on four different architectures (plus S3, XGI, Intel, etc.) is going to be a trade-off.

The 2nd part is a bit dirty. They can detect a specific set of instructions and replace it with another set. This is so that if a benchmark company threatens to write a shader that is really difficult for them to optimize in a generic way unless nvidia pays big $ to join their beta program, they can just replace it with a more sensible set of instructions that presumably produces the same result.

I.e., they didn't really improve their speed; they just made it so it runs at lower precision automatically, instead of Carmack having to tell them to do it explicitly via a different codepath. No big changes here; it's still running mixed-mode FP16/FP32, just doing it in the driver instead of in the game code itself.

There's actually quite a bit of good discussion in that thread about how Doom3's shading works, and how it compares to HL2. Perhaps you should take a look.

I was already aware of all this stuff. It IS good that it's in the driver, though. Believe me, I am no NV3X fan... I was just stating the facts.

It sucks seeing OGL compared to DX versions.

Then what the xxxx *should* I compare it to? It uses shader code that has features comparable to the ones found in SM1.4 and SM2.0. Sorry if I've offended you by mentioning DirectX in the same sentence as OpenGL.

Well. I was just being a biyatch about it. I prefer OGL. I'm not a programmer... but I like its interoperability. Who in their right mind wouldn't?
*cough*fanboys*cough*
Or a programmer with a LEGITIMATE reason, one that only a graphics programmer could personally provide.
Even then, that person, whoever it may be, PALES in comparison to JC's abilities.

So I guess I'm branded an "advanced technology fanboy" (i.e., SM3.0).

It's starting to look that way.

Thank you.

1. If Vertex and Pixel Shader 3.0 is so useless and 2.0 is fine and dandy, then WHY is ATI planning to implement it in future R420 revisions if there is no need? Please, I'm all ears.

It *won't* be useless in the future, when a) there are actually games that have meaningful support for it, and b) the cards are fast enough to actually use its nifty new features (super-long shaders with dynamic branching, hardware displacement mapping, etc.). For now (until proven otherwise), it's a paper feature, much like the SM2.0 support in the FX5200.

Thank you again. My point is now well illustrated by someone other than me. I wouldn't disagree with what you said, but you are intelligent enough to see what it is at LEAST capable of.
I grant you that its worth remains to be proven.
But it cannot be denied that those NV cards have comparable performance across the board and offer a feature that could bring HUGE performance gains and drastic increases in IQ.
I don't think any other "single" feature released with the new generation of cards can say the same, from the data I've analyzed.

Plus, it might actually shut up all the people going "Ooh, look, NVIDIA has SM3.0 and ATI only has 2.0! ATI sucks!"

What does it have to do with "ATI sucks"? No one said that at all.
It's the defensive nature that makes people who are genuinely excited about the best technology appear to be against ATI... because people who are more partial to ye olde Canadian ATI than they should be take it as a slam against "their" product.
My god, if it's not the product to have, don't lock yourself into some fanboy mode and buy it anyway. What's the point? In five years, is ATI going to cut you a check for being loyal? No, they are likely to burn you sooner rather than later, by making you into their sheep and selling you an inferior product over their competition. Like it's clear the NV30 was, or this X800 Pro, which has little reason for purchase as well. It can't even be modded to more pipes like previous cards (9500).
Or on some forum are you going to brag about how many ATI cards you've run? Or is it going to help ATI?
Very doubtful. The market is based on the lower end to midrange, which honestly considers performance nearly last for sales. Checkbox features DO help there.
The enthusiast market is supposed to jump on the BEST, most advanced, as well as fastest hardware... which is fine if you think ATI is the ticket in those departments... but downplaying Shader Model 3.0 seems VERY odd indeed. That's all I'm after stamping out. Ridiculous, honestly. Especially since, as you said in your own words, ATI is moving to it as well!

And in all honesty, NV has had pretty much "DX9C"-class hardware since the NV30... not totally, but MANY of its features were in the NV30 core... yes, that 5800 leafblower. Why do I say this? They have much, much more experience with "SM3"-class hardware... and you say you wonder about NV's future SM3.0 performance.
I will say this right now: NV is overwhelmingly likely to have GREAT performance in SM3-based games. This is, essentially, 3rd-generation SM3.0 hardware (the NV30 was missing some crucial features like displacement mapping; it is like the new ATI part, "SM2+").
ATI with the R420, meanwhile, could be said to be on their 1st revision of anything much more than the bare SM2.0 requirements, having taken at most a step towards full SM3 compliance.

They don't even have FP32 precision yet. That's huge in DX9C and SM3.0. NV had that in the NV30... expect the performance and IQ delta to widen considerably with DX9C's release and supporting games.
And don't expect ATI to catch up fast, or at all... they are actually behind on tech.
Because logically they have to:

a) Engineer the tech
b) Add it to their existing design
c) Produce it, test it, tweak it... then repeat a) through c) until it's right
then
d) They have 1st-gen SM3.0-compliant hardware.

NV has done steps a) through d) already. Many times. And on top of that, tweaked their FP32 performance.
FP24 (ATI) is partial precision. FP32 is full precision. The truth of the matter, and the moral of the story, is that ATI somehow got caught with their pants down by deciding not to take the time or money to develop SM3-class hardware sooner rather than later.
I predict it will pain them sooner rather than later. It would be MUCH better if they at LEAST had 1st-gen SM3.0-compliant hardware. But they are still on DX9"+", like the age-old NV30.

It's not as easy as people think; they can't just throw on 32-bit FP precision and the rest of the SM3.0 requirements and run to the bank with the performance crown. It won't happen.
And until at LEAST 2006, with Longhorn and the successor to DX9C, you won't see anything changing.
32-bit FP precision is considered across the industry as "full precision", and has been for 20+ years.
The rest of the market has to catch up to NV before moving on. And the rest of the market is not even at 32-bit FP; when they do get there, they are very likely to have only half the performance, too.
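To put rough numbers on FP16 vs. FP24 vs. FP32: they carry 10, 16, and 23 stored mantissa bits respectively, so each step up buys several more decimal digits of precision. Here's a little C++ sketch of mine that quantizes a value to each mantissa width (it ignores the exponent-range differences, which also matter):

[code]
// Sketch: relative rounding error of FP16 (10 mantissa bits),
// FP24 (16 bits) and FP32 (23 bits), by truncating a double's mantissa.
#include <cmath>
#include <cstdio>

double quantize(double v, int mantissaBits) {
    int exp;
    double m = std::frexp(v, &exp);          // v = m * 2^exp, m in [0.5,1)
    double scale = std::ldexp(1.0, mantissaBits + 1);  // +1: implicit lead bit
    return std::ldexp(std::round(m * scale) / scale, exp);
}

int main() {
    const double v = 0.1234567;
    const int widths[] = {10, 16, 23};       // FP16, FP24, FP32
    for (int bits : widths) {
        double q = quantize(v, bits);
        std::printf("%2d mantissa bits: %.9f (error %.2e)\n",
                    bits, q, std::fabs(q - v));
    }
}
[/code]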


Also, the NV40 core DOES have the power to perform displacement mapping and all the nifty features of SM3.0... don't believe me? Well, I could elaborate. But I'm tired of sending info to this forum and not receiving anything but cries of "fanboy". Lame.
Or I could just point to the simple known fact that SM3.0 is as much about speed increases as IQ. Case in point.

To sum it up once again, NV has most of the performance of every single ATI card, though sometimes the X800XT runs away... but it's absolutely absurd IMO to spend $500 and short yourself something like an overall, long-developed, mature SM3.0 architecture from Nvidia... RIGHT at the time when SM3.0 IS going to be relevant in the marketplace.
I agreed with your POV, and the rest of this forum's, when the NV30 was released. DX9+ was silly, and it was poor-performing at that.
But things, and times, have changed.

They have done a total redesign. It's now VERY fast in DX9 and has something else as well: DX9C support.

Whatever. I'm just trying to toss in a bit of educated opinion on the subject. I hope I help you be a bit more open-armed to NV in the future... they DO make a great product, as does ATI. They've both had their merits, past and present.

I am going to be taking some time off from the forums, much to your relief. I have RL obligations. But I'll be back!
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: Ackmed
Ignorance is bliss, for Edge3D.

Another thread I'm done with.

YEAH! RIGHT! LOL

You are comedy AND tragedy, dude! Seriously. You were completely and utterly destroyed in this conversation. Latah! Don't let the door hit cha on da butt on the way out!

Lack of meaningful response is one thing I've become accustomed to with you, ACKmed.

Come back when you're educated on the topic and maybe we can tango.
Until then, it's clear: when the facts are brought up... you're gone!
 
Apr 14, 2004
1,599
0
0
RIGHT at the time when SM3.0 IS going to be relevant in the marketplace.
And how exactly do you know when/where this will happen? And where are you getting this HUGE benefit from SM3.0? Can I have this crystal ball of yours?
 

fwtong

Senior member
Feb 26, 2002
695
5
81
Originally posted by: ChkSix
Why is he ignorant for stating his own points of view? I don't get that. I hope this can be an open discussion where we all win by learning more about both cards. Constructive criticism, even with different points of view, is how any open forum should be conducted. I just came from a place where fanboyism was the rule, and I hope that doesn't prevail here.

Fanboyism is the rule here. Sorry.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: GeneralGrievous
RIGHT at the time when SM3.0 IS going to be relevant in the marketplace.
And how exactly do you know when/where this will happen? And where are you getting this HUGE benefit from SM3.0? Can I have this crystal ball of yours?

NV is making it happen. Partial PS3.0 support has already been added to Far Cry. It will have full PS/VS3.0 support. Be prepared to eat your words. The odds are in my favor, my friend.
All that has to be done is modifying the games with patches... you think NV wouldn't DO THIS FOR THE GAME DEVS? rofl
To know the benefits, all you have to do is read... to anyone with ANY technical background, it's clear.
 

ChkSix

Member
May 5, 2004
192
0
0
Edge,

Do me a favor and continue posting. I am amazed by your insight on the matter. It is extremely informative in every sense of the word.

Mike
 

ChkSix

Member
May 5, 2004
192
0
0
Originally posted by: fwtong
Originally posted by: ChkSix
Why is he ignorant for stating his own points of view? I don't get that. I hope this can be an open discussion where we all win by learning more about both cards. Constructive criticism, even with different points of view, is how any open forum should be conducted. I just came from a place where fanboyism was the rule, and I hope that doesn't prevail here.

Fanboyism is the rule here. Sorry.


Why? Wow, anywhere you go, fanboys seem to lurk. Isn't being extremely loyal to any company (who, by the way, aren't doing anything for you but taking your money) without accepting the facts as we know them not only a bit uneducated, but foolish to boot?

I don't know, and I won't state my preference. I do know that I enjoy any post that offers more insight into the pros and cons of both cards. That is a good thing for anyone who has to work for everything they purchase, so keep the facts coming... it is quite educational.

Mike
 

vshah

Lifer
Sep 20, 2003
19,003
24
81
I'll be getting an X800XT simply because I've got connections and can get it cheap. I'd prefer nvidia though, just 'cause it looks cooler.

A true American would say bigger is better and get the 6800U. By buying ATI, you are not being patriotic. Shame on you.

(KIDDING for those who don't get it)

-Vivan
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: fwtong
Originally posted by: ChkSix
Why is he ignorant for stating his own points of view? I don't get that. I hope this can be an open discussion where we all win by learning more about both cards. Constructive criticism, even with different points of view, is how any open forum should be conducted. I just came from a place where fanboyism was the rule, and I hope that doesn't prevail here.

Because some of these guys aren't here to learn, or even to have an educated debate. That much is clear.

Edge,

Do me a favor and continue posting. I am amazed by your insight on the matter. It is extremely informative in every sense of the word.

Thank you! That's the first kind thing I've heard from anyone on this forum yet. I've done a lot of research on these cards.
For me, ChkSix, I don't rely on anyone else's opinions, ESPECIALLY not in forums. A lot of these guys are pure trolls... some are, some aren't. The ones that are trolls, you'll notice, have little to say... never any real information... just opinions based on nothing.
I try to take the best info I can find and come to my own conclusion. When it comes down to it, I don't even trust Anand's. I take information that's purely unbiased, like benchmark info, and check the config... check settings... think about variables and loopholes... THEN work towards accumulating the data in my own mind and figuring out what it all means.

Originally posted by: ChkSix
Why is he ignorant for stating his own points of view? I don't get that. I hope this can be an open discussion where we all win by learning more about both cards. Constructive criticism, even with different points of view, is how any open forum should be conducted. I just came from a place where fanboyism was the rule, and I hope that doesn't prevail here.

Originally posted by: fwtong
Fanboyism is the rule here. Sorry.

Originally posted by: ChkSix
Why? Wow, anywhere you go, fanboys seem to lurk. Isn't being extremely loyal to any company (who, by the way, aren't doing anything for you but taking your money) without accepting the facts as we know them not only a bit uneducated, but foolish to boot?

I don't know, and I won't state my preference. I do know that I enjoy any post that offers more insight into the pros and cons of both cards. That is a good thing for anyone who has to work for everything they purchase, so keep the facts coming... it is quite educational.

Mike
Yes, it is foolish, ChkSix. More than foolish, actually.
In fact, nothing seals your fate of getting ripped off more than handing your power to think for yourself to someone else. Whether it be a company like ATI, a website, a forum, the government, even your own mother and father. Listen carefully, consider deeply when it appears wise or credible... but always think for yourself.
That's an incredible power to hand someone.
Like I've told ACKmed many times, I'm just not quite ready to sell out myself, my credibility, and my respect and turn into one of these horrid fanboys. To me, reading their words... I actually find it almost disgusting. A lot of them are probably really young kids, though, who had a Radeon XXX and liked it, so they fell into some kind of brand loyalty when they booted up Will Rock and saw an ATI logo... or a GeForce and saw an NV logo in UT2K4. But mostly it's apparent that ATI has the cretin thugs.
NV has been great for me, better than ATI. But I don't let that fact cloud my judgement... ATI is going to be under new leadership soon. These things must first be KNOWN (educated), then considered for what they are worth.

Like I stated, things change, times change. This is definitely NOT the industry (tech sector) in which to be complacent, or take others' opinions, or develop some kind of relationship with a brand name.
If the quality of a product has been good to you, consider that... but don't let it cloud your outlook. Or if you've been burnt by a particular company, don't let it stop you from considering them again... just don't FORGET what happened; give them another chance while remembering that it is possible for it to happen again.

Examine all the information given to you.

But notice strange things, like WHY I'm not getting much of a response here... believe me, if the info were there, they'd throw it in my face... and I'd concede that particular point.

I mean, when people start saying things like "all the reviewers like the X800s more", or "look at what the people want? You think YOU can matter to the MASSES??!"
Is that not hilarious to even read? It's ignorance, pure and simple. Forget thinking for yourself!
I mean, what exactly are the "masses" going to do about it? Beat me? It appears that is the tactic being attempted, but you can't defeat the truth.
Hence, no response worth a grain of salt.

Anyway, if you have any particular questions, I'd be happy to answer them the best I can. Just PM me.
I'm not sure exactly what else factual to post here, because I haven't gotten a challenger to debate me. ACKmed is quite obviously running with his tail between his legs.
I'm going to have to pull out of here because I've spent a lot of time here as of late... but I'll answer any PM questions you might have to the best of my knowledge.
 

CrystalBay

Platinum Member
Apr 2, 2002
2,175
1
0
Originally posted by: GeneralGrievous
RIGHT at the time when SM3.0 IS going to be relevant in the marketplace.
And how exactly do you know when/where this will happen? And where are you getting this HUGE benefit from SM3.0? Can I have this crystal ball of yours?


Very true. The games will be out this summer (D3, HL2), and the DX9 refresh will follow.
 