Doom 3 Benchmarks at [H]

Page 5

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: GeneralGrievous
The GT is looking good. The 5xxx cards are not. ATI really needs that opengl rewrite soon.

It's great to see that lastgen cards can push 50 fps in this game.

Didn't you forget to mention that the 5950 performed roughly the same as the 9800 XT?
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: VisableAssassin


Same goes for ATi and its 420....
No it doesn't. ATI doesn't use game specific optimizations. They're using shader replacement for the FX series because they're too slow to run the normal shaders. Just like in Halo, etc. etc.

For the 6800's, well... NV has been working hand-in-hand with Carmack on optimizing the game for their cards. So who knows what's in there.
 

VisableAssassin

Senior member
Nov 12, 2001
767
0
0
Originally posted by: Blastman
Originally posted by: VisableAssassin


Same goes for ATi and its 420....
No it doesn't. ATI doesn't use game specific optimizations. They're using shader replacement for the FX series because they're too slow to run the normal shaders. Just like in Halo, etc. etc.

For the 6800's, well... NV has been working hand-in-hand with Carmack on optimizing the game for their cards. So who knows what's in there.

you missed the entire point of that response.....
 

Johnbear007

Diamond Member
Jul 1, 2002
4,570
0
0
Originally posted by: GeneralGrievous
Hey General, if the guy who bought your XT for 700 bucks is a Doom fan, he should be committing Harey Carey right about now. (not sure how to spell harey carey? LOL)
He might be. And if he is? I guess he'll have to live with only having the 3rd best card to run this game as opposed to the first. I personally wouldn't pay $55 for what is looking more and more like a solely single player game. By the time all the modders get done with making this a truly viable multiplayer game, it'll probably drop in price a bit.

btw, what happened to your bold statement that the 5900s would crush the 9800s in this game?



Personally I'm about ready to puke over the non stop multiplayer focus in so many games. I for one will be immensely happy if there is NO multiplayer in Doom 3 and they spent ALL of the time making a great single player game that actually has a good STORY.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
You must suck johnbear

Seriously I know what you mean. FIRST make a good single player game...good story...good number of levels... good art etc.... then worry about touting MP/... so many games suck and money down the drain.


Luckily I play RTS's and TBS's, which take months before the badness sets in.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
So where do we get v8.05 of the CATs, and what happened to the 60fps cap... as I asked in the other thread.
 

ZimZum

Golden Member
Aug 2, 2001
1,281
0
76
Originally posted by: Johnbear007
Originally posted by: GeneralGrievous
Hey General, if the guy who bought your XT for 700 bucks is a Doom fan, he should be committing Harey Carey right about now. (not sure how to spell harey carey? LOL)
He might be. And if he is? I guess he'll have to live with only having the 3rd best card to run this game as opposed to the first. I personally wouldn't pay $55 for what is looking more and more like a solely single player game. By the time all the modders get done with making this a truly viable multiplayer game, it'll probably drop in price a bit.

btw, what happened to your bold statement that the 5900s would crush the 9800s in this game?



Personally I'm about ready to puke over the non stop multiplayer focus in so many games. I for one will be immensely happy if there is NO multiplayer in Doom 3 and they spent ALL of the time making a great single player game that actually has a good STORY.

Great SP and MP don't necessarily have to be mutually exclusive. Case in point Return To Castle Wolfenstein. Its SP campaign was one of the best in FPS history. And the MP was superb. Then they released Enemy Territory which is as good as MP gets.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: Nebor
OMG It won't do 1600x1200 8xAA 32xAF @ 240fps! UNPLAYABLE. GOAT OR COW?!?!?! *foams at the mouth*
Man, that describes about half of this thread. Folks crying because their X800 is still above the persistence of vision threshold. Fanboys gloating and whining. Jeez Louise.

1) The previous generation cards are not really playable at 16x12 and the 6800 and X800 are. Where is the big deal?
2) The new cards will be old news in 6 months. This has not changed in 10 years, so what is so special about them now that makes them immune to becoming obsolete?
3) As noted, ATI is not the OpenGL performer. nVidia has some weaknesses with DirectDraw. So what? Like this will matter in 10 years? It might put some food on my table, but probably not yours. So why do you care? Oh, is this one of those things that Porsches 'fix'?
My personal skin in this is that I have not cancelled my order to Gateway for an X800 XT PE. The ATI cards are marginally better at integrating with my NLE. My NLE uses the GPU. Gateway is 2 months late in delivery, but $100 is $100. If I went and bought a card above MSRP, Clark Howard would probably start calling me at home and telling me off (he is too nice to do that, but you get the idea; unless you have no clue who Clark Howard is). The other benchie I care about is encode/decode. The ATI card seems to have an edge in WMV, but that could be a driver issue.

BUT, I would still consider a 6800. I like it. It works. Not sure I like the local sales team ("you should get a Quadro to display video"). But hey, at least he talked to me. ATI Canada does not follow up on calls (not fair, some of them do, but not the folks I needed to talk to). Working on a work project that uses fast video. When what we are doing is in place and running, I will post a note to tell you where it is.

Edit - on my part, I forgot I had moved on from the OMG 30+ fps thread. I knew I should have had that second cup of coffee ;D
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: PowderBB3D
Wow, seeing the GT outperform the XT-PE like that is pretty amazing.

ATI != OpenGL.

Glad I went nVidia this round =)

Not really. It should be totally expected, in fact. I would have been remarkably surprised if the results hadn't turned out that way. The NV40 arch supports a so-called "32x0" pipeline mode which, for the portions of the rendering workload where it can be utilized, lets it operate nearly twice as fast as any ATI card with the same number of pipelines.
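The doubled z/stencil throughput is easy to put in rough numbers. A quick sketch (the clock speed and per-pipe figures here are illustrative assumptions, not measured specs; "32x0" in the post above is read as two z/stencil-only ops per pipeline per clock when color writes are disabled, which is exactly the workload Doom 3's stencil shadow volumes generate):

```python
def zstencil_fill_rate(pipelines: int, zops_per_pipe_per_clock: int,
                       clock_mhz: int) -> float:
    """Peak z/stencil throughput in billions of ops per second."""
    return pipelines * zops_per_pipe_per_clock * clock_mhz * 1e6 / 1e9

# Hypothetical 16-pipe cards at an assumed 400 MHz core clock:
nv40_style = zstencil_fill_rate(16, 2, 400)  # double-rate z-only mode
r420_style = zstencil_fill_rate(16, 1, 400)  # one z/stencil op per pipe

print(f"NV40-style: {nv40_style:.1f} Gops/s, "
      f"R420-style: {r420_style:.1f} Gops/s")
```

Same pipeline count, but the z-only mode doubles throughput on the shadow-volume passes, which is why a result like this shouldn't surprise anyone.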

I was also rather disappointed that they only included the X800 Pro (12-pipe) card, almost as a whipping-boy for their higher-end benchmarks, and not the regular 6800 (12-pipe) as well. It would have provided a valuable comparison for mid-range card purchasers. Of course, that would have detracted from the semi-biased "NV kicks ass! Look at ATI... it.. sucks." tone of the review. Either that, or they could have just left the 12-pipe X800 Pro out entirely.

Overall, disappointed at the lack of comprehensiveness, and Kyle's bias bleeding through.

I was happy with this paragraph from the first part of that article though:
For those of you that think you are not going to have the hardware that you need to play DOOM 3, the fact of the matter is that many of you will be just fine, although an upgrade may still be in your future. As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce 3 video card that is two years old will deliver a solid gaming experience that will let you enjoy the game the way id Software designed it to be. That fact alone should let many of you know that you will not be left behind in experiencing DOOM 3.

So it looks like I won't need to upgrade to play D3 after all, since I normally only play at 1024x768 no AA/AF anyways. Looks like D3 is more optimized than FarCry, in terms of less-demanding system requirements.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Regs
Carmack should be named Nerd of the century.

Nah, Carmack is cool. BillG is the "nerd of the century". Granted, he is a very rich nerd, which I suppose makes it alright, I dunno.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: ai42
FYI... I don't think many people put much thought into the overclock bits of the article.
Anyway, since quite literally D3 is going to be using transistors and bits that have never really been used before, there are two concerns I have.
1. It would be theoretically possible to have a GPU defect that causes artifacts in the game.
2. Since it is utilizing more transistors, it would also stand to reason that the GPU temperature would increase (perhaps limiting maximum overclock speed as well)

I think that 1) is probably more along the lines of what he is hinting at. I'm guessing that the 3D engine in D3 is going to use a lot of these features present in DX9-class cards that most other previous game engines didn't, and thus people with overclocked cards may suddenly find D3 running with artifacts and visible defects. I think that Carmack wanted to make it clear up front what was going on, so that there wasn't a horde of n00bs posting on forums about how "Doom3 fried my video card!".

Certainly I know from my own overclocks, on my AMD XP CPU, that it would run more-or-less stably in Windows, in ordinary apps, but my OpenGL screensaver would occasionally miss triangles when rendered, and Prime95 would occasionally give rounding errors. As I learned, the floating-point unit in AMD XP CPUs is more sensitive than the integer units to overclocking, and could fail in subtle ways when the rest of the CPU's operation appeared to be fine. I think that the same thing might happen with D3.

OTOH, perhaps it will provide a "good" benchmark, to verify "stable" overclocks of advanced video cards, beyond what 3Dmark03 and ATITool (for example) can test for.
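The Prime95-style consistency check mentioned above boils down to a simple idea: deterministic floating-point code must produce bit-identical results on every run, so any divergence points at marginal hardware. A toy sketch of the idea (the workload here is made up for illustration; real stress testers use far heavier, carefully chosen kernels):

```python
import math

def fp_workload(n: int) -> float:
    """A small FP-heavy loop: accumulate sin/cos products."""
    acc = 0.0
    for i in range(1, n + 1):
        acc += math.sin(i) * math.cos(i) / i
    return acc

def consistency_check(n: int = 10_000, runs: int = 3) -> bool:
    """True if every run of the same deterministic workload agrees
    bit-for-bit; on stable hardware there is exactly one unique result,
    while a marginal overclock can make runs diverge intermittently."""
    results = {fp_workload(n) for _ in range(runs)}
    return len(results) == 1

print("stable" if consistency_check() else "possible rounding errors")
```

Prime95 goes further by also comparing against known-correct residues, which is how it catches the subtle FPU failures described above even when the rest of the system looks fine.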

I have to admit, I didn't really care about D3, but after reading that article, and seeing that it might actually be playable on my rig, now I'm actually a bit interested in obtaining it.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Blastman
No. NV has been working very closely with id on optimizing Doom3. Is it any wonder it runs comparatively well on NV cards.
beyond3d?
The quote is from me. Nvidia probably IS "cheating" to some degree, recognizing the Doom shaders and substituting optimized ones, because I have found that making some innocuous changes causes the performance to drop all the way back down to the levels it used to run at.

Carmack again refers to that in the HardOCP article, too:
On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

Reading between the lines, I would say that he is basically saying up-front, "Both ATI and NV cheat. Live with it. Both give good frame rates, so just enjoy the game and stop worrying about it."
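The detection-and-substitution behavior Carmack describes is consistent with a driver keying hand-tuned replacements off a fingerprint of the incoming shader. A hedged sketch (all names and the shader text are made up; no real driver internals are implied): any "innocuous" edit changes the hash, misses the lookup table, and the shader falls off the fast path, exactly matching the observed performance drop.

```python
import hashlib

def fingerprint(src: str) -> str:
    """Hash the shader source; any edit at all changes the digest."""
    return hashlib.sha1(src.encode()).hexdigest()

# Hypothetical shipped Doom 3 shader and its hand-tuned stand-in.
KNOWN_SHADER = "MUL r0, light, surface;"
FAST_PATH = {fingerprint(KNOWN_SHADER): "tuned_program"}

def compile_shader(src: str) -> str:
    # Exact fingerprint match -> substitute the tuned program;
    # anything else goes through the generic (slower) compiler.
    return FAST_PATH.get(fingerprint(src), "generic:" + src)

print(compile_shader(KNOWN_SHADER))                 # hits the fast path
print(compile_shader(KNOWN_SHADER + " # tweak"))    # falls off it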

For those that DO care, though, I suppose the other interpretation could be, "none of these benchmark numbers are accurate, because both ATI and NV cheat".

One other thing, independent of the "cheats", is that I saw mentioned that the D3 "demo1" benchmark, doesn't run the game AI/physics, only the 3D engine (and sounds?). So actual in-game framerates will necessarily be *slower* than those shown. Additionally, since the demo appears to be CPU-bound at 1024x768 no AA/AF levels, on the 6800/X800 cards, then it appears that D3 may require quite a bit of CPU power, perhaps moreso than graphics-card power. (Since the 3D engine alone is already sucking up 100% of the CPU power of the systems that they tested those cards on - once you start playing the actual game, frame rates are bound to drop lower. How much lower remains to be seen.)
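The last point can be put in rough numbers: a timedemo that replays only the renderer overstates in-game frame rates, because real gameplay adds AI/physics time to every frame. A small illustration (the figures are invented, not measurements):

```python
def in_game_fps(timedemo_fps: float, game_ms_per_frame: float) -> float:
    """Estimate gameplay fps from a render-only timedemo result plus
    per-frame AI/physics cost, assuming the two run serially."""
    render_ms = 1000.0 / timedemo_fps          # render time per frame
    return 1000.0 / (render_ms + game_ms_per_frame)

# A 60 fps timedemo with a hypothetical 5 ms of AI/physics per frame:
print(f"{in_game_fps(60.0, 5.0):.1f} fps in actual gameplay")
```

The serial-execution assumption is the pessimistic case, but it shows why actual in-game framerates will land below the published demo1 numbers.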
 

Fuchs

Member
Apr 13, 2004
160
0
0
Originally posted by: VirtualLarry
Originally posted by: Blastman
No. NV has been working very closely with id on optimizing Doom3. Is it any wonder it runs comparatively well on NV cards.
beyond3d?
The quote is from me. Nvidia probably IS "cheating" to some degree, recognizing the Doom shaders and substituting optimized ones, because I have found that making some innocuous changes causes the performance to drop all the way back down to the levels it used to run at.

Carmack again refers to that in the HardOCP article, too:
On the other hand, the Nvidia drivers have been tuned for Doom's primary light/surface interaction fragment program, and innocuous code changes can "fall off the fast path" and cause significant performance impacts, especially on NV30 class cards.

Reading between the lines, I would say that he is basically saying up-front, "Both ATI and NV cheat. Live with it. Both give good frame rates, so just enjoy the game and stop worrying about it."

For those that DO care, though, I suppose the other interpretation could be, "none of these benchmark numbers are accurate, because both ATI and NV cheat".

One other thing, independent of the "cheats", is that I saw mentioned that the D3 "demo1" benchmark, doesn't run the game AI/physics, only the 3D engine (and sounds?). So actual in-game framerates will necessarily be *slower* than those shown. Additionally, since the demo appears to be CPU-bound at 1024x768 no AA/AF levels, on the 6800/X800 cards, then it appears that D3 may require quite a bit of CPU power, perhaps moreso than graphics-card power. (Since the 3D engine alone is already sucking up 100% of the CPU power of the systems that they tested those cards on - once you start playing the actual game, frame rates are bound to drop lower. How much lower remains to be seen.)

Agree 100%
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,203
126
Originally posted by: Johnbear007
Personally I'm about ready to puke over the non stop multiplayer focus in so many games. I for one will be immensely happy if there is NO multiplayer in Doom 3 and they spent ALL of the time making a great single player game that actually has a good STORY.

That would be nice, except for one little detail - iD games in general, and Doom in specific, have basically *no story* whatsoever to them. So I would be hard-pressed to give iD the benefit of the doubt here in terms of single-player game storylines.
 

g3pro

Senior member
Jan 15, 2004
404
0
0
Originally posted by: VirtualLarry
Originally posted by: Johnbear007
Personally I'm about ready to puke over the non stop multiplayer focus in so many games. I for one will be immensely happy if there is NO multiplayer in Doom 3 and they spent ALL of the time making a great single player game that actually has a good STORY.

That would be nice, except for one little detail - iD games in general, and Doom in specific, have basically *no story* whatsoever to them. So I would be hard-pressed to give iD the benefit of the doubt here in terms of single-player game storylines.

1996 called. It wants its comment back. :roll:
 
Apr 14, 2004
1,599
0
0
Didn't you forget to mention that the 5950 performed roughly the same as the 9800 XT?
Wasn't Doom 3 supposed to be

A) The focal point of the NV3x architecture?
B) The 5xxx's saving grace? Given a lack of HL2 performance, anyway.

Personally I'm about ready to puke over the non stop multiplayer focus in so many games. I for one will be immensely happy if there is NO multiplayer in Doom 3 and they spent ALL of the time making a great single player game that actually has a good STORY.
My perspective is that you can play through a single player game once or twice, maybe 3 times at most, then you toss it. You can play multiplayer games for much longer. We have Unreal Tournament for that but Doom 3 could have provided a different feel. You can only have so much fun abusing computer AI again and again.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
okay, phray showed me where I missed the line about the newer ATI drivers. Interesting that even drivers two versions newer than the Cat 4.7s were not able to close the gap with NV's OGL performance.

Now, what happened to the frame rate cap? Wasn't it Carmack who said it would be capped at 60fps or was I dreaming that?
 

Megatomic

Lifer
Nov 9, 2000
20,127
6
81
Originally posted by: hans007
Originally posted by: Megatomic
It sure would have been nice to see numbers on a low end machine, like say mine.

Barton/NF2/1GB PC3200/9800P

How long before the other sites get to bench with D3?

wow. if that is a "low end " machine...



well i am wondering how it'll run on my "ancient pile of crap"

barton 2500, sis 741 chipset, 512 ddr400 , 9600 pro
I said low end due to the way they described the systems used in the benchmarks. There was a higher end and a high end system. I figured that based on that my system was fairly low. But not really, I am being facetious. I should have put a smiley in my post.
 

high

Banned
Sep 14, 2003
1,431
0
0
i'm pretty confident i can run the game at 1280x960 with everything on medium to high with no aa/af
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
No it doesn't. ATI doesn't use game specific optimizations. They're using shader replacement for the FX series because they're too slow to run the normal shaders. Just like in Halo, etc. etc.

Your quote was in response to Nvidia's custom Doom 3 demos and the reason why Carmack and id sent out their own demo.

btw, the custom shaders etc. have been dropped for the 5900, as Nvidia's driver team has gotten the drivers up to the point where the card can run the ARB2 path as well as or better than the 9800.
 

DoobieOnline

Golden Member
Jan 12, 2001
1,397
0
0
Thanks for the heads-up on the article Cat! I'm doubly glad I tried the GT now. I really did enjoy my X800 Pro when I had it but with the newer drivers the GT is performing a little better in the games I play and obviously will perform much better with D3.

doobie
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,282
30,128
146
One other thing, independent of the "cheats", is that I saw mentioned that the D3 "demo1" benchmark, doesn't run the game AI/physics, only the 3D engine (and sounds?). So actual in-game framerates will necessarily be *slower* than those shown. Additionally, since the demo appears to be CPU-bound at 1024x768 no AA/AF levels, on the 6800/X800 cards, then it appears that D3 may require quite a bit of CPU power, perhaps moreso than graphics-card power. (Since the 3D engine alone is already sucking up 100% of the CPU power of the systems that they tested those cards on - once you start playing the actual game, frame rates are bound to drop lower. How much lower remains to be seen.)
You are forgetting what both you and I quoted from that article, Larry
we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience.
If a Willy can handle it, the average 2GHz+ XP and P4 Northwood system will only make it better
 

Fuchs

Member
Apr 13, 2004
160
0
0
Keep in mind a lot of the new features have to do with shading and lighting. I imagine turning off pixel shading and knocking down the bells and whistles will make this game play ok on any system built in the last year or year and a half.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,282
30,128
146
Originally posted by: Malladine
My opinion of HardOCP has risen greatly...awesome article.
Personally, I feel THG and [ H ] have both always been solid overall; they just have a tendency to inject their own subjectivity into their work. It's one of the things I really miss about Anand being far more hands-on around here. He is like a Vulcan when it comes to evaluating things, and I always went away from reading his work with a sense of "Now I know, and without anyone trying to influence me one way or the other"
 