Originally posted by: Evdawg
Yea but i dont think that the ati cards will be as bad at doom3 as the nV were with HL2.
i really think it's funny that there's so much focus on how these cards do, based on benchmarks of games that are not only unreleased, they aren't even in late beta form - some run using drivers that are what, 4-5 months old?
while it's safe to assume ati's choice of going with the generic dx9 codepath was much smarter than nvidia's proprietary choice (based on gabe newell's comments regarding how difficult and time-consuming it was to optimize for nvidia's codepath), and perhaps even to suggest that the r3xx architecture is more sound than the nv3x, we really have no clue what the final performance will be when these titles actually ship (hell, it's looking like we might see a fall refresh of the next-gen cards before we see hl2).
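just to illustrate what "a second codepath" actually means in engine terms, here's a tiny, purely hypothetical c++ sketch - none of these names come from valve's or id's code, and real engines do this at the shader/material level rather than one switch. the point is simply that a generic dx9 path is one branch everyone shares, while a vendor-tuned nv3x path is a second branch somebody has to write, tune, and keep in sync with the first (which is roughly what newell was complaining about):

// hypothetical sketch only - names and structure are made up for illustration.
#include <iostream>
#include <string>

enum class RenderPath { GenericDX9, NV3xMixedPrecision };

// pretend capability query; a real engine would inspect caps bits / vendor id.
RenderPath pick_path(const std::string& vendor) {
    if (vendor == "NVIDIA") {
        // nv3x runs full-precision ps2.0 shaders slowly, so a tuned path
        // drops selected calculations to partial precision by hand.
        return RenderPath::NV3xMixedPrecision;
    }
    return RenderPath::GenericDX9;  // r3xx and everything else share one path
}

int main() {
    for (const std::string vendor : {"ATI", "NVIDIA"}) {
        switch (pick_path(vendor)) {
            case RenderPath::GenericDX9:
                std::cout << vendor << ": generic dx9 path (write once)\n";
                break;
            case RenderPath::NV3xMixedPrecision:
                std::cout << vendor << ": vendor-specific path (extra shaders to write and maintain)\n";
                break;
        }
    }
}

obviously this is a toy, but the maintenance cost scales the same way: every shader that gets a hand-tuned variant is one more thing to keep in sync.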
this really isn't an attempt to "defend" nvidia; their performance and their architecture may simply be inferior. however, logic would dictate that assumptions based on the above are hardly indicative of what the actual performance will be. while image quality remains to be seen, it's been shown that nvidia's efforts in redesigning their drivers have positively affected performance (this article covers how nvidia has started working on shader performance). heck, iq and performance have BOTH been improved in ut2k3 - by a substantial amount. while skepticism is certainly in order, i don't see how final conclusions can be drawn at this stage.
what baffles me is why we don't see how these cards run using current drivers and dx9 titles that ARE currently available, such as the 'far cry' demo. surely that would be much more relevant to this topic than making presumptions about performance based on unfinished betas running on outdated drivers...