Originally posted by: ShawnD1
How about we all just calm down until the game is actually out.
I'll keep you posted on how my GF2 Ti200 handles it
Originally posted by: NFS4
Originally posted by: shady06
Originally posted by: Budman
Originally posted by: NFS4
So is HL2 still gonna be released on Sept 30th?
Yes and on Oct 1 I will watch out for Nebor's thread on how slow his card plays it.
ROFLMAO
I'm sure that the performance with the new Det 50 drivers will be much better and nearly equal to the Radeon cards...it remains to be seen if the image quality will suffer b/c of the performance increase.
Either way, something smells fishy on the NVIDIA side of things...
Originally posted by: Nebor
They had all the time in the world to optimize the drivers, and the drivers were made available to Valve for testing.... But Valve, wanting to make ATI look great, demands that the Det 50s not be used.... Seems pretty clear to me.
Valve and ATI attempt to gang up on NVIDIA.... but once HL2 is released, and NVIDIA can tear it apart and make drivers to run it well, Valve will no longer be able to screw NVIDIA over.
As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.
Originally posted by: Regs
Well Nebor, let me make this simple comparison to get this ideological thought, "more is always better," out of your head.
Dump truck running a V10 diesel with 500 bhp, Honda Civic running an inline-4 with 120 bhp. Which one will perform better?
Sometimes more is just more.

What performance criteria are you comparing? Let's see the Honda even start moving when carrying a load only 1/10th of what the dump truck can carry. In that case the dump truck completely clobbers the Honda in performance. What do you say?
Originally posted by: SithSolo1
I thought this would be as good a place as any to put this, so here goes. Never knew what an nVidiot was, so I looked it up:
nVidiot
:Q
Someone needs to post a good one for us fanATIcs.
Originally posted by: Shad0hawK
first off, before the flame-happy get started, i am not a fanboy for ATI or Nvidia; in fact i think we need a third company to get into the mix.
this is a big step backward for the industry, back to the "if you want to play this game then you need this card" glide days. i know gabe newell swears up and down they did not optimize the game for ATI, but is he telling the truth? i do not think so. ATI has pumped in lots of $$$, and only a true fool thinks they will not expect a return on their investment. what really strikes me is the $$$-per-fps chart at this ATI-sponsored event; only the truly blind cannot see such an obvious marketing ploy for what it is while the perpetrators basically get up there and lie about it. saying a game happens to run better on x card is one thing; putting up a $$$-per-fps chart at an event sponsored by the same company that happens to make the card the chart favors is simply an advertisement, and to me it takes a lot of credibility away from what valve is saying.
the date does not escape me either: 1. so close to the release of HL2, and 2. the day's date...9-11. ATI and valve have flown their collective plane into the Nvidia tower... very symbolic indeed.
gabe newell does not want reviewers to post #'s with the nvidia det 50s; he says nvidia has optimizations that give misleading results. but why not allow the reviewers to use tools like binkvideo that are accurate depictions of in-game performance and image quality? after all, valve themselves used this same tool to make the movies we see on steam for the same reasons...plus if the counter is telling us we are getting 70 fps but what we see on the screen is really 30-40 fps, anyone would be able to tell right away that something was up. what is it the business partners valve and ATI do not want us to see for ourselves?
very fishy indeed.
Originally posted by: ShawnD1
Look at that graph one more time and you will see that it actually says the FX5600 is the best deal.
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/HL2_benchmarks/dx9_full.gif
The title of the graph says it's FPS/$
The X axis says it's $/FPS
It says the FX5600 costs about $0.11 for each fps. It says the Radeon 9600 costs about $0.38 for each fps.
Under that it says
"Valve did a little homework on Pricewatch to see which card offers best frame rate for the dollar - yes, we're just as surprised as you are to see the 9600 Pro take the top honors. That doesn't meant at all that it's the best card, just the best value."
If the axis label is right, the graph says the 9600 is the worst value, yet the benchmarks say it did well and the caption says it's the best deal. Whoever made those graphs clearly wasn't paying attention.
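To make the mix-up concrete, here is a minimal Python sketch using made-up prices and frame rates (not Valve's actual Pricewatch data): since FPS/$ and $/FPS are reciprocals, the same card is the best deal under both metrics, but a mislabeled axis flips which bar looks like the winner.

```python
# Hypothetical prices and frame rates, chosen only to illustrate the point.
cards = {
    "FX 5600":         {"price": 150, "fps": 18},
    "Radeon 9600 Pro": {"price": 170, "fps": 45},
}

for name, c in cards.items():
    fps_per_dollar = c["fps"] / c["price"]    # FPS/$ -- higher is better
    dollars_per_fps = c["price"] / c["fps"]   # $/FPS -- lower is better
    print(f"{name}: {fps_per_dollar:.3f} fps/$ | ${dollars_per_fps:.2f} per fps")

# Because the two metrics are reciprocals, the card with the highest FPS/$
# always has the lowest $/FPS. On a bar chart, the "best deal" is the tallest
# bar under one label and the shortest bar under the other, so mislabeling
# the axis leads readers to exactly the opposite conclusion.
```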
Originally posted by: NFS4
You know, I was just thinking: what are the chances that a driver could magically give a game that is three weeks away from release a 100% or greater increase in frame rates?
I don't remember ATI coming out with some magic driver when they got owned in the Doom 3 benchmarks. And even if they did, I doubt it was a 100% or greater increase.
If NVIDIA's PS 2.0 performance on the FX series of cards truly is abysmal, it seems as though it would take more than one driver revision to get things back up to speed...if at all, should it turn out to be a hardware defect/mis-design.
Originally posted by: Duvie
I have to say this is far more entertaining than the Intel/AMD flame wars, or should I say FAN wars.....
I have to say I am not a gamer and likely wouldn't ever play HL2 unless I upgraded my 8500DV AIW to, say, a 9600 AIW and it came bundled with it, and at the moment I can't see why I would, as this 8500 does its job quite well for me....
I have had about 5-6 NVIDIA cards in the last 3 years and about 4 ATI cards...... In the end I have been happy with NVIDIA's driver support and ease of driver updates, with ATI not so much. However, the ATI cards usually cost me less, and the quality in my CADD work and in watching DVDs has always looked better with ATI. I directly compared a GF2 Pro to a Radeon and a GF4 to a Radeon 8500, and it was obvious to me....
That being said, I like to buy the best value for the buck, so I have no allegiances....
Heck, I may even get an A64 if the price comes down and it actually starts putting up numbers in real-world multimedia apps like it did in gaming....