Originally posted by: squidman
I had a 3dfx 6000!!! It was stunna, but it died from a power surge, which ONLY affected the card because of its stupid ass external power connector. Curses.
Originally posted by: Yozza
They have made an outright denial of any wrongdoing, and continued cheating to cover up for technical deficiencies in their products, which has now been revealed to date back to 3DMark2001, back when they strongly supported the benchmark.
Exactly. And we should take them to task for it or they will just keep doing it.
So are you still under the impression that the scripted benches that hardware sites have been using are good benches and NV didn't cheat on them either? Better think about that again. Nvidia has potentially corrupted our "REAL" game benchmarks, so what is one going to use now? Do people have to buy a card and take it home to test it before they find out it stinks? Look at the hundreds of benchmarks done in REAL GAMES.
Originally posted by: squidman
Yall, dont be hatin on blastman. He done nothing wrong. He wanted to share, is all. That being said, I would agree with megatomic: you do dramatize. Almost sounds like nVidia snatched your breakfast (makes me mad). Cmon - let it go. It's marketing. Does Intel care that fools spent tons of $$$ on, like, a top-of-the-line PIII 933, which went down once the 1 GHz PIII came out? Do I care that I spent 300 bux on a GeForce 2 GTS and 350 (!!!) on 1 stick of 256MB PC133 RAM in July 2000? Naw! If I did, I'd go around saying "the Taiwan earthquake cheated us all!". Just bad timing, is all.
That's the most sense I've heard yet; enough said. :beer:
Originally posted by: AEB
this whole thread could have been avoided if a) people used more than one benchmark and b) played more games.
People always accuse everyone else of cheating. The truth is, just because you lose doesn't mean someone else is cheating.
Originally posted by: Blastman
Read the header ... MFH ... this is not a repost. This is about 3DMark2001 ... not 3DMark2003.
Originally posted by: MercenaryForHire
This entire subject is a repost. nVidia's cheating in 3DMark2001 was already p00ped upon when the 3DMark03 issues were brought up.
I followed the ruckus pretty closely, and I didn't see anything about the current 3DM01SE info back then. If this is a repost, can someone link the original thread that referenced the B3D post about the new shader file findings?
Originally posted by: ketchup79
Oh wait, 3DMark isn't a game? 3DMark is very easy to make optimizations for, because it isn't a game; everything that happens in 3DMark is predictable. ATI does it, nVidia does it. It is not cheating, it's making optimizations. Think about this: you say that, "adjusted," the 8500 is as fast as a Ti4200. Can you think of any game where the 8500 is even close to the speed of a Ti4200? No? Then these adjusted scores are invalid in my book.
Tom's shows the 4200 is:
50% faster than the 8500 in Aquanox.
23% faster than the 8500 in Dungeon Siege.
45% faster than the 8500 in UT 2003 Antalus.
and finally, as it relates to this thread:
18% faster than 8500 in 3dm2k1se!!!!!
Like it or not, the 4200 IS faster than the 8500.
Originally posted by: NYHoustonman
Tom's shows the 4200 is:
50% faster than the 8500 in Aquanox.
23% faster than the 8500 in Dungeon Siege.
45% faster than the 8500 in UT 2003 Antalus.
and finally, as it relates to this thread:
18% faster than 8500 in 3dm2k1se!!!!!
Like it or not, the 4200 IS faster than the 8500.
EXACTLY. To me, if that is all true, then the 8500 has some serious optimizations going on as well.