Originally posted by: Gstanfor
When you think about the R600's rumored specs and the 115 gig or so of bandwidth on offer, about the only thing I can think of that would use that up is true 8x MSAA at high resolution. I have a feeling AMD and the fanatics' warcry will be how R600 does "real" AA and doesn't "cheat" with things like CSAA. We'll see how far wrong I am on this...
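(A quick back-of-the-envelope check on that post, as a sketch only: the ~115 GB/s figure is consistent with the rumored 512-bit bus at roughly 1.8 GT/s effective, and even a naive 8x MSAA cost model shows why AA is the obvious bandwidth sink. The bus width, data rate and cost model below are assumptions, not confirmed specs.)

```python
# Sketch only: rumored R600 figures, not confirmed specs.
bus_bits = 512
data_rate_gts = 1.8  # effective transfers per second, in billions (GT/s)
print(f"bandwidth: {bus_bits / 8 * data_rate_gts:.1f} GB/s")  # ~115.2 GB/s

# Naive 8x MSAA framebuffer traffic at 2560x1600, 60 fps: 8 samples of
# 32-bit color plus 32-bit depth/stencil, each written and read once per
# frame (real hardware compresses heavily, so treat this as a rough bound).
pixels = 2560 * 1600
traffic = pixels * 8 * (4 + 4) * 2           # samples * bytes * (write+read)
print(f"8x MSAA: ~{traffic * 60 / 1e9:.0f} GB/s at 60 fps")  # ~31 GB/s
# ...before texturing, geometry and overdraw, which is why high-res MSAA
# is about the only plausible consumer of that much bandwidth.
```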
Originally posted by: ronnn
Originally posted by: Gstanfor
When you think about the R600's rumored specs and the 115 gig or so of bandwidth on offer, about the only thing I can think of that would use that up is true 8x MSAA at high resolution. I have a feeling AMD and the fanatics' warcry will be how R600 does "real" AA and doesn't "cheat" with things like CSAA. We'll see how far wrong I am on this...
Of course if ATI has equal fps with better IQ, their PR would promote this. Anything else would be weird. Anyways, rumours are that Nvidia will OC the *** out of the 8800GTX and release a phantom edition for the crown. Of course Nvidia will say it wasn't phantom, as they hard-launched at least 6 of them.
This assumes that the R600 will actually perform, which seems like a big assumption at this time.
edit: actually 'whinge' is a Canadian word, for you USAers
Ultra overclocking pushes Peltier-cooled graphics beyond limits
Sparkling Calibre 8800GTX OC, this time in black
TWO MONTHS AGO, possibly out of concern over the impact the ATI R600 - sorry, AMD X2800XTX - would have had on it if it had come out on time (cough), Nvidia rolled out a factory-overclocked GeForce 8800GTX edition based on select, extra-good dies.
And so did quite a number of its card vendors. Most of these cards, like BFG's and Asus's, were water-cooled, but just one still used a Peltier thermoelectric cooler - you guessed right, it was Sparkle again with its Calibre card.
While the water-cooled cards had default factory settings of 630 MHz for the GPU and 1030 MHz (DDR-2060) for the memory, the Calibre P880+ had the same GPU speed but a slightly lower memory setting of 980 MHz (DDR-1960). Hmm, let's see if this spec difference led to any real-world performance drop.
[Graphs, charts and pics omitted]
CONCLUSION:
In summary, by going from the standard to the OC edition of the GeForce 8800GTX, you get some 10% extra 3DMark performance, which is expected as both GPU and memory clocks are pushed up by roughly that amount. If you push it to its limits, at least in this Calibre P880+'s case, you get an additional 4%, consistently at both UXGA and WQXGA resolutions.
.....
I also ran a test where the GPU was overclocked to the same speed, but the memory ran at 'only' 2 x 1010 MHz. Guess what: there was a near-zero difference in the results - 11538 3DMark06 vs 11553 3DMark06, or a 0.1% total speed loss for a 2% memory clock loss. Therefore, at ~GDDR3-2000 speeds, the GeForce 8800GTX's memory requirement is already maxed out - slight RAM speed changes, up or down, don't affect the results.
So, Calibre's Peltier does a very good job of getting the G80 GPU to go quite a bit beyond even its OC settings - 650 MHz is not bad at all. Its memory cooling performance isn't yet up to par with its water-cooled siblings'. My belief is that Sparkle could have done better there, as the simple heat spreader used doesn't even have full cooling fins to maximise the heat dissipation area for the RAMs or the NVIO. On the other hand, GDDR3-2060 working performance is more than good enough for the overclocked G80 anyway - the benchmarks don't improve with the memory speed bumps.
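(The review's scaling numbers are easy to sanity-check. A minimal sketch, using the article's own figures plus the 8800GTX's published 384-bit, 900 MHz stock memory spec:)

```python
BUS_BYTES = 384 // 8  # G80 8800GTX memory bus width, published spec

def bandwidth_gbps(mem_mhz):
    # GDDR3 moves two transfers per clock, hence the factor of 2
    return BUS_BYTES * mem_mhz * 2 / 1000

for mhz in (900, 1010, 1030):  # stock, the 'slow' run, the OC ceiling
    print(f"{mhz} MHz -> {bandwidth_gbps(mhz):.1f} GB/s")
# 900 -> 86.4, 1010 -> 97.0, 1030 -> 98.9 GB/s

# The reviewer's 3DMark06 delta across that last ~2% of memory clock:
print(f"score delta: {(11553 - 11538) / 11553:.1%}")  # ~0.1%
```

Roughly 2 GB/s of extra bandwidth for a 0.1% score change is about as saturated as it gets, which matches the reviewer's conclusion.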
Originally posted by: BassBomb
I hope their midrange is good; I've never seen them as good buys.
Originally posted by: Smurf
Originally posted by: BassBomb
I hope their midrange is good; I've never seen them as good buys.
The ATi x800 XL was an excellent buy!
Originally posted by: 5150Joker
Originally posted by: chizow
Originally posted by: 5150Joker
Several developers over at Shacknews have chimed in on the state of the PC gaming industry. In their view, PC gaming is dying, and the incentive for them to continue developing on the PC as a primary platform is quickly shrinking. I'd be far more concerned about that than about how late the R600 is or whether or not someone is a fanboy.
I don't doubt it. It's kind of sad when there are more quality titles on consoles that have been out for less than a year than on the PC. PC gaming has turned into endless anticipation followed by disappointment, and then back to anticipation. Throw in really expensive hardware that costs as much as or more than consoles to maintain, and it's looking pretty grim for PC gaming. I'll pick up an Xbox 360 and/or PS3 in the next few months and maintain my PC simultaneously, but I doubt I'll even bother upgrading my PC for games the next time around.
That's essentially what I've done: I moved away from constantly upgrading my PC to enjoying a large library of games on a console. I've got over 10 games on the 360 already after having owned it for less than a year; with the PC I'd be lucky to buy one game a year.
Originally posted by: Wreckage
Originally posted by: josh6079
Many driver bugs won't be found until the unwashed masses get their hands on the card.
QFT. Nothing helps troubleshooting a driver more than having the entire world find bugs with it.
I can't wait for "akshayt" to get an R600; if it has a bug, he will surely find it.
Originally posted by: ronnn
Of course if ATI has equal fps with better IQ, their PR would promote this. Anything else would be weird. Anyways, rumours are that Nvidia will OC the *** out of the 8800GTX and release a phantom edition for the crown. Of course Nvidia will say it wasn't phantom, as they hard-launched at least 6 of them.
This assumes that the R600 will actually perform, which seems like a big assumption at this time.
Originally posted by: jrphoenix
Originally posted by: 5150Joker
That's essentially what I've done: I moved away from constantly upgrading my PC to enjoying a large library of games on a console. I've got over 10 games on the 360 already after having owned it for less than a year; with the PC I'd be lucky to buy one game a year.
Same here... I hadn't owned a console since the Intellivision (was that in the '80s or '90s?). I bought a 360 and haven't bought a PC game since. I have 9 Xbox 360 games now.
Apart from these products, GECUBE will also be unveiling its full series of ultimate 3D graphics cards, using the latest chipsets from ATI with support for the new-generation DX10 technology, to its VIP business partners in its confidential (NDA) room. Don't miss out on this unique opportunity from GECUBE at CeBIT!
Originally posted by: Matt2
In Rollo's defense, he never flat out LIED about anything he said regarding video cards. He lied about AEG, but not about his video card knowledge.
He may have regularly blown the issue way out of proportion, and he may have misrepresented the truth, but to my knowledge he was never completely false.
A good example of this was the R520's (and R580's?) inability to do vertex texture fetch. He created a huge flame war over it, and all it really did was let Nvidia users enable realistic water settings in Pacific Fighters. It was as insignificant a victory for Nvidia as could be, but in the end he was right: ATI cards couldn't do vertex texture fetch.
AMD's next-generation value and mainstream products are set to bring DirectX 10 and high-definition video playback to the masses.
...
All RV630 boards feature 128-bit memory interfaces and occupy a single slot.
At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus.
...
Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter in time to take on AMD's upcoming RV630 and RV610.
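(The quoted clocks make the midrange bandwidth gap easy to quantify; a sketch, with the 8800GTX's stock spec included for comparison:)

```python
def bandwidth_gbps(bus_bits, mem_mhz):
    return bus_bits / 8 * mem_mhz * 2 / 1000  # DDR: two transfers per clock

print(f"8600GTS: {bandwidth_gbps(128, 1000):.1f} GB/s")  # 32.0 GB/s
print(f"8800GTX: {bandwidth_gbps(384, 900):.1f} GB/s")   # 86.4 GB/s
```

So the 128-bit parts on both sides ship with roughly a third of the high-end card's memory bandwidth, which is the main thing to watch in the midrange fight.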
Originally posted by: ronnn
More proof it exists. Props to Arnold Beckenbauer of b3d for finding this.
Apart from these products, GECUBE will also be unveiling its full series of ultimate 3D graphics cards, using the latest chipsets from ATI with support for the new-generation DX10 technology, to its VIP business partners in its confidential (NDA) room. Don't miss out on this unique opportunity from GECUBE at CeBIT!
*Exclusive GECUBE Product with the ATI World's most powerful Chipset
GECUBE will be unveiling at CeBIT its secret weapon (NDA) for the second quarter of 2007. Using the latest ATI core with support for the new generation DX10 technology, don't miss this chance to witness its jaw-dropping performance for yourself at CeBIT!
Originally posted by: Gstanfor
When you think about the R600's rumored specs and the 115 gig or so of bandwidth on offer, about the only thing I can think of that would use that up is true 8x MSAA at high resolution.
If they've figured out how to do 6 multi-samples in one pass, they might offer 12xMSAA. Or they may have hybrid modes, which would explain why nVidia released theirs.
Originally posted by: Gstanfor
I have a feeling AMD and the fanatics' warcry will be how R600 does "real" AA and doesn't "cheat" with things like CSAA.
Why would they? 8xQ is genuine 8xMSAA.
Originally posted by: SilentRunning
You missed this quote:
Originally posted by: BFG10K
Originally posted by: Gstanfor
When you think about the R600's rumored specs and the 115 gig or so of bandwidth on offer, about the only thing I can think of that would use that up is true 8x MSAA at high resolution.
If they've figured out how to do 6 multi-samples in one pass, they might offer 12xMSAA. Or they may have hybrid modes, which would explain why nVidia released theirs.
Originally posted by: Gstanfor
I have a feeling AMD and the fanatics' warcry will be how R600 does "real" AA and doesn't "cheat" with things like CSAA.
Why would they? 8xQ is genuine 8xMSAA.
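(For reference on the 8xQ point above: G80's AA modes differ in how many true color/Z multisamples they store versus cheap coverage-only samples. This listing follows NVIDIA's public CSAA description; treat it as an orientation sketch, not an exhaustive mode list.)

```python
# (color/Z multisamples, coverage samples) per G80 AA mode
modes = {
    "4x MSAA":  (4, 4),
    "8x CSAA":  (4, 8),   # CSAA: extra coverage samples, same 4 color/Z
    "16x CSAA": (4, 16),
    "8xQ":      (8, 8),   # genuine 8x MSAA, as BFG10K says
    "16xQ":     (8, 16),
}
for name, (color, coverage) in modes.items():
    print(f"{name:<8} {color} color/Z samples, {coverage} coverage samples")
```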