Hardware availability instead of just software previews. And it's obvious why we didn't get FEAR or Oblivion benchies... what would we have to look forward to on the 8th?
Originally posted by: josh6079
Those scores make the X1950XTX look really bad.
Here, Anandtech scored ~20 more in Quake 4 than DailyTech did with the same card, at a higher resolution and with the same amount of AA.
That'd be a neat trick.
Originally posted by: thilan29
Originally posted by: josh6079
Those scores make the X1950XTX look really bad.
Here, Anandtech scored ~20 more in Quake 4 than DailyTech did with the same card, at a higher resolution and with the same amount of AA.
Maybe DailyTech used a different part of the game to benchmark than Anandtech did.
Originally posted by: josh6079
That'd be a neat trick.
Originally posted by: thilan29
Originally posted by: josh6079
Those scores make the X1950XTX look really bad.
Here, Anandtech scored ~20 more in Quake 4 than DailyTech did with the same card, at a higher resolution and with the same amount of AA.
Maybe DailyTech used a different part of the game to benchmark than Anandtech did.
How much more intensive is Extraction Point compared to the original?
A lot - it's almost like a next-generation game. It uses much larger areas and many more shaders than the original game.
Originally posted by: BFG10K
How much more intensive is Extraction Point compared to the original?
A lot - it's almost like a next-generation game. It uses much larger areas and many more shaders than the original game.
Originally posted by: gramboh
A few things I picked up from the DT preview and comments that haven't been mentioned in the thread (that I've seen) which I found interesting:
1) It uses two PCIe power connectors, so the load is split across two rails and you don't need a PSU with a super-strong single rail (does this make sense? see the rough numbers after these quotes)
Originally posted by: gramboh
2) DT says NV has been ramping up production for weeks, so there should be lots of cards on Nov 8 for a real hard launch (not 7800GTX 512MB style)
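To put rough numbers on point 1: here's a minimal sketch of the rail math, assuming a hypothetical ~145W card draw plus the PCIe spec ceilings of 75W from the slot and 75W per 6-pin connector. The card's real power draw isn't in the preview, so treat the wattage as a placeholder.

```python
# Back-of-the-envelope: why two 6-pin PCIe connectors ease per-rail load.
# The 145W card draw is an illustrative assumption, not an official G80
# number; the 75W slot and 6-pin figures are the PCIe spec ceilings.

PCIE_SLOT_W = 75.0  # max power the PCIe x16 slot itself can deliver
SIX_PIN_W = 75.0    # max power per 6-pin PCIe connector
RAIL_V = 12.0       # the connectors draw from the PSU's 12V rail(s)

def per_rail_amps(card_watts: float, connectors: int) -> float:
    """Amps each 12V rail must supply when the draw beyond the slot
    is split evenly, one connector per rail."""
    beyond_slot = max(card_watts - PCIE_SLOT_W, 0.0)
    per_connector = beyond_slot / connectors
    assert per_connector <= SIX_PIN_W, "exceeds the 6-pin limit"
    return per_connector / RAIL_V

print(f"1 connector: {per_rail_amps(145, 1):.2f}A from a single 12V rail")
print(f"2 connectors: {per_rail_amps(145, 2):.2f}A from each of two rails")
```

Splitting the connector draw across two 12V rails roughly halves the current each rail has to source, which would be why a PSU with modest individual rails could still feed the card.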
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.
Link
While ATI prepares to roll out five 80nm graphics processing units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with its rival, according to a Chinese-language Commercial Times report.
The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions in the mid-range to high-end market segments, the paper noted.
This could explain the lower power consumption.
Originally posted by: Sentry2
Hopefully the 8800GTXs will go for $599 too. Probably $649, though.
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.
Link
While ATI prepares to roll out five 80nm graphics processing units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with its rival, according to a Chinese-language Commercial Times report.
The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions in the mid-range to high-end market segments, the paper noted.
This could explain the lower power consumption.
I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will be, and then goes on to say that G72, G73, and G80 will be ATI's competition.
Anyways, here's some Oblivion at 1600x1200, default in-game settings, HDR on, AA off, distant landscape, buildings, and trees enabled in the control panel:
60FPS average in the foliage area, 43FPS average in the Oblivion gate area.
Okay, at 1600x1200 in Oblivion with all the sliders and options maxed and HDR on (no AA), it gets on average:
30 FPS at the Oblivion gate
42 FPS in the foliage area
Pretty damn good if I do say so myself.
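A side note on averages like these: if you compute "average FPS" from logged frame times, the honest figure is total frames divided by total time; naively averaging the instantaneous per-frame FPS overweights the fast frames. A minimal sketch, using made-up frame times rather than anything from these runs:

```python
# Average FPS from logged frame times: total frames / total seconds.
# The frame times below are made-up sample data, not real benchmark logs.
frame_times_ms = [16.0, 18.5, 40.0, 15.5, 17.0]

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds  # the meaningful average

# Naive alternative: mean of instantaneous per-frame FPS (inflated,
# because brief fast frames count as much as long slow ones).
naive_fps = sum(1000.0 / t for t in frame_times_ms) / len(frame_times_ms)

print(f"average FPS (frames / time): {avg_fps:.1f}")    # ~46.7
print(f"naive mean of per-frame FPS: {naive_fps:.1f}")  # ~53.0
```

The gap between the two figures grows with how spiky the frame times are, which is worth keeping in mind when comparing single "average FPS" numbers across sites.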
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.
Link
While ATI prepares to roll out five 80nm graphics processing units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with its rival, according to a Chinese-language Commercial Times report.
The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions in the mid-range to high-end market segments, the paper noted.
This could explain the lower power consumption.
I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will be, and then goes on to say that G72, G73, and G80 will be ATI's competition.
Click the link.
Originally posted by: Dethfrumbelo
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.
Link
While ATI prepares to roll out five 80nm graphics processing units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with its rival, according to a Chinese-language Commercial Times report.
The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions in the mid-range to high-end market segments, the paper noted.
This could explain the lower power consumption.
I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will be, and then goes on to say that G72, G73, and G80 will be ATI's competition.
Click the link.
I guess this also explains why power consumption is not as ridiculous as initially expected.
Acanthus predicted months ago that it was going to be on 80nm.
Originally posted by: nanaki333
Originally posted by: Elfear
Originally posted by: Centurin
If you want the new ATI card, you'll be waiting more than a month or two. More like 4 or 5.
Latest rumor has R600 coming out the last week of January.
Ack! Where'd you see that?
I'm so torn. It's not like there will be many games taking advantage of DX10 immediately, so I COULD wait.
They'd better have a damn good card coming out if they're going to be 2 months behind Nvidia!
Originally posted by: Truenofan
I'm not a big fan of either Nvidia or ATI; I go for whoever has the best performance within my budget... This card may have a huge boost, but not even half of its features are being used yet. Seems kinda pointless unless you just wanna go around bragging to everyone that you own one.