8800GTX preview


Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
How much more intensive is Extraction Point compared to the original?

And it's obvious why we didn't get FEAR or Oblivion benchies.... What would we have to look forward to the 8th for?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
And it's obvious why we didn't get FEAR or Oblivion benchies.... What would we have to look forward to the 8th for?
Hardware availability instead of just software previews.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
A few things I picked up from the DT preview and comments that haven't been mentioned in the thread (that I've seen) which I found interesting:

1) It uses two PCI-e connectors, so the load goes over two rails and you don't need a PSU with a super strong single rail (does this make sense?)

2) DT says NV has ramped up production for weeks and there should be lots of cards on Nov 8 for a real hard launch (not 7800GTX 512MB style)
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: KeithTalent

You are the only one I have ever seen complain about this on AT. I have never experienced flickering of any kind, but maybe I am just lucky

Text

 

thilanliyan

Lifer
Jun 21, 2005
12,036
2,248
126
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

Maybe Dailytech used a different part of the game to benchmark than Dailytech.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: thilan29
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

Maybe Dailytech used a different part of the game to benchmark than Dailytech.
That'd be a neat trick.
 

thilanliyan

Lifer
Jun 21, 2005
12,036
2,248
126
Originally posted by: josh6079
Originally posted by: thilan29
Originally posted by: josh6079
Those scores look really degrading to the X1950XTX.

Here Anandtech scored ~20 more than they did with the same card at a higher resolution with the same amount of AA in Quake 4.

Maybe Dailytech used a different part of the game to benchmark than Dailytech.
That'd be a neat trick.

Sorry, I meant different timedemos... and yeah, I'm late... 'twas already covered.

I have to say...I am impressed by these scores. Hopefully there will be plenty available at launch for decent prices.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
How much more intensive is Extraction Point compared to the original?
A lot - it's almost like a next generation game. It uses much larger areas and many more shaders than the original game.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Originally posted by: BFG10K
How much more intensive is Extraction Point compared to the original?
A lot - it's almost like a next generation game. It uses much larger areas and many more shaders than the original game.

Indeed. I'm sorry to say, but not even my 7950GX2 could handle it maxed out.
 

jonnyGURU

Moderator, Power Supplies
Moderator
Oct 30, 1999
11,815
104
106
Originally posted by: gramboh
A few things I picked up from the DT preview and comments that haven't been mentioned in the thread (that I've seen) which I found interesting:

1) It uses two PCI-e connectors, so the load goes over two rails and you don't need a PSU with a super strong single rail (does this make sense?)

No. Not "no, it doesn't make sense," but "no, you are wrong." Well... no, depending on the PSU you're using. Some PSUs do put each PCI-e connector on a separate rail, but MOST have both PCI-e connectors on the same rail.

Originally posted by: gramboh
2) DT says NV has ramped up production for weeks and there should be lots of cards on Nov 8 for a real hard launch (not 7800GTX 512MB style)

This is true. They've been producing cards and shipping stock all week getting ready for the launch.
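
To put rough numbers on the rail question, here's a minimal sketch; the ~145 W card draw is just an assumed figure for illustration, while ~75 W each is the usual rating for the x16 slot and for a 6-pin PCI-e connector:

```python
# Rough rail math for the two-connector question. The 145 W card figure is
# an assumption for illustration; ~75 W is the usual limit for the PCIe x16
# slot and for each 6-pin PCI-e connector.

SLOT_LIMIT_W = 75.0   # power the x16 slot itself can supply
RAIL_VOLTAGE = 12.0

def amps_per_rail(card_watts, rails_feeding_connectors):
    """12 V current per rail if the connector load splits evenly across rails."""
    connector_watts = max(card_watts - SLOT_LIMIT_W, 0.0)  # slot covers the rest
    return connector_watts / rails_feeding_connectors / RAIL_VOLTAGE

card_watts = 145.0  # assumed load draw for the card
print(f"Both connectors on one 12 V rail: {amps_per_rail(card_watts, 1):.1f} A")
print(f"Connectors on separate rails:     {amps_per_rail(card_watts, 2):.1f} A")
```

Either way each 6-pin connector stays well under its ~75 W rating; the only difference is whether one 12 V rail carries the whole connector share or half of it, which is the "depends on how the PSU wires its PCI-e leads" point above.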
 

40sTheme

Golden Member
Sep 24, 2006
1,607
0
0
I really want to see some things on the R600... This card is nice, but ATi has some time to look at that one and say "How can we improve on that?" So I really want to see what they're concocting.
 

imported_Truenofan

Golden Member
May 6, 2005
1,125
0
0
I'm not a big fan of either Nvidia or ATI; I go for whoever has the best performance within my budget. This may have a huge boost, but not even half of its features are being used yet. Seems kinda pointless unless you just wanna go around bragging to everyone that you own one.
 

Sentry2

Senior member
Mar 21, 2005
820
0
0
I hope this launch will be like the 7800GTX launch last year, when you could buy one on launch day for $599 without having to worry about price gouging. People who think this will be like the 7800GTX 512 launch are wrong. That was basically a few cards available on (some before) launch day and then gone almost instantly. I'm guilty: I bought 2 at Monarch for $699 ea., then sold them 2 months later and still turned a $200 profit. Hopefully the 8800GTX's will go for $599 too. Probably $649 though.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.

I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Whoah -- it barely consumes more power than the X1950XTX!!! That's impressive! Idle still sucks down extra power, though -- probably the extra memory channels along with the huge die.

HL2/Prey/Q4 show how this card can stretch its legs at 1600x1200 (or above, by implication) with AA. Very good performance so far! This thing in SLI (for those who can afford it) is going to be nuts!
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Sentry2

Hopefully the 8800GTX's will go for $599 too. Probably $649 though.

Hahah, you are funny. Let us recall the days of the $199 GeForce 4200 when it first came out, or the $399 9800 Pro, the top-of-the-line card at the time, which delivered top-notch performance. Last time I checked, inflation was around 2-2.5%, and there is hardly a good reason for pricing a graphics card at $599, and especially $649, considering a top card cost $399 in 2002/3.

This is clearly a sign of poor economies of scale. Since so few people buy high end cards, NV and ATI are forced to increase the prices on their top cards to cover their R&D costs. Higher prices result in fewer and fewer people buying high end cards, and NV and ATI are forced again to raise their prices and the cycle continues.

Considering prices of hard drives, RAM and CPUs have more or less stayed the same or decreased, video cards should have remained similarly priced. Yet they seem to be the only system component that continues to rise in price regardless.
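
To make the inflation point concrete, here's a quick sketch using the figures from the post (the 2-2.5% rate and the $399 price from 2002/3; the exact year gap is an assumption):

```python
# What a $399 card from 2002/3 would cost in 2006 if its price had only
# tracked ~2.5% annual inflation (rate and launch price taken from the post).

launch_price = 399.0
annual_inflation = 0.025

for years in (3, 4):  # 2003 -> 2006 or 2002 -> 2006
    adjusted = launch_price * (1 + annual_inflation) ** years
    print(f"{years} years of inflation: ${adjusted:.0f}")
```

Inflation alone would put a $399 card somewhere around $430-440 by 2006, nowhere near $599-649; that gap is what the post attributes to poor economies of scale at the high end.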
 

Sentry2

Senior member
Mar 21, 2005
820
0
0
I wasn't saying they'll be $599. I said hopefully. Prices haven't inflated over the past year that much, guy. Go laugh at someone else. Damn, can't we at least hope anymore?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.

I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.

Click the link.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Link

Someone has picked up an 8800GTX and is benching it!!

Quick results -

FEAR 1600x1200, all in-game settings maxed out with soft shadows enabled.
FEAR bench one
Min: 41
Avg: 83
Max 197

Note - 100% over 40 FPS.

F.E.A.R.

All possible settings maxed. 4xAA, 16xAF, 1600x1200, everything maximum, soft shadows on.

Fear Bench Two

http://s2.supload.com/image.php?get=fear1600x1_b7c75e2203a3dca51.jpg
Min: 40
Avg: 76
Max 175

Note - SOFT SHADOWS :Q (I thought SS can't be done with AA?)

Fear Bench Three

FEAR, 1600x1200, 4xAA, 16xAF, all settings maxed except soft shadows.
Min: 41
Avg: 81
Max 185

Fear Bench Four

FEAR maxed out at 1600x1200 with 16xAF, 16xAA:

Min: 16
Avg: 34
Max 84

Pretty impressive.

What the control panel looks like on the 8800GTX


3DMark06 on an A64 4000+
~6500, but check out the SM 3.0 and SM 2.0 scores. Impressive.

3DMark05

~12000 on the A64 4000+

Some Oblivion performance:
Anyways, here's some Oblivion at 1600x1200, default in-game settings, HDR on, AA off, with distant landscape, buildings, and trees enabled in the control panel:

60 FPS average in the foliage area, 43 FPS average in the Oblivion gate area.

Okay, at 1600x1200 in Oblivion with all the sliders and options maxed, and HDR (no AA) it gets on average:

30 FPS at Oblivion gate

42 FPS in the foliage area


Pretty damn good if I do say so myself.

Final note - All this was done using beta drivers never intended for the public. I guess the shipping/review drivers are a lot different from these.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.

I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.

Click the link.

I guess this also explains why power consumption is not as ridiculous as initially expected.

Acanthus predicted months ago that it was going to be on 80nm.


 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Dethfrumbelo
Originally posted by: Cookie Monster
Originally posted by: Avalon
Originally posted by: Cookie Monster
But it looks like G80 is on 80nm.

Link

While ATI prepares to roll out five 80nm graphics processor units (GPUs) in November, Nvidia plans to speed up its 80nm production schedule by rolling out revised versions of its G72 and G73 chips to compete with the rival, according to a Chinese-language Commercial Times report.

The revised versions of G73 (codenamed G73-B1) and G72, along with the highly anticipated G80, will be targeted to compete with ATI solutions from mid-range to high-end market segments, the paper noted.

This could explain the lower power consumption.

I didn't see anyone address this, but I thought I should point out that your paragraph does not in any way imply that G80 will be on an 80nm process. It says G72 and G73 will, and then further goes on to say that G72, G73, and G80 will be ATI's competition.

Click the link.

I guess this also explains why power consumption is not as ridiculous as initially expected.

Acanthus predicted months ago that it was going to be on 80nm.

All it has to be is a single-core GPU. Then I get a full page of apology from Acanthus.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: nanaki333
Originally posted by: Elfear
Originally posted by: Centurin
If you want the new Ati card, you'll be waiting more than a month or two. More like 4 or 5.

Latest rumor has R600 coming out the last week of January.

Ack! Where'd you see that at?

I'm so torn. It's not like there are going to be many games taking advantage of DX10 immediately, so I COULD wait.

They better have a damn good card coming out to be 2 months behind Nvidia!


Just wait and see... If anything, the G80 will be more available and maybe slightly more reasonably priced.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Truenofan
I'm not a big fan of either Nvidia or ATI; I go for whoever has the best performance within my budget. This may have a huge boost, but not even half of its features are being used yet. Seems kinda pointless unless you just wanna go around bragging to everyone that you own one.

Aye... DX10 is a ways off, and any games that use DX10 are even further off. By the time games really NEED it (and by NEED I mean the gap will be like comparing an X1900 to a 6800), there will be a refresh of these cards with more performance.

I'm going to say that a good 90% of gamers don't have a monitor capable of taking advantage of this card. The other 10% are stuck waiting for something this card can do that a current X1950 (or SLI/Crossfire config) cannot. I'm not talking about FPS numbers here; that's not what I'm interested in at all. There is a point where a game is 100% smooth, not jittery, and very playable. What I'm talking about is using all the features of the card. That won't happen soon at all.
 