X800pro or 6800GT? ***POLL***

Page 2 - AnandTech Forums

ShinX

Senior member
Dec 1, 2003
300
0
0
Want to know something funny? If the 6800 NU hadn't taken a bashing from the X800 Pro, the GT wouldn't even be here. I wonder what would happen to the GT if we OC'd a 12-pipe X800 Pro or unlocked the pipelines.
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
I'd take the X800 Pro. In pure fillrate the X800 Pro is a little faster:

X800 Pro: 12 pipes x 475 MHz = 5700
6800GT: 16 pipes x 350 MHz = 5600

And in quite a few reviews the X800 Pro is competing with the 6800U, let alone the 6800GT. (Hint: a lot of reviews, like FiringSquad and Anandtech, didn't turn off the brilinear filtering on the 6800 and ran benches with the hacked-up 61.11 drivers.)
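A quick sanity check of the fillrate arithmetic above (pixel pipelines times core clock, giving megapixels per second; the helper name is just for illustration, the specs are the ones quoted):

```python
def fillrate_mpix(pipes: int, core_mhz: int) -> int:
    """Theoretical peak pixel fillrate in megapixels per second."""
    return pipes * core_mhz

print("X800 Pro:", fillrate_mpix(12, 475))  # 5700
print("6800GT: ", fillrate_mpix(16, 350))   # 5600
```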

In this hardware.fr review, in 8 (generally) newer games:

UT3
Farcry
Tomb Raider
Splinter Cell
Il-2 FB
Warcraft III
Colin McRae 04
FIFA 2004

the X800 Pro matched the 6800U at 1600x1200 with 4xAA/8xAF in overall performance in those 8 games.


The other thing that concerns me about the 6800 is that its HDR (High Dynamic Range) shader rendering performance was terrible compared to the R420 in the TechReport review:

X800 XT: 131.29 fps
6800U: 58.62 fps
9800XT: 45.11 fps

The R420 was running twice as fast. I'm not sure what to make of that yet. But the 6800 has to run at 32-bit precision to render HDR content properly, so I have some concerns about the 6800's 32-bit performance (which it is going to have to use for PS3.0).
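The "twice as fast" claim can be checked directly from the quoted fps figures (a rough sketch; the ratio ignores any differences in test setup):

```python
# fps figures from the TechReport HDR test quoted above
fps = {"X800 XT": 131.29, "6800U": 58.62, "9800XT": 45.11}

# Speedup of the R420 (X800 XT) over the NV40 (6800U)
speedup = fps["X800 XT"] / fps["6800U"]
print(round(speedup, 2))  # 2.24 -- roughly "twice as fast"
```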
 

Robor

Elite Member
Oct 9, 1999
16,979
0
76
Originally posted by: Dean
Considering the X800 Pro competes well with the 6800 Ultra now and also shows very good overclocks, it's an X800 Pro vote from me, which is even more amazing considering it's running on 12 pipelines compared to 16. When the final 6800GTs arrive they will be toting single-slot designs, which will translate to lower overclocks.
I think anyone planning for a serious overclock is going to use a (better) aftermarket cooling solution.
 

Regs

Lifer
Aug 9, 2002
16,665
21
81
the X800 Pro matched the 6800U at 1600x1200 with 4xAA/8xAF in overall performance in those 8 games.

You picked one benchmark out of many to show us that it only matches the NV40? In all the other benchmarks, though, the NV40 is clearly the better performer.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Are we going to hand pick benchmarks to prove our points? For every hand picked benchmark you show an X800 Pro winning we can find one where it loses.

The cycle gets old.

BTW to the above poster. That is the saddest display yet. You show a benchmark where the Nvidia card is running at 8X AA vs 6X on the ATI card? Pathetic!
 

TStep

Platinum Member
Feb 16, 2003
2,460
10
81
Too early to tell which one is the sleeper bargain, if it can be called a bargain at these prices. At $300-plus, it had better clean my house when it's not churning out 3D graphics. I abstain and will hold onto my overclocked 9800np for a while longer.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: TimisoaraKill
I would take the X800 Pro; when you load the cards @ MAX the X800 Pro beats the 6800U hands down. And it's not only that: ATI has better D3D support, which is good for me since I'm mostly into sim/war games like Op Flashpoint and America's Army.
Take a look at how the 6800U performs when fully loaded:


http://www.driverheaven.net/reviews/6800x800pro/maxxing.htm

Yeah, but this comparison is somewhat unfair because everyone knows Nvidia cards suck at 8xAA (mixed mode). Besides, you can't simply say the 6800GT or the X800 Pro is better, because it all depends on what games you play. If you play Far Cry a lot, it is a better card, especially since you can't buy the GT right now.

This is like comparing the newly announced 3.46 P4 EE with its 1066 FSB to a 3400+ Socket 754 A64 in gaming. We all know what the technology is, but not enough extensive benchmarks or reviews have taken place to be able to make a reasonable judgment. So far Nvidia has improved Far Cry performance slightly, but it is not enough, and it loses in all shader-intensive games. Perhaps driver updates will fix that; maybe ATI will improve its drivers as well.

It seems to me the 6800GT is the better overall card, as it is much closer in performance to the 6800 Ultra than the X800 Pro is to the X800 XT. So if people pick the X800 Pro over the 6800GT, they are essentially saying it's almost as good as a 6800 Ultra, since the performance difference is around 10% at best and a 50 MHz overclock. I think people are choosing the X800 Pro for availability reasons, and because it does seem to be faster in the latest new games. But with time Nvidia might turn things around slightly. I picked the GT card, BTW.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I picked the GT, but only if you can OC the crap out of it, because the Pro seems to be doing just fine stock.

The X800 might be better because, even though they are beta drivers, they show much less CPU limitation. Better out-of-the-box performance with the X800.
 

DerwenArtos12

Diamond Member
Apr 7, 2003
4,278
0
0
This is awesome, guys, thanks. Now what are you all thinking about when PCI-E becomes the newest high end? I know ATI will natively support it with this generation of cards while NV will not support it natively. How do you think this will affect things? I am predicting not only huge gains coming from PCI-E, but that the native support ATI has will be very evident.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
"Native" or not, it does not matter. At least not with NV's bridge chip. It will be used for quite some time to first- convert the native AGP 6 series to PCX.. then when the 6 series (and beyond) are "native" PCX they will be used when needed to put AGP models on the market.

Their is no latency introduced with the NV HIS chip, and no disadvantage to using one with it at all to a "native" PCX card.
It really was the smart thing to get designed and done because having it done (and done right) allows Nvidia to save much more money than ATI who will be concurrently producing native PCX and AGP boards.

BTW, ATI is as we speak creating a bridge chip of their own to use as Nvidia has. Its just not done so they are forced to do it the hard way (produce both kinds) for now. But it is not known if their bridge chip will be able to do BOTH PCX->AGP, and AGP->PCX. Its probably likely since they will only be needing it, that it will be PCX->AGP, as I suspect whenever they finish the bridge chip they will start producing only PCX boards and bridging models to AGP..

same exact thing NV is doing/has already done.
But NVs chip has already been examined and has no detrimental flaws that could ever affect cards as "slow" as this generation.

It is cheaper for ATI/NV to use the bridge chips in a time of conversion like now. They only have to create AGP native cards (when AGP is taking dominant sales, like now) when needed.. and create the few PCX cards with the chip that they need..

then when PCX is nearing dominance they can switch their line to PCX native signaling and use the bridge to create the few AGP they will still sell.

To answer your question (in case you missed my answer in that explanation), no.
Not having a "native" board, at LEAST within the Nvidia products will not hurt your gaming performance at all, and never will with these GPUs.
Also, to further clarify some things you mentioned-
Their wont be huge gains out of PCX for quite some time and when PCX is dominant NV will be producing native boards regardless.

But even if you get a 6800GT PCX board with a HIS chip, as I am.. you wont have any problems due to that fact for the life of the card.

The ONLY argument that is valid against it (and as I said, when most people are buying PCX cards, NV will be creating native PCX boards anyway), is that it is another point of failure.
But from what I know about the boards and the chip... it will be one of the last parts to quit.. especially compared to the core and the memory.. with the fan probably actually going first!

And most people, myself included get rid of a card long before even the mechanical portion (fan) quits!

So no worries!
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
It depends on who you believe. ATI says this sort of bridge would create a problem, and naturally Nvidia says it's not an issue. I haven't seen any PCI-E benches around.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
I've investigated both sides' claims and can't find much credibility to ATI's. From what I gathered, they are basing their statements on what *could* go wrong with a bridge chip,
not on what actually is wrong with Nvidia's. They haven't even seen or studied Nvidia's, so it was hard for me to take ATI's word over NV's.
Then the kicker (for me) was when I heard ATI is also creating a chip that does the very same thing; at that point I threw out the notion that ANY bridge chip (as ATI's comments seemed to imply) would inherently have those problems.
Because if bridging did introduce real problems, then our big buddy ATI would CERTAINLY not use one and abuse the consumer the way big bad Nvidia supposedly does.
 

Bar81

Banned
Mar 25, 2004
1,835
0
0
Originally posted by: Genx87
Are we going to hand pick benchmarks to prove our points? For every hand picked benchmark you show an X800 Pro winning we can find one where it loses.

The cycle gets old.

BTW to the above poster. That is the saddest display yet. You show a benchmark where the Nvidia card is running at 8X AA vs 6X on the ATI card? Pathetic!


Isn't that the truth (on both counts). That DriverHeaven "review page" is probably one of the more biased benchmarks I've seen in quite some time. Can't anyone READ anymore? :roll:
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
Probably won't matter, just as 4x to 8x was.
I don't know. 4x to 8x increased the raw bandwidth of the AGP bus, AFAIK, but the bandwidth was already so large at 4x that it didn't make a difference. A bridge solution, though, seems to increase the raw time of data transmission to and from the GPU, which could be a big deal. It's like turning down the faucet, while 4x vs 8x is like buying a bigger hose. That's my understanding of the bridge argument. With CPUs I hear a socket-to-socket converter really kills performance.

In any case, PCI-E only seems to be helping in a few small areas, so I think sticking with AGP and getting a cheaper mobo is a good idea.
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: GeneralGrievous
Probably won't matter, just as 4x to 8x was.
I don't know. 4x to 8x increased the raw bandwidth of the AGP bus, AFAIK, but the bandwidth was already so large at 4x that it didn't make a difference. A bridge solution, though, seems to increase the raw time of data transmission to and from the GPU, which could be a big deal. It's like turning down the faucet, while 4x vs 8x is like buying a bigger hose. That's my understanding of the bridge argument. With CPUs I hear a socket-to-socket converter really kills performance.

In any case, PCI-E only seems to be helping in a few small areas, so I think sticking with AGP and getting a cheaper mobo is a good idea.

It does not increase the raw time of data transmission. At least Nvidia's chip doesn't.
It works within the inherent latencies of the GPU and its components and times its signals accordingly, so that latency is no greater with the HSI chip than without it.


And the major advantage of PCX, from what I know, is that it allows a high amount of data to be transmitted BOTH ways instead of primarily in one direction: PCX allows massive amounts of data ("16X") to be transmitted in both directions.
This opens the door for many new things, but this generation of cards is far from capable of taking advantage of even a fraction of that bandwidth, and it hasn't been long enough for software developers to code for the added bandwidth and functionality.
By the time they do (and it will likely never affect gaming to any discernible degree), these cards will be like a TNT today.
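For perspective, the peak-bandwidth arithmetic behind the "both ways" point can be sketched with the commonly published bus figures (these numbers come from the AGP 3.0 and PCI Express 1.x specifications, not from this thread):

```python
# AGP 8x: 32-bit bus, 66.66 MHz base clock, 8 transfers per clock,
# with the full rate effectively available in one direction at a time.
agp8x_mb_s = (32 // 8) * 66.66 * 8        # ~2133 MB/s

# PCIe x16 (gen 1): 2.5 GT/s per lane with 8b/10b encoding
# -> 250 MB/s per lane, per direction, simultaneously.
pcie_x16_mb_s = 16 * 250                  # 4000 MB/s each way

print(round(agp8x_mb_s), pcie_x16_mb_s)
```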
 
GeneralGrievous

Apr 14, 2004
1,599
0
0
And the major advantage of PCX, from what I know, is that it allows a high amount of data to be transmitted BOTH ways instead of primarily in one direction: PCX allows massive amounts of data ("16X") to be transmitted in both directions.
This opens the door for many new things, but this generation of cards is far from capable of taking advantage of even a fraction of that bandwidth, and it hasn't been long enough for software developers to code for the added bandwidth and functionality.
That's also what I have heard. AGP is essentially a one-way bus, while PCI-E is two-way.

It does not increase the raw time of data transmission. At least Nvidia's chip doesn't.
It works within the inherent latencies of the GPU and its components and times its signals accordingly, so that latency is no greater with the HSI chip than without it.
Do you have a source and/or benches of Nvidia's PCI-E solution anywhere?
 

Edge3D

Banned
Apr 26, 2004
274
0
0
Originally posted by: GeneralGrievous
And the major advantage of PCX, from what I know, is that it allows a high amount of data to be transmitted BOTH ways instead of primarily in one direction: PCX allows massive amounts of data ("16X") to be transmitted in both directions.
This opens the door for many new things, but this generation of cards is far from capable of taking advantage of even a fraction of that bandwidth, and it hasn't been long enough for software developers to code for the added bandwidth and functionality.
That's also what I have heard. AGP is essentially a one-way bus, while PCI-E is two-way.

It does not increase the raw time of data transmission. At least Nvidia's chip doesn't.
It works within the inherent latencies of the GPU and its components and times its signals accordingly, so that latency is no greater with the HSI chip than without it.
Do you have a source and/or benches of Nvidia's PCI-E solution anywhere?

There is no possible way to validate it other than to have an Nvidia card produced in native PCX and one in native AGP with the HSI chip.
It will be a while until both have been released, with the HSI-chip version coming first.

Simply put, I'm confident enough from the multiple sources I have read that the HSI chip will do and be exactly what I described that I am buying one with the bridge.
It's fairly easy to deduce through logic that ATI couldn't have known what they were talking about (it was FUD and marketing). If ANYONE is to be believed, it must be Nvidia, as they are the only ones with their own chips!

Contrary to most of the scientific deadheads who talk down religion: technology and science are based on faith just as much as religion is.
Sometimes the theories on paper turn out to be true, and sometimes not.

This example isn't predicting the future of the universe (and its past); it's a relatively simple semiconductor. Less faith is needed here. I don't think real-world benchmarks are needed.

I can assure you that everything I've stated is true. I do all the reading I do to look out for MYSELF(!), not to promote some stupid corporation, as I see many do on these boards.
As for my source, I have far too many sources over the years to validate this information. I'd have to take you back to college with me and start from the basics, work through some electrical engineering, then study NV's bridge implementation.
But you could find out all the necessary information through some careful googling.
Of course, some are never satisfied until things are "proven" to their heart's content, and some refuse to accept any degree of proof due to ulterior motives; in this case that would be fanboyism.

Nvidia is actually very happy with their bridge... it's exactly what ATI wishes they had already!
 

DerwenArtos12

Diamond Member
Apr 7, 2003
4,278
0
0
I just have to thank you all for your posts, this is exactly what I was hoping for in this thread.

EDIT: We're over 100 votes, and so far the GT is in the lead.
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
I am sort of waiting to see what the X800 AIW will be like. I am headed toward ATI as they have a better track record with Pinnacle NLEs. It kind of helps when they partner together for branded capture cards, I would guess.
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Well, overclocking the GT may not be something to look forward to. I've read so far that the chip should be able to do 500 MHz, but you will be lucky to do 450 MHz. For this reason, Nvidia is not releasing an Extreme Edition of the card. They will ship those select few chips that make it to 450 to select manufacturers for packages such as Gainward's.

Right now, the question still lies with the overclockability of the GT. I don't think there can be a definite decision right now, but if I had to make one, I would vote for the X800 Pro.

As new as Nvidia's architecture is, it is still based on the older NV30 core. They just fixed all the wrong things, so they are very familiar with the core, just as ATI is familiar with theirs. Both cards, however, do have new features, so there may be new ways to get performance out of driver optimizations geared toward them. Plus, seeing that PS2.0 is no longer as much of a worry with PS3.0 on the horizon, performance can be tweaked some more.

So my answer, which was first a vote for the GT, must change to the X800.

So 53-69

I would take the X800 Pro; when you load the cards @ MAX the X800 Pro beats the 6800U hands down. And it's not only that: ATI has better D3D support, which is good for me since I'm mostly into sim/war games like Op Flashpoint and America's Army.
Take a look at how the 6800U performs when fully loaded:
That is a terrible review. And what you said is terrible. Do you know how badly 8xAA kills performance? The performance drop is so insane it's not even worth thinking about turning it on, however much the IQ improves. They should have benched at 4x/16x, and then we could see who the winner is. Also, he should have compared the X800 XT against the 6800 Ultra. Obviously an ATI fanboy at work.

ATI is so much better at D3D support that it doesn't support SM3.0, which is sort of a biggie.
 