x800 xl

johnnq

Member
May 22, 2005
39
0
0
I'm about to buy my first computer, and I'm going to get a Gigabyte X800 XL for about $320 off Newegg.
1. There are cheaper ones for about $270 with fans (the Gigabyte XL has a heatsink). Are the features different on these cards? Do the stock fans make more than 30dB in just Microsoft Word?
2. Should I just get a super cheap video card for like $10 and wait 3-4 months for the next generation of cards?
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Depends on whether you can live without some of the cool games out, and coming out, for two to three months.

The X800XL is a great card and it should last you a good long time. I love mine; it works great and fast.

Even if you waited 2-3 months for the new cards coming out, you're going to have to pay a huge premium for them. Personally, the only reason to wait, imo, is to get a higher-end card cheaper, not the latest and greatest, as those are very rarely a decent value considering their premium costs.

So with all that said I would get the X800XL now and be happy with it running everything on the market fine. Or wait and save up cash for a price drop on an even better card than the X800XL.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: baddog121390
For $320 I would get a 6800GT instead. A bit more performance + SM3.0.

Where can you get a PCI-Express 6800GT for $320? The cheapest I've seen them is $350.
 

CMC79

Senior member
May 31, 2003
313
0
71
I bought a Connect3D X800XL for $249 with free shipping from Monarch. It's a quiet card--quieter than the eVGA 5900XT that it replaced--and I've been very happy with it. Unless you are really into silent running, you could save $50 or $60 and buy a game or two instead. I was wary of investing in "old technology" (because of the whole SM2 vs. SM3 debate), but $250 for a 16-pipe card was too good to pass up, and I couldn't justify the money for a PCI-E 6800GT (especially when the AGP models are so much cheaper). Regardless of silent or not, the X800XL is probably your best bang for the buck in PCI-E.

 

dfloyd

Senior member
Nov 7, 2000
978
0
0
For that price it's hard to beat the X800XL.

And in some cases it does surpass the 6800GT. The cards are really close in most comparisons though.

Imo the 6800GT is not worth the extra cost. If it were within $20 it might be, but not when it's $70 more.
 

BillyBobJoel71

Platinum Member
Mar 24, 2005
2,610
0
71
The GT is low cost now; mine was $400, but that was at a Circuit City. 6800GT = X800XL. The GT was meant to be the X800 Pro's rival, but it beats it.
 

johnnq

Member
May 22, 2005
39
0
0
Heh, the XL beats the Pro most of the time. I thought I knew what I was looking at... what are SM2 and SM3?
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: SuperTyphoon
The GT is low cost now; mine was $400, but that was at a Circuit City. 6800GT = X800XL. The GT was meant to be the X800 Pro's rival, but it beats it.

Since the 6800GT is 16-pipe and the X800 Pro is 12-pipe, it's not a fair comparison; the GT would spank the Pro in most cases, except maybe HL2. The XL, though, is 16-pipe, so it pretty much equals the GT in performance.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
It's a fair comparison, because the 12-pipe X800P is clocked much higher than the 16-pipe 6800GT. The problem is that the X800P has slower memory, and at the resolutions and settings these cards are expected to run, the GT's extra RAM speed is important.
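
To put rough numbers on the pipes-versus-clock tradeoff (the core clocks below are approximate launch specs, so treat them as ballpark figures rather than anything quoted in this thread):

```python
# Theoretical pixel fill rate = pixel pipelines x core clock (MHz).
# Clock speeds here are approximate launch specs, assumed for illustration.
cards = {
    "6800GT":   (16, 350),
    "X800 Pro": (12, 475),
    "X800 XL":  (16, 400),
}

for name, (pipes, core_mhz) in cards.items():
    fill_rate = pipes * core_mhz  # megapixels per second
    print(f"{name}: {fill_rate} Mpixels/s")
```

On numbers like these, the 12-pipe X800 Pro's higher clock nearly makes up for its missing pipes, which is why memory speed ends up deciding so many of the matchups.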

SM2 and SM3 are the Shader Models of each card, meaning what Direct3D feature set they support in hardware. Neither card is literally SM2 or SM3, but rather SM2+ and SM3+. The point is that the GF6 series has a more advanced featureset. The main visible difference can be seen in the HDR modes of Splinter Cell: Chaos Theory and Far Cry, both of which require FP blending (SM3+), which is only supported by the GF6 cards at this time.
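
A toy example of why FP blending matters for those HDR modes (plain illustrative Python, not actual GPU code): with an integer framebuffer, blended colors clamp at 1.0, so overbright highlight detail is thrown away; a floating-point framebuffer keeps it around for a later tone-mapping pass.

```python
def blend_int8(dst, src):
    # Integer framebuffer: an additive blend clamps to the displayable range.
    return min(dst + src, 1.0)

def blend_fp16(dst, src):
    # Floating-point framebuffer: overbright values survive the blend
    # and can be tone-mapped afterward.
    return dst + src

dst, src = 0.8, 0.7
print(blend_int8(dst, src))  # clamps to 1.0; the highlight detail is lost
print(blend_fp16(dst, src))  # 1.5, preserved for tone mapping
```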

You could wait, but I don't think there'll be a next-gen card faster than the X800XL (or 6800GT) available for $300 in 3-4 months. I may be very wrong, but I think you'll only see $400+ cards initially, just like with the GF6 and X800 launches. That said, something faster is always right around the corner. At some point, you're just going to have to pony up and enjoy.
 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
In regards to the wide price range for X800XLs, you need to decide if things like dual DVI and VIVO matter to you. If not, get one of the least expensive ones.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Speaking of Splinter Cell Chaos Theory, I have not seen it on a GF6 card, but on my X800XL it is literally the best-looking game I have ever seen. It seriously rivals things I am seeing on the Xbox 360 and the PS3 from E3. The shadows, the sound, the texture quality, the lighting: just amazing.

What is this special feature you get with the GF6 and any chance you got a side by side screenshot? I ask because it really is hard to imagine this game looking much better.

Edit:

And not sure this matters to you, but if you like DirectX 9 videos, then Nvidia 6800 cards appear not to have the acceleration that they claimed they did. In fact they appear to have none at all. I tried running a video on my 6800 NU and it was bad, I mean like less than ten fps bad.

Also, I am reading a bit more on SM 3.0. I can't find any actual info on what it does now; it sounds like it can help in the future, but the articles I am seeing are claiming two opposite things. Some say SM 3.0 won't make that much of a difference, and others claim it's a huge difference. Hopefully someone can post some side-by-side screenshots and clue us in to the advantages.
 

akshayt

Banned
Feb 13, 2004
2,227
0
0
If you can wait some time, then get an AGP Ti 4200 128MB now and buy a next-gen mainstream/high-end card later.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dfloyd
Speaking of Splinter Cell Choas Theory, I have not seen it on a GF6 card but on my X800XL it is literally the best looking game I have ever seen. It seriously rivals things I am seeing on the Xbox 360 and the PS3 from E3. The shadows, the sound, the texture quality, the lighting, just amazing.
How was the banding due to running at SM1.1?

What is this special feature you get with the GF6 and any chance you got a side by side screenshot? I ask because it really is hard to imagine this game looking much better.
Almost any SC review has this.

And not sure this matters to you but if you like DirectX 9 videos then Nvidia 6800 cards appear to not have the acceleration that they claimed they did. In fact they appear to have none at all. I tried running a video on my 6800 NU and it was bad, I mean like less than ten fps bad.
You must have a Pentium 2? My five-year-old's A64 3000+/6800NU plays SIL very well?

Also I am reading a bit more on SM 3.0. I cant find any actual info on what it does now, it sounds like for the future it can help but it does appear that the articles I am seeing are claiming two oppisite things. One that SM 3.0 wont be that much of a difference and others claiming its a huge difference, hopefully someone can post some side by side screenshots and clue us in to the advantages.
The advantages of SM3 have only been in offering performance increases at this point; there are no SM2 vs SM3 screenshots.

 

dfloyd

Senior member
Nov 7, 2000
978
0
0
Rollo,

There was no banding at all, not even in the slightest. The game looks absolutely amazing. It truly is about the best-looking game I think I have ever played.

No, I have an Athlon 64 3200+ and I got less than ten fps on my 6800 NU. It's because HD acceleration is broken in all 6800 cards. Are you sure you're running true HD DX9 videos? Maybe it's the SLI that's making it run that way, but a single 6800 NU ran it like a slide show. And it's not a memory issue, as I was running one GB of Corsair XMS Xtreme CL2 RAM at the time when I had the 6800 NU. So no, I seriously doubt it's my CPU.

Ahh, on the SC:CT thing, I thought some were claiming that the FP blending would increase image quality, not just speed it up. Honestly, speeding it up is not an issue; I have everything on max at 1280x1024 with 4X FSAA and it looks stunning on my X800XL. And it runs without any slowdown ever.

Edit: Also, the video thing is a very well known issue. Just do a search on Anandtech forums and you will find quite a few threads dealing with it. Some are even talking class action lawsuit, as Nvidia claimed it had hardware acceleration for this; guess what, it has none. So technically your card should have little to do with the speed of it, at least from all the benchmarks and reviews I have seen. They test the video, and the majority of people were at or near one hundred percent CPU utilization while playing it, with or without the so-called acceleration enabled. I was as well. It was a false claim by Nvidia, and it is well known that the 6800 series does not run these videos well. The 6600GT does run them well, as it is enabled properly in those cards.
 

johnnq

Member
May 22, 2005
39
0
0
I know what dual DVI is, but what's VIVO? Does that mean I can plug a PlayStation into my computer, or is that only with a TV tuner?
 

BillyBobJoel71

Platinum Member
Mar 24, 2005
2,610
0
71
If you only want to play games and run programs, don't worry about VIVO. I'm pretty sure it has something to do with TV tuners, but I don't know exactly.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
V: Video
I: In
V: Video
O: Out

So that means the card has video in and video out, so yes, you could hook up your PlayStation to the computer. A TV tuner is not included in VIVO. You could still hook up a VCR, satellite, or cable box to a VIVO input, but you could not change channels and such from within the PC itself; you would have to do that from the source (the satellite box, the PS2, etc.).
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: dfloyd
Rollo,

There was no banding at all, not even in the slightest. The game looks absolutely amazing. It truly is about the best-looking game I think I have ever played.
http://www.firingsquad.com/hardware/splinter_cell_chaos_theory_1/page2.asp
Clearly we can see the greater intensity HDR lighting brings to Chaos Theory. In the second batch of shots you can see Sam's cheekbones more clearly as the green hue of light from Sam's IR goggles shines on his face. Images 15 and 16 show clear differences as well.
In the image above you can clearly see that in the 3.0 shader mode, the light is brighter than when running in 1.1 mode


No, I have an Athlon 64 3200+ and I got less than ten fps on my 6800 NU. It's because HD acceleration is broken in all 6800 cards.
It's not "broken"- it states on nVidia's web page that most 6800s do not support WMV9 acceleration- why do you think they do? BTW- PCIE 6800s supposedly will support WMV9 hardware decode, if MS ever releases a .dll to enable it. Also BTW, AFAIK ATI does not support this either and is waiting on the same .dll for their shader-based workaround.

Are you sure you're running true HD DX9 videos?
I've tested those videos on what probably amounts to more cards than you've owned in your life over the last eight months or so. (ten different cards if you're wondering)

Maybe its the SLI thats making it run that way,
SLI has nothing to do with this; however, my dual 6800GTs run SIL1080 smooth as glass. (Thanks to my 3800+, no doubt.)

but a single 6800 NU ran it like a slide show. And it's not a memory issue, as I was running one GB of Corsair XMS Xtreme CL2 RAM at the time when I had the 6800 NU. So no, I seriously doubt it's my CPU.
Must be your system configuration then. After reading this misleading information, I plodded up two flights of stairs to my guest bedroom, where my five-year-old's A64 3000+/1GB Corsair VS/reference model AGP 6800NU reside. After I uninstalled the damn RealPlayer my wife installed to watch interviews of people on Survivor :roll: and installed WMP10- bingo! Silky smooth SIL1080, except for two split-second hitches near the end.

Ahh, on the SC:CT thing, I thought some were claiming that the FP blending would increase image quality, not just speed it up. Honestly, speeding it up is not an issue; I have everything on max at 1280x1024 with 4X FSAA and it looks stunning on my X800XL. And it runs without any slowdown ever.
Errr, you're running it at SM1.1, which takes a lot less power?

Edit: Also, the video thing is a very well known issue. Just do a search on Anandtech forums and you will find quite a few threads dealing with it.
I don't need to search- I was a big part of those debates.

Some are even talking class action lawsuit as Nvidia claimed it had Hardware acceleration for this, guess what it has none.
Yep people were talking class action suit last December. Then reality kicked in and no one has said much about it since. :roll:

So technically your card should have little to do with the speed of it, at least from all the benchmarks and reviews I have seen. They test the video, and the majority of people were at or near one hundred percent CPU utilization while playing it.
My computers run it in the high 70s-low 80s.

With or without the so called acceleration enabled. I was as well. It was a false claim by Nvidia and it is well known that the 6800 series does not run these videos well. The 6600GT does run them well, as it is enabled properly in these cards.
Sorry- but this is wrong as well. While the 6600GTs have the working hardware WMV acceleration, they are waiting on MS like the PCIE 6800s. The only people who've seen this working are at review sites that nVidia gave updated beta versions of WMP10, AFAIK.

You shouldn't post so much misinformation about this stuff, people might believe you?

 

coomar

Banned
Apr 4, 2005
2,431
0
0
My 9600XT has VIVO. I still use a VGA box b/c the picture quality is better, though a next-gen card may be better with the VIVO.
 

dfloyd

Senior member
Nov 7, 2000
978
0
0
You posted a link in reply to me stating there was no banding. I read the link, and I do not see it saying anything about banding. It does claim that his cheekbones are slightly more defined and 3.0 is a little brighter, but nowhere do I see anything about banding, so may I ask what the link is supposed to show, at least in the context of what you quoted and replied to?

"It's not "broken"- it states on nVidias web page that most 6800s do not support WMV9 acceleration- why do you think they do? BTW- PCIE 6800s supposedly will support WMV9 hardware decode, if MS ever releases a .dll to enable it. Also BTW, AFAIK ATI does not support this either and is waiting on the same .dll for their shader based workaround."

This above quote I find most amusing of all. Obviously you have not read your box. My box (and the boxes of most everyone I have talked to who has read theirs) claims that my 6800 NU has WMV9 hardware acceleration. And I never said ATI did or did not; what I stated was that it is broken in Nvidia's cards. According to the box, the 6800 is supposed to have it; well, according to tests, benchmarks, and just about every single person who has tried it, it does not. So yes, it is broken, that or Nvidia outright lied and advertised false claims on their products.


"I've tested those videos on what probably amounts to more cards than you've owned in your life over the last eight months or so. (ten different cards if you're wondering)"

This quote I find highly amusing as well. You know what happens when you make assumptions? I do. I have been in the computer field since 1985 and have been in tech support for the past twelve years. I do believe I have worked with or owned quite a few video cards in that time (I would guesstimate in the hundreds). So your ten somehow seems very small by comparison and quite pointless to the argument. Are you trying to make yourself seem like some kind of expert? Owning and handling a video card does not make one an expert. Heck, I am far from even dreaming of being an expert on video cards. I am just stating what I see and don't see.

"SLI has nothing to do with this;however, my dual 6800GTs run SIL1080 smooth as glass. (thanks to my 3800+ no doubt)"

It may not have anything to do with it; I never said it did. You claimed the videos were running fine on your machine, so I said maybe it helps. And the statement that your videos are running smooth as glass thanks to your 3800+ must be true, as obviously, according to you, me, and Nvidia, the card is not doing anything to help out. (Well, actually their box claims it is, but according to Nvidia's website it is not.) So what are we discussing? The fact that my original statement was correct?

"Errr, you're running it at SM1.1, which takes a lot less power?"

Again, what is your point? If SM 3.0 takes so much more power, then what use is it? So I can brag about his jawline looking a little better and having brighter lighting? That's just silly, imo. Do you understand relativity?

"I don't need to search- I was a big part of those debates."

Glad you were; my point was that they are there, and my point had been proven. Again you give more evidence to back up my claim, so I ask you one more time: what are we discussing?

"My computers run it in the high 70s-low 80s."

That's very impressive. I wish I knew how, as a 1080i HD DX9 video on my 6800 NU never dreamed of coming close.

"Sorry- but this is wrong as well. While the 6600GTs have the working hardware WMV acceleration, they are waiting on MS like the PCIE 6800s. The only people who've seen this working are at review sites that nVidia gave updated beta versions of WMP10, AFAIK.

You shouldn't post so much misinformation about this stuff, people might believe you?"

Ok, how is my statement false? In your first sentence you agree with me by saying "While the 6600 GTs have the working hardware WMV acceleration....." I stop at that point because that is what I said. You agreed with me and told me I was wrong in the same sentence. What are you talking about? You're not making any sense. You go on to say the only people who have seen it working are reviewers. So how does that make my statement false? Reviewers are real people who have seen and used this feature. So again, how is my statement false? I never said every person on earth can use it now; I said the feature is working in the 6600 GT cards, and apparently, because reviewers can use it, it must be working. And even then I still don't think you are correct. I have seen people post who have 6600 GTs and claim that they had less CPU utilization with the HW acceleration turned on. So I am sorry, but I will believe the many people I have seen claim that.

Ok, all that said, this is just silly. After years of this, why do you continue to argue with anyone who seems to say anything that could be considered even slightly bad about your chosen product? It's just silly. I mean, think about it; it's almost like a religion to some people. Welcome to the church of product labels, where we bow down and follow blindly to justify our decisions. It really cracks me up overall. Personally, I don't care who wins. I own Nvidia cards, I own Nvidia motherboards, I own ATI cards, and many more. I am not arguing for or against one card by saying that Nvidia claimed to have features on their cards (at least according to the boxes the cards ship in, you know, the ones that sit on the shelves, the ones you read to purchase the card).

So reply if you wish, Rollo, but I don't think I will be up for you telling me I am wrong by agreeing with me anymore. I just don't see the point, so this will be my last post here.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Pete
The main visible difference can be seen in the HDR modes of Splinter Cell: Chaos Theory and Far Cry, both of which require FP blending (SM3+), which is only supported by the GF6 cards at this time.

Actually, Far Cry's HDR works fine with the SM2 path.
 