I agree with the other posters. You should take a close look at the driver quality settings. I play COD2 at 1680 x 1050 with all settings maxed on a 6800GT. I don't know what frame rate I'm getting, but it plays smooth as butter.
Originally posted by: Woofmeister
A few points. First, your card's stock core and memory settings are very low. Most of us with 7800GTs are running well over 500 MHz/1100 MHz. Try using the "Coolbits" registry hack to detect optimal settings for your core and memory. Bumping those values will yield big performance gains.
Second, both of the games you have focused on are notorious hardware hogs which cannot be maxed out with any current hardware configuration. Not with SLI, not with Crossfire. Both FEAR and COD2 are meant to challenge the next generation of cards. You are asking too much of your current 7800GT when you expect to run either game at max settings. Having said that, there's a difference between a perfectly satisfying graphics experience and being able to run at full maximum. I'm running FEAR at 1920 x 1200 with my single 7800GT and it looks absolutely spectacular. Use the in-game FEAR time demo to experiment with all the performance and video settings and you're sure to arrive at a good compromise between performance and quality. I'm only averaging 35 FPS in FEAR and the game is perfectly smooth. That's with vertical sync enabled and at 1920 x 1200, so you ought to be able to scale much higher if you work at it.
Third, who the hell is "Point of View"? I had to do a Google search on them and only found one video card review. Next time try EVGA, XFX, BFG, or ASUS; they're preferred NVIDIA vendors and it shows in their performance.
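A side note on the Coolbits suggestion above, for anyone who would rather script the registry tweak than edit it by hand: here is a minimal Python sketch. The key path, the value name, and the value 3 are assumptions based on old ForceWare-era write-ups and can differ by driver version, so treat it as a starting point, back up your registry first, and remember that overclocking can void your warranty.

import winreg  # Windows-only standard-library module

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location of the tweak key
COOLBITS = 3  # assumed DWORD value that unlocks the clock-adjustment page; varies by driver

def enable_coolbits():
    # Create the key if it is missing, then write the CoolBits DWORD.
    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
    try:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, COOLBITS)
    finally:
        winreg.CloseKey(key)

if __name__ == "__main__":
    enable_coolbits()  # run from an administrator prompt
    print("CoolBits set; reboot, then look for the clock frequency page in the NVIDIA control panel.")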
Originally posted by: Rollo
So you know for a fact that licensing revenues of game engines are not where developers "make money" and that they should think about the cards of the past and what most people have?
Originally posted by: T101
I could do that temporarily, but I cannot afford to equip both computers with an SLI setup.
Requiring an SLI setup would be like choosing to sell to only a small portion of the user base, which is a very bad business idea.
Originally posted by: DeathReborn
Originally posted by: Woofmeister
A few points. First, your card's stock core and memory settings are very low. Most of us with 7800GTs are running well over 500 MHz/1100 MHz. Try using the "Coolbits" registry hack to detect optimal settings for your core and memory. Bumping those values will yield big performance gains.
Second, both of the games you have focused on are notorious hardware hogs which cannot be maxed out with any current hardware configuration. Not with SLI, not with Crossfire. Both FEAR and COD2 are meant to challenge the next generation of cards. You are asking too much of your current 7800GT when you expect to run either game at max settings. Having said that, there's a difference between a perfectly satisfying graphics experience and being able to run at full maximum. I'm running FEAR at 1920 x 1200 with my single 7800GT and it looks absolutely spectacular. Use the in-game FEAR time demo to experiment with all the performance and video settings and you're sure to arrive at a good compromise between performance and quality. I'm only averaging 35 FPS in FEAR and the game is perfectly smooth. That's with vertical sync enabled and at 1920 x 1200, so you ought to be able to scale much higher if you work at it.
Third, who the hell is "Point of View"? I had to do a Google search on them and only found one video card review. Next time try EVGA, XFX, BFG, or ASUS; they're preferred NVIDIA vendors and it shows in their performance.
Point of View and PNY are hardly recognised names, but performance is very similar to the equivalent cards from ASUS, XFX, EVGA, etc.
1280x1024, Trilinear, 4xAA, medium bodies, smooth smoke, high texture settings. That was before I got the SLI setup; now it's 16x12.
He looks like he is running with a 2T command rate, which, combined with stock clocks, will hold him back a bit.
Originally posted by: v8envy
Originally posted by: Rollo
So you know for a fact that licensing revenues of game engines are not where developers "make money" and that they should think about the cards of the past and what most people have?
Got my 2 cents to throw in. I worked for a company that decided to optimize their product for the lower end of the current mainstream. The usual design/develop/test/fix/publish cycle ensued, with additional resources spent on optimizing and making do. And when the product finally *did* ship, it looked and worked like utter unbridled ass compared to other software of the day. By the time it showed up on retail shelves, lower-end mainstream machines had capabilities light years beyond what the product was optimized for.
The product was a pretty spectacular flop, and the company started its downward implosion cycle as a result, IMO.
Another thing to think about: people drool over screenshots. So games which look spectacular on the highest-end hardware available, yet are playable on mainstream hardware, are precisely what game devs are shooting for. It makes no sense to have your 'best' graphics be on mainstream hardware -- your graphics will be comparable to last-generation games, and your competitors will eat you alive.
Originally posted by: Hacp
So you're telling him to overclock his card, which will void his warranty? At least put a disclaimer like: this will void your warranty.
Originally posted by: Avalon
I hear this complaint a lot lately regarding FEAR and COD2: people buy $400 cards and get nowhere near the performance level that their $400 cards of two years ago delivered in the games of that era.
And I agree, I don't like this shift either. What we think of as high end is starting to look like mid-range, with the enthusiast sector becoming the new high end.
That's why I just bought a $50 6600. I bought something I knew would be ass, but dirt cheap, as opposed to being disappointed with a $400 card's performance and thinking it was ass as well.
Originally posted by: Acanthus
2T doesn't do jack for A64s; you won't notice the difference.
You're talking 2% or less.
Originally posted by: CaiNaM
Originally posted by: Acanthus
2T doesn't do jack for A64s; you won't notice the difference.
You're talking 2% or less.
Less. There was a review that compared 1T and 2T timings (I can't recall which one), and IIRC you gained 2% in a best-case scenario; more often than not the difference was inconsequential.
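To put that best-case 2% in perspective, here is a quick back-of-the-envelope check (the frame rates below are illustrative, not benchmark results):

# Rough check: a 2% best-case gain from running 1T is a fraction of a frame at typical rates.
for base_fps in (35, 60, 100):
    gain = base_fps * 0.02  # assumed 2% best-case improvement over 2T
    print(f"{base_fps} FPS -> about +{gain:.1f} FPS with 1T")

A gain of a frame or two per second is well inside normal run-to-run variance, which fits the conclusion that the difference is inconsequential.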
Originally posted by: Acanthus
Originally posted by: T101
I could do that temporarily, but I cannot afford to equip both computers with an SLI setup.
Requiring an SLI setup would be like choosing to sell to only a small portion of the user base, which is a very bad business idea.
1. The game runs fine on lower settings.
2. You're doing something wrong, because I'm running higher settings than you in FEAR.
Do you have soft shadows on? That's a massive performance hit for a very tiny IQ increase.
Originally posted by: Rollo
Originally posted by: T101
I hope these issues will be resolved with patches, because I can't believe that developers would be so unintelligent as to require hardware that does not yet exist for the game to be playable.
Perhaps the developers of these games wanted the best graphics experience possible, and knew those with SLI would be able to enjoy it?
You've got an SLI motherboard; why not put another 7800GT on it and see if you think the same?
Yes, developers want to make games for 0.01% of the computer market; that would be bright. You are seriously out of touch with reality. Maybe you should get your five-year-old (who most likely doesn't exist, since he's been five for two years) a quad dual-core Opteron with SLI 512MB GTXs to play Legos. Or maybe your NVIDIA inside source has the next gen for you already.
Originally posted by: Rollo
Originally posted by: edplayer
Originally posted by: Rollo
Why shouldn't developers be allowed to make games that only high end hardware can run?
The only thing preventing them from doing so is common sense. They are in business to make money, not tech demos.
So you know for a fact that licensing revenues of game engines are not where developers "make money" and that they should think about the cards of the past and what most people have?
I see.
Beyond that, FEAR seems to be selling well, even though there's not a single card on the market that can run it at 16x12 with 4xAA/8xAF? Apparently your theory doesn't apply if it's a good game people want to play?
Originally posted by: lopri
I play COD2 with everything set to max (including the insane corpse setting) at 1920x1200 with 4xAA/8xAF. NV control panel quality setting = High Quality. It never drops below 40 FPS.