(To the moderator) I fail to see how my post does not belong here. Had you read it in its entirety, you would have realized that I was not asking for support on anything. There is no trouble to shoot! Quite the contrary: I am attempting to find someone extremely knowledgeable in graphics and rendering who might go into some detail on the phenomenon I describe, in hopes of sparking a highly technical discussion on the subject. Browsing the topic listings led me here, as the type of person I am looking for would be most likely to frequent the highly technical forum. If you deem it appropriate to lock my thread again, at least grant me a specific reason and/or a suggestion as to where it belongs. I would appreciate it.
So, let's try this again...
Hello all. I'm looking for input on a question that I find myself unable to answer. If anyone has any pertinent information to contribute, I would appreciate it greatly!
I have a friend with a decently equipped gaming rig: an Athlon 2100+, 1GB of DDR RAM, an ATI Radeon 8500, etc. In many of his games, the framerate increases when he raises the resolution. For example, raising the resolution in Tribes 2 from 800x600 to 1024x768 results in a net gain of 15-20 FPS. The same thing occurs in other games, such as Jedi Knight 2. Furthermore, enabling extra graphical features such as trilinear filtering, hi-res textures, and volumetric fog tends to result in a framerate increase as well. When we bump these settings back down, sure enough, the framerate goes back down too. We popped in a GeForce 3 Ti500 and the same thing happened. Not that this is a bad thing, but it does raise an interesting question (to me, at least): how can this be? My friend came to me since I know a fair amount about computer hardware, but I couldn't help him. Can you help me answer this one? Thanks, guys!
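For what it's worth, we're reading these numbers off each game's built-in FPS counter, which I assume averages frames over a fixed window rather than reporting instantaneous frame times, so the gains don't look like momentary noise. Here's a minimal sketch of the kind of timing loop I'd expect such a counter to use (render_frame is a hypothetical placeholder, not code from any of these games):

    /* Sketch of a frame counter: count frames, report the average
       rate once per second. Purely illustrative. */
    #include <stdio.h>
    #include <time.h>

    int main(void) {
        clock_t last = clock();
        int frames = 0;
        int reports = 0;

        while (reports < 5) {          /* stop after 5 reports so the sketch terminates */
            /* render_frame();           placeholder for the game's draw call */
            frames++;

            clock_t now = clock();
            double elapsed = (double)(now - last) / CLOCKS_PER_SEC;
            if (elapsed >= 1.0) {      /* report once per second */
                printf("FPS: %.1f\n", frames / elapsed);
                frames = 0;
                last = now;
                reports++;
            }
        }
        return 0;
    }

If the counters work roughly like that, a sustained 15-20 FPS difference over a full second of frames is a real change, not a fluke of a single frame time.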
To answer stebesplace's question(s), I do not think any of the games offer software rendering as an option. I could be wrong, but isn't software rendering practically obsolete in modern 3D shooters and action games? This phenomenon can be reproduced in virtually any recent game.
Again, I'm just looking for technical information/explanations. There's no problem to troubleshoot here. Thanks again.