GeForce4 MX underperforms GeForce 3 - Benchmarks

BFG20K

Member
Jan 8, 2002
48
0
0
Got this from a Mac enthusiast site: Link

At 1600x1200, the GeForce 3 is almost 50% faster than the GeForce4 MX. Wow, that DirectX 7 GeForce4 MX is a real POS. I suppose it's not surprising when you take a GeForce 2 card and all you do is slap on some faster RAM and up the core speed.

Shame on you, NVIDIA.

Looks like the card with the best bang for the buck from NVIDIA is the GeForce 3 Ti200.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
I doubt those numbers are very relevant to the PC versions.

The GF3 numbers are extremely low as well; a GF3 in a decently fast PC will score over 100 FPS at 1600x1200x32, and they got 70.
Either the Mac OS drivers suck, or them G4s REALLY suck at Quake 3.

I'll hold my comments until there are some benches on an x86 box.
 

AGodspeed

Diamond Member
Jul 26, 2001
3,353
0
0
You know, it's actually not too bad at all. This card looks like it's going to be priced somewhere below $100, and at that price range it's perfect for what it's meant for. Really now, this card looks to be as fast as the GF3 Ti200 but with a bunch of extra features (like dual-monitor support, iDCT, etc.).

I think this card will do very well in its intended market. Now on to the benches for the full GeForce4.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I can't confirm this, but in the past Apple has always downclocked their video cards compared to the PC parts (this is the case with both NVIDIA and ATI cards). Did the article mention what speed the GeForce4 MX was running at? Remember, there are several GeForce4 MX flavors that will be released. Oh well. Next week we should know for sure.
 

Killrose

Diamond Member
Oct 26, 1999
6,230
8
81
Anything with an MX association in its name is going to be low tier. But put an Ultra somewhere in the mix and then you really have something.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76


<< BFG20K??? lol

lets see what 10K has to say about this... :roll:
>>



Bwahaha, I can't wait.
 

BFG20K

Member
Jan 8, 2002
48
0
0
I personally don't give a rat's a$$ what bfg10k has to say about my screen name and the fact of the matter is that this post is about the geforce 4mx. If you care to know what bfg10k thinks about bfg20k, then start a thread in the offtopics forum.

Anyway, I think it's lame on NVIDIA's part to release a new product with a higher model number in its title without introducing an increase in performance. A lot of people out there are going to choose the GeForce4 MX over the GeForce 3 Ti200 because they'll think the GeForce4 MX is better.

I doubt that any of the three versions of the GeForce4 MX will perform anywhere close to the GeForce 3 Ti200, and this can be seen in the benchmarks provided. At the same time, at least for the first couple of months after it's introduced, its price will not be much less than that of a GeForce 3 Ti200. NVIDIA should label their products with a 'buyer beware' sticker.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76


<< I personally don't give a rat's a$$ what bfg10k has to say about my screen name and the fact of the matter is that this post is about the geforce 4mx. If you care to know what bfg10k thinks about bfg20k, then start a thread in the offtopics forum.

Anyway, I think it's lame on NVIDIA's part to release a new product with a higher model number in its title without introducing an increase in performance. A lot of people out there are going to choose the GeForce4 MX over the GeForce 3 Ti200 because they'll think the GeForce4 MX is better.

I doubt that any of the three versions of the GeForce4 MX will perform anywhere close to the GeForce 3 Ti200, and this can be seen in the benchmarks provided. At the same time, at least for the first couple of months after it's introduced, its price will not be much less than that of a GeForce 3 Ti200. NVIDIA should label their products with a 'buyer beware' sticker.
>>



I think when customers see the price differences, they will realize which is the best card. The Ti200 cards are coming out now with 128MB of memory, so it seems like it ain't going anywhere.
 

Vernor

Senior member
Sep 9, 2001
875
0
0
There's something smelly here.

Selling a Geforce '4' that actually has a crippled nfiniteFX engine is not a smart thing for Nvidia to do.

The **** will hit the fan when Unreal 2 and Doom 3 come out.
 

Burnsy

Member
Dec 30, 2001
90
0
0
Well DUH!! Look at the codenames for them:

GeForce 3 = NV20
GeForce 4MX = NV17

In case you didn't read the article on the NV17, let me summarise.

The NV17 is basically a crippled NV20: it has only partial pixel shader and vertex shader capabilities, although it does have improved DVD decoding capabilities (iDCT).

It was supposed to be nVidia's new mobile GPU, but nVidia in their infinite wisdom (sarcasm detector going off the scale) decided to release it as a desktop part and give it the moniker of GeForce 4. As Vernor said before me, this is not a wise move by nVidia.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,001
126
These results serve to confirm something I've suspected all along: the GF4 MXs will likely have 64-bit DDR memory. If you think about it, it makes perfect sense because of how close the raw memory clock speeds are to the Ti cards (in fact the fastest MX has the same memory speed as the slowest Ti), and also because of that little outlying card with 166 MHz SDR.

When you look closely enough, that card isn't such an outlier after all, because 550 goes to 275, 400 goes to 200, and the 166 card sits nicely in third place. If the MXs had 128-bit DDR they would cannibalise the Titanium sales unless nVidia substantially raised their prices. And since the definition of an MX is a budget card, I don't think that's going to happen.

It's all speculation on my part but it's extremely likely that I'm right.
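For what it's worth, the arithmetic behind that guess is easy to sketch in Python. The bus widths are speculation and the clocks are the rumoured figures, so treat the configs as assumptions rather than confirmed specs:

```python
# Peak memory bandwidth in GB/s:
# bytes per transfer (bus bits / 8) x clock in MHz x 2 if DDR, / 1000
def bandwidth_gbps(bus_bits, clock_mhz, ddr=False):
    return bus_bits / 8 * clock_mhz * (2 if ddr else 1) / 1000

# If the MXs really are 64-bit DDR, the quoted 550 and 400 MHz effective
# rates come from 275 and 200 MHz clocks, and the rumoured 166 MHz
# 128-bit SDR card lands neatly in third place:
print(bandwidth_gbps(64, 275, ddr=True))   # 4.4 GB/s
print(bandwidth_gbps(64, 200, ddr=True))   # 3.2 GB/s
print(bandwidth_gbps(128, 166))            # 2.656 GB/s
```

The same ordering as the benchmark graph, which is why the SDR card doesn't look like an outlier under this reading.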

lets see what 10K has to say about this

Absolutely nothing. I have no time for sad little individuals who lack the self-esteem to even come up with their own names and instead have to use someone else's.

Anyway, I think it's lame on NVIDIA's part to release a new product with a higher model number in its title without introducing an increase in performance.

Ever heard of the GF4 Titanium series? Because they most certainly will raise the performance bar. But nVidia also needs a budget line because not everyone can afford the Tis and that's exactly where the MXs come in. If you can't handle the concept of budget cards go and buy the Titaniums. I really don't see why you're whining at all.

Or did you think that the GF4 MXs would smash even the Ti500 and cost less at the same time? In a dream world perhaps. <rolleyes>
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76
Well, the GF2 MX cards are all slower than the GF DDR at decently high resolutions in 32-bit color, and I don't see anyone complaining about that.

In fact the GF2 MX got quite the warm welcome when it first showed up.
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0


<< At 1600x1200, the GeForce 3 is almost 50% faster than the GeForce4 MX. Wow, that DirectX 7 GeForce4 MX is a real POS. >>



Hello, it's a budget card, not a GF3 killa, and my math shows it has 70% of the performance of the GF3 at that resolution, and the gap narrows as the res goes down. IMHO, not too damn bad; hardly a POS.
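The two framings agree, by the way; here's a one-liner check in Python, taking the ~50% figure from the first post:

```python
# "The GF3 is ~50% faster than the MX" is the same claim as
# "the MX delivers roughly two thirds of the GF3's frame rate".
gf3_speedup = 0.50                   # GF3's lead over the GF4 MX
mx_fraction = 1 / (1 + gf3_speedup)  # MX performance relative to the GF3
print(f"{mx_fraction:.0%}")          # 67%, i.e. about the 70% quoted above
```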
 

EMAN

Banned
Jan 28, 2000
1,359
0
0
<< These results serve to confirm something I've suspected all along: the GF4 MXs will likely have 64-bit DDR memory. If you think about it, it makes perfect sense because of how close the raw memory clock speeds are to the Ti cards (in fact the fastest MX has the same memory speed as the slowest Ti), and also because of that little outlying card with 166 MHz SDR. >>

Think again. If you look at the graph, it surely isn't 64-bit memory. If it had that, it wouldn't even get that close to a GeForce 3. They never mentioned what model GeForce4 MX they were using. It could be the 440 model, which is way slower than a regular GeForce 3.

Remember, the GeForce4 MX has only 2 pipelines while the GeForce 3 has 4, which is where a lot of the GeForce 3's extra performance comes from.
 

Alex

Diamond Member
Oct 26, 1999
6,995
0
0


<< lets see what 10K has to say about this

Absolutely nothing. I have no time for sad little individuals who lack the self-esteem to even come up with their own names and instead have to use someone else's.
>>



 

Beatles

Banned
Nov 6, 2001
389
0
0


<< I personally don't give a rat's a$$ what bfg10k has to say about my screen name and the fact of the matter is that this post is about the geforce 4mx. If you care to know what bfg10k thinks about bfg20k, then start a thread in the offtopics forum.

>>


Hehehehehehehe
 

DrDavid

Member
Jul 6, 2001
59
0
0


<< Well, the GF2 MX cards are all slower than the GF DDR at decently high resolutions in 32-bit color, and I don't see anyone complaining about that.

In fact the GF2 MX got quite the warm welcome when it first showed up.
>>



well, since they call it GF4, most people will assume it is better than the GF3 series, but nVidia can't fool us (on this board) coz we know how to read specs...
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76


<< well, since they call it GF4, most people will assume it is better than the GF3 >>


And the same would be true for the GF2 MX vs the GF (supposedly GF1 then), no?
And still it was well received, cause it provided good price/performance.
 

Deeko

Lifer
Jun 16, 2000
30,213
12
81


<< I personally don't give a rat's a$$ what bfg10k has to say about my screen name and the fact of the matter is that this post is about the geforce 4mx. If you care to know what bfg10k thinks about bfg20k, then start a thread in the offtopics forum. >>


lol...a bit testy are we?
 

Rand

Lifer
Oct 11, 1999
11,071
1
81


<<

<< well, since they call it GF4, most people will assume it is better than the GF3 >>


And the same would be true for the GF2 MX vs the GF(supposedly GF1 then), no?
And still it was well received, cause it provided good price/performance.
>>



Indeed, many people did and still do mistakenly assume the GF2 MX offers superior performance to the GF DDR.
It's unfortunate, and debatably slightly misleading, but the average consumer isn't going to see anything beyond GF2 vs GF1 or GF4 vs GF3, and will simply assume the newer card has to be better.

Nonetheless, the MX was initially extremely popular in the enthusiast market because it offered, at the time, unheard-of performance at a budget price.



<<
Think again. If you look at the graph it surely isn't 64bit memory. If it had that it wouldn't even get that close to a geforce3. They never mentioned what model geforce mx4 they were using. It could be the 440model which is way slower than regular geforce 3.
>>



I tend to agree; if it were a 64-bit DDR memory bus, it seems unlikely in the extreme that it would come so close to the GF3. A 64-bit DDR memory bus would be less efficient than the current MX's 128-bit SDR bus, though nVidia's so-called crossbar memory architecture should somewhat offset that.

I think the lower performance can mostly be attributed to the MX having only 2 pixel pipelines, with two texturing units per pipe, so even at identical clock speeds the MX will only have half the brute force of the GF3/4. Bandwidth limitations will narrow the differences, but even so, it's safe to say that at identical clock speeds the GF3/4 should hold a very decent performance edge over the GF4 MX.
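That pipeline argument can be made concrete with a quick sketch. The pipeline and TMU counts are as stated above; the 250 MHz clock is purely an illustrative, assumed figure:

```python
# Theoretical fill rate: pixel rate = pipelines x core clock;
# texel rate also multiplies in the texture units per pipeline.
def fill_rates(pipes, tmus_per_pipe, clock_mhz):
    pixel_mps = pipes * clock_mhz                   # Mpixels/s
    texel_mps = pipes * tmus_per_pipe * clock_mhz   # Mtexels/s
    return pixel_mps, texel_mps

# At an assumed identical 250 MHz core clock:
print(fill_rates(4, 2, 250))  # GF3 (4x2):    (1000, 2000)
print(fill_rates(2, 2, 250))  # GF4 MX (2x2): (500, 1000) -- half the brute force
```

Real-world frame rates won't show the full 2x gap because memory bandwidth limits both cards first, which is exactly the narrowing described above.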
 

BFG20K

Member
Jan 8, 2002
48
0
0
First off, I'd better address the lame remarks of bfg10k.


<< I have no time for sad little individuals who lack the self-esteem to even come up with their own names and instead have to use someone else's. >>



Wow, you must have really taken my adoption of the classic Doom weapon personally, bfg10k. You aren't the only one who happened to enjoy the classic. I also fail to see the correlation between self-esteem and screen names.

The fact that you are so insulted by this just goes to show how little significance your life has. Your AnandTech forums persona has become so important to you (which, btw, is the only significant relationship you have with other people) that my adoption of that specific name is seen as a personal attack.

Therefore I find it really ironic that you should call me a 'sad little individual', for that is what you are yourself.

Anyway... back to the GeForce4 MX debate.

The fact is that the GeForce4 MX is a POS, and it is nowhere near as innovative as the GeForce2 MX was.

First off, the GeForce2 MX was based on the same tech as the GeForce2 GTS line. The GeForce4 MX, on the other hand, is a derivative of the GeForce 2 line. It doesn't offer support for DirectX 8, and that's bad for gamers because it slows down developers' adoption of new features, since the GeForce4 MX is going to be a mainstream card.

Second of all, the performance gap between the GeForce2 MX and the original GeForce DDR was nowhere near as great as the gap between the GeForce4 MX and the GeForce 3 Ti200. Since the GeForce4 MX is going to cost in the low $100s, the GeForce 3 Ti200 is clearly the better choice.

Thirdly, the GeForce4 MX simply isn't innovative... it is marketed as supposedly new technology when in reality it is nothing but an old card with some improvements. It isn't bringing DirectX 8 support to the masses... which is something we would all benefit from as gamers.

Anyway... this is my opinion... bring it on, mofos.
 

gregor7777

Platinum Member
Nov 16, 2001
2,758
0
71
I agree it's an odd step for nVidia. I liked what they did with the GF2s: the standard came out first (GTS), then the variants (Ultra, MX). The lines were drawn, and with the exception of the MX and GF DDR being so close in performance, the lines were pretty clear.

Now it's all over the place. I like it because I have a zillion choices, and it seems that due to lack of outside competition, nVidia created some for itself.

Example: my GF3 Ti450 128MB just dropped $10 @ newegg overnight.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
I smell a flame war.

Also, I don't really see where the GF4 MX fits in.

The GF4 MX 420 has SDR RAM. That already makes it slower than a GF3 Ti200. The Ti200 is priced at $125-150 (depending on where you get it).

The GF4 MX 420 (IMO) could possibly be slower than a GF2 64MB DDR (Ti, GTS, Pro, Ultra), unless the GF4 has some badass chipset design, and those GF2s go for under $100. Where would this card fit in? It would have to be lower than 100 bucks. I really don't see why nVidia even bothered making a GF4 MX. If you want cheap performance, get a GF2.

Confuses me.
 

Vernor

Senior member
Sep 9, 2001
875
0
0
We're not talking about raw power or speed, but about missing features.


If their engineers couldn't fit everything into a low-cost solution, they should have waited, as they did with the original MX.
 