BenSkywalker
DUDE? There are no holiday numbers!!!
My apologies; I made you look bad by editing my post while you were posting more inaccuracies.
I paid that post every bit of attention it deserved. You didn't even glance at the links I provided.
Over the Christmas holiday quarter nV outsold ATi 2:1. Enthusiasts who hold their breath for benchmarks are a minuscule portion of the overall market.
nV is currently in FY 2011. I have no idea where you get your financial news from, but I provided the links in that thread; read them.
Meh, I'll repost and quote them; I don't want misinformation being spread.
http://phx.corporate-ir.net/phoenix.zhtml?c=116466&p=irol-newsArticle&ID=1392142&highlight=
At the very least you're being grossly misleading by claiming holiday numbers! Regardless, no holiday sales were reported in this financial statement!
People aren't really looking at it this thoroughly on that front. GTX 480 512 SPs, GTX 470 448 SPs, GTX 260 384 SPs, GTX 250 320 SPs. I'm not saying that is how they will go, but using nV's setup it is possible that they yield only 3% of their chips as 480s and still hit close to a 95% usable yield rate. This is one of the issues I have with believing a lot of the talk about how they are going to stop production. No, it isn't ideal to use your largest die for four separate chips, but it is certainly within their ability to do so if they decided to. I'm not saying they will go that route, not even saying they are absolutely going to launch, but when I look at all of the 'nightmare' scenarios I am reading about, it seems like it makes a lot more sense to sell off half-good chips for ~$200 than to throw them away. Think 5830: somehow nV can't do the same thing, when by design they make their chips with that in mind?
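To put rough numbers on that, here is a minimal sketch of the salvage-binning arithmetic. Everything in it is assumed for illustration: a hypothetical 16-cluster/512-SP die, the speculated 512/448/384/320 SP bins from above, and a made-up per-cluster defect rate; none of these figures come from nV.

[code]
import random

# Hypothetical GF100-style die: 16 shader clusters x 32 SPs = 512 SPs total.
# Salvage bins follow the speculated lineup above (512/448/384/320 SPs).
CLUSTERS = 16
BINS = {512: 16, 448: 14, 384: 12, 320: 10}  # SPs -> working clusters required

def bin_die(working_clusters):
    """Return the best salvage bin a die qualifies for, or None (scrap)."""
    for sps, needed in sorted(BINS.items(), reverse=True):
        if working_clusters >= needed:
            return sps
    return None

def simulate(p_cluster_dead=0.15, trials=100_000):
    """Monte Carlo over dice; each cluster fails independently with the given probability."""
    counts = {sps: 0 for sps in BINS}
    scrap = 0
    for _ in range(trials):
        working = sum(random.random() > p_cluster_dead for _ in range(CLUSTERS))
        result = bin_die(working)
        if result is None:
            scrap += 1
        else:
            counts[result] += 1
    for sps in sorted(counts, reverse=True):
        print(f"{sps} SP bin: {counts[sps] / trials:6.1%}")
    print(f"scrap:      {scrap / trials:6.1%}")

simulate()
[/code]

At a 15% per-cluster failure rate only about 7% of dice come out fully enabled, yet well over 90% land in some sellable bin, which is the whole argument for harvesting rather than scrapping.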
I am referring to nvidia making a chip specific for the HPC market, and then whittling out (instead of just disabling) some unnecessary HPC functions like double precision floating point that most average Joe consumers won't use.
Honestly, I think you are a bit in over your head in this discussion. When I stated explicitly what numbers were reported, I meant exactly what I stated:
[code]
                      Quarterly Highlights               Fiscal Year Highlights
($ in millions except
per share data)       Q4 FY2010   Q3 FY2010   Q4 FY2009     FY2010      FY2009
(date of product)    (July-Sept) (April-Jun) (July-Sept) (Oct-Sept)  (Oct-Sept)
Revenue                 $982.5      $903.2      $481.1      $3,326      $3,425
GAAP:
  Gross margin           44.7%       43.4%       29.4%       35.4%       34.3%
  Net income (loss)     $131.1      $107.6     ($147.7)     ($68.0)     ($30.0)
  Income (loss)
    per share            $0.23       $0.19      ($0.27)     ($0.12)     ($0.05)
Non-GAAP: (1)
  Gross margin           44.7%       40.7%       28.1%       38.6%       39.9%
  Net income (loss)     $131.1       $77.4     ($145.3)     $141.4      $160.3
  Income (loss)
    per share            $0.23       $0.13      ($0.27)      $0.26       $0.29
[/code]
They report their sales approximately a quarter after those sales actually take place.
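To make the table easier to argue from, here is the growth arithmetic those revenue figures imply (a quick sketch; the inputs are taken straight from the release quoted above):

[code]
# Revenue figures ($M) from the NVIDIA release quoted above.
q4_fy2010, q3_fy2010, q4_fy2009 = 982.5, 903.2, 481.1

print(f"Year over year (Q4 FY10 vs Q4 FY09): {q4_fy2010 / q4_fy2009 - 1:+.1%}")  # ~+104%
print(f"Sequential     (Q4 FY10 vs Q3 FY10): {q4_fy2010 / q3_fy2010 - 1:+.1%}")  # ~+8.8%
[/code]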
You think a GTX 480 will beat the HD 5970?
If the lead is only 5% ATI might be able to beat that with a memory swap (7 Gbps GDDR5) and a small GPU core speed increase.
However, I think Fermi's major advantage could be minimum FPS (with full DX11 turned on) rather than average FPS.
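The bandwidth side of that memory-swap idea is easy to sanity-check. A quick sketch, assuming the 5870's shipping 256-bit/4.8 Gbps configuration and the hypothetical 7 Gbps parts mentioned above:

[code]
def bandwidth_gb_s(bus_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin rate / 8."""
    return bus_bits * data_rate_gbps / 8

stock  = bandwidth_gb_s(256, 4.8)  # HD 5870 as shipped: 153.6 GB/s
faster = bandwidth_gb_s(256, 7.0)  # hypothetical 7 Gbps swap: 224.0 GB/s
print(f"{stock:.1f} GB/s -> {faster:.1f} GB/s ({faster / stock - 1:+.0%})")
[/code]

That would be roughly a 46% bandwidth bump from the memory swap alone, before any core-clock increase.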
That is true, but think about this:
Why does Nvidia need to beat ATI in games? How will being faster than HD5970 help anything? Maybe Nvidia is doing the right thing by focusing on HPC instead of gaming?
The following excerpt from Tom's Hardware really puts the situation into perspective.
http://www.tomshardware.com/reviews/future-3d-graphics,2560-3.html
I find the 5% performance advantage very hard to believe.
- A lot of games simply run better on NV's architecture, such as Far Cry 2, World in Conflict, and Borderlands. Just doubling GTX 280 would net NV a lead over the 5870 in these games that is a lot more than 5%. But NV also increased the efficiency of the shader units/architecture and is packing a significant advantage over the 5870 in memory bandwidth.
No one predicted the 9700 Pro to be the card it ended up being. NV could have simply doubled everything on GTX 280, as ATI did over the 4870, and been ahead of the 5870, but they didn't take this approach. This leads me to believe they have chosen a design that is superior to simply "doubling" the previous generation. I am guessing a 20-30% performance increase over the 5870 on average for the 480, with the 470 about 5-10% faster.
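For context on the bandwidth claim above, here is a rough paper comparison. The GTX 280 and HD 5870 lines are shipping specs; the Fermi line is pure assumption (the widely rumored 384-bit bus, with an illustrative 4.0 Gbps GDDR5 rate):

[code]
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
cards = {
    "GTX 280 (512-bit GDDR3 @ ~2.2 Gbps)": 512 * 2.2 / 8,  # ~140.8 GB/s
    "HD 5870 (256-bit GDDR5 @ 4.8 Gbps)":  256 * 4.8 / 8,  # 153.6 GB/s
    "Fermi?  (384-bit GDDR5 @ 4.0 Gbps)":  384 * 4.0 / 8,  # 192.0 GB/s, assumed rate
}
base = cards["HD 5870 (256-bit GDDR5 @ 4.8 Gbps)"]
for name, bw in cards.items():
    print(f"{name}: {bw:6.1f} GB/s ({bw / base - 1:+.0%} vs 5870)")
[/code]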
And there are games that run better on ATI hardware, so what?
Meh, Tom's is just brown-nosing again... Eyefinity? nV has had that for years. If the ATi GPU was so damn flash, they could have done more with compute... it still doesn't have as many features as nV...
Like having a powerful motor with tiny wheels...
Yes, but DX11 games will focus on geometry and tessellation as well. In this area NV will be superior to ATI, based on the Fermi architecture, for future games. They have also shifted their focus towards complex shaders, increasing stream processors from 240 to 512. NV was generally superior in texture-heavy games, while ATI has historically leaned towards performing well in shader-heavy games. This may not be true anymore.
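A back-of-envelope on why geometry throughput suddenly matters with tessellation: uniformly subdividing a triangle patch's edges into t segments yields roughly t^2 sub-triangles, so triangle counts explode quickly. A toy sketch (the mesh size and factors are made up for illustration; D3D11 caps the factor at 64):

[code]
# Uniformly subdividing each edge of a triangle into t segments
# produces t^2 sub-triangles per patch.
def tessellated_triangles(patches, factor):
    return patches * factor ** 2

base_patches = 100_000  # illustrative mesh size
for t in (1, 4, 16, 64):
    print(f"tess factor {t:2d}: {tessellated_triangles(base_patches, t):>13,} triangles")
[/code]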
I just think NV's tech is more forward-looking, but it may be 12 months or more before the advantages of this architecture are actually visible over the 5870 (we would need DX11 games designed from the ground up).
Hopefully Nvidia sponsors some tessellation-heavy games, because I am guessing a lot of the console ports will favor ATI cards (re: ATI has the hardware contract for the Xbox 3).
Speaking of consoles, I have noticed most of the new TVs being sold today are 1080p. That raises the question of how much extra GPU power we will really need for our PCs. (It looks like there won't even be a difference in resolution pretty soon; if anything I would actually expect the console to need more GPU power than a PC, since the pixels on large 1080p TVs would be larger.)
You can get A LOT more detail at 1080p than current games deliver. There are issues of storage, but just because TVs are 1080p doesn't mean we don't need more graphics power.
..., if anything I would actually expect the console to need more GPU power than a PC, since the pixels on large 1080p TVs would be larger)
What?
Why would a TV display (1920x1080) that is "larger" need more GPU power than a computer monitor that is also 1920x1080?
The pixels on the larger TV display are larger; therefore they would benefit from more anti-aliasing in order to look smooth to the human eye.
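The "larger pixels" point is easy to quantify. A quick sketch comparing the pixel pitch of two hypothetical 1080p panels (the sizes are picked purely for illustration):

[code]
import math

def pixel_pitch_mm(diag_inches, w_px=1920, h_px=1080):
    """Pixel pitch in mm for a 16:9 panel of the given diagonal."""
    ppi = math.hypot(w_px, h_px) / diag_inches  # pixels per inch
    return 25.4 / ppi                           # mm per pixel

for name, diag in [("24-inch monitor", 24), ("46-inch TV", 46)]:
    print(f"{name}: {pixel_pitch_mm(diag):.2f} mm per pixel")
[/code]

A 46-inch 1080p TV has pixels nearly twice as large in each dimension as a 24-inch monitor, though whether that actually demands more anti-aliasing also depends on viewing distance, since what the eye sees is angular pixel size.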