n7
Elite Member
- Jan 4, 2004
Originally posted by: ShawnD1
Deep down you know he's right. Games started out CPU-only, then shifted to the GPU. Folding@Home was all CPU, and now the majority of work units are processed on GPUs. OpenCL is in the works, and Microsoft is building its own GPU compute API into DirectX (DirectCompute). Video encoding can now be done on the GPU.
The CPU will probably never die, but it's less and less important every day. My two-year-old E6600 will run Fallout 3 just as well as your i7 will, and that's because so many tasks can be shifted to the GPU. If all floating-point operations could be done by my graphics card, the CPU would almost never need upgrading. I could just keep using the same CPU and upgrade only the graphics card. Gaming already works like this, and Intel would be in serious trouble if most other tasks shifted in that direction.
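To make the quoted claim concrete, here's roughly what "shifting floating-point work to the GPU" means in practice. This is a minimal OpenCL C sketch with made-up names, not code from Folding@Home or any shipping encoder: the GPU executes one copy of the kernel per array element, so the FLOP-heavy loop runs on the graphics card while the CPU just sets up buffers and enqueues the kernel.

    // Purely illustrative OpenCL C kernel (hypothetical names, not from any real app):
    // one work-item handles one element of a float array, so the GPU runs
    // thousands of these in parallel while the CPU only queues the work.
    __kernel void scale_and_add(__global const float *a,
                                __global const float *b,
                                __global float *out,
                                const float scale)
    {
        size_t i = get_global_id(0);   /* this work-item's index */
        out[i] = scale * a[i] + b[i];  /* the actual floating-point work */
    }

The host side (clCreateBuffer, clSetKernelArg, clEnqueueNDRangeKernel and so on) is mostly boilerplate; the point is that the inner floating-point loop no longer runs on the CPU at all.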
Deep down I'd like to know what you're smoking.
I'm sorry, but no one cares about graphics cards.
People on AT are a tiny fraction of the market.
The general user gets a whole PC, a whole package.
That whole package includes an Intel platform (CPU/mobo with onboard GPU) or an AMD platform, occasionally an Intel or AMD CPU paired with an nVidia mobo/onboard GPU.
The majority of people are quite content to watch their movies, surf Facebook & YouTube, type their Word documents & download their iTunes music with their onboard video.
Now if by some miracle software development suddenly all switches to being optimized for the GPU, sure, the GPU might become more important.
But as far as I can see, it's more likely we'll see the GPU integrated into the CPU than vice versa, since outside of gaming and a few other very specialized markets, a dedicated high-end GPU just isn't needed.
Intel & AMD already have complete platforms...they have no need for nVidia, which is exactly why nVidia is so panicked about their [lack of] future & why they keep spewing nonsense about how necessary they are to computing.