Originally posted by: n7
Originally posted by: Idontcare
"At the heart of this issue is that the CPU has run its course and the soul of the PC is shifting quickly to the GPU. This is clearly an attempt to stifle innovation to protect a decaying CPU business."
It must be fun being delusional :laugh:
Deep down you know he's right. Games started out entirely on the CPU; then rendering shifted to the GPU. Folding@Home was all CPU, and now the majority of work units are processed by GPUs. OpenCL is in the works, and Microsoft is building the same kind of GPU-compute support into DirectX. Video encoding can now be done on the GPU too.
The CPU will probably never die, but it matters less every day. My two-year-old E6600 runs Fallout 3 just as well as your i7 does, because so much of the work has been shifted to the GPU. If all the heavy floating-point work could be handled by my graphics card, the CPU would almost never need upgrading: I could keep the same CPU indefinitely and just upgrade the graphics card. Gaming already works like this, and Intel would be in serious trouble if most other workloads shifted in the same direction.