Intel GMA 3000 Graphics

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
Looks like Intel's 'good enough' graphics just got a lot...umm...gooder?

Intel GMA X3000 and GMA 3000 Integrated Graphics (.pdf)

Feature Highlights:

- Full DX9.0c compliance (X3000 only)
- Up to 667MHz core! (X3000 only)
- Vertex Shader 3.0 (hardware)
- Pixel Shader 3.0 (partial hardware)
- T&L engine (hardware)
- MPEG-2 acceleration with iDCT + MC (hardware)
- WMV9/HD and DXVA acceleration (hardware)
- Programmable pipelines with full 32-bit precision
- DVMT 4.0 up to 256MB shared
- Improved GMCH for faster memory access and lower latencies

Unlike its predecessors, there will be two different graphics cores: the X3000 will be the higher-end part with more features and higher clocks, whereas the 3000 will be an economy part with lower power consumption but also lower performance.

X3000 reportedly will be available on Intel's G965 Express, which is already shipping to OEMs. Intel has drivers available for download.
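
For anyone wondering how a game actually detects these shader capabilities, here is a minimal sketch (my own illustration, not Intel sample code) of querying the shader model a driver exposes through the Direct3D 9 device caps:

// Hypothetical illustration: check whether the installed driver exposes
// Shader Model 3.0 via the Direct3D 9 caps. Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        // VertexShaderVersion/PixelShaderVersion are packed version values;
        // compare against the D3DVS_VERSION/D3DPS_VERSION macros.
        printf("VS 3.0: %s\n", caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) ? "yes" : "no");
        printf("PS 3.0: %s\n", caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0) ? "yes" : "no");
    }
    d3d->Release();
    return 0;
}

Note that whatever the silicon supports, early drivers may report lower versions here, which is why the driver situation discussed below matters as much as the hardware.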
 
theprodigalrebel

Oct 4, 2004
10,515
6
81
Is this supposed to negate the need for a Radeon X1300/Geforce 7300? Or does the old law of "even the cheapest discrete video card is better than IGP" still apply?
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: theprodigalrebel
Is this supposed to negate the need for a Radeon X1300/Geforce 7300? Or does the old law of "even the cheapest discrete video card is better than IGP" still apply?

we'll only know with benchies...
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Intel Exec: You know, it's been about ten years. We might want to get on board with that "Hardware TnL" thing everyone's been talking about.



Important tidbit missing from the OP, though. Instead of:
Programmable pipelines with full 32-bit precision
How about
Eight Programmable pipelines with full 32-bit precision
?

- M4H
 

fbrdphreak

Lifer
Apr 17, 2004
17,555
1
0
Originally posted by: MercenaryForHire
Intel Exec: You know, it's been about ten years. We might want to get on board with that "Hardware TnL" thing everyone's been talking about.



Important tidbit missing from the OP, though. Instead of:
Programmable pipelines with full 32-bit precision
How about
Eight Programmable pipelines with full 32-bit precision
?

- M4H
Yep, rumor is 8 pipes. That will be awesome if it works well with low power; perfect for notebook IGPs.
 

soydios

Platinum Member
Mar 12, 2006
2,708
0
0
The number of pipelines is rather key information. Benchies will tell all, whenever they arrive. One thing I'm curious about: can it accelerate HD-DVD/Blu-ray playback at full frame rates?
 

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
Originally posted by: theprodigalrebel
Is this supposed to negate the need for a Radeon X1300/Geforce 7300? Or does the old law of "even the cheapest discrete video card is better than IGP" still apply?
Well...it at least will negate the need for upgrading to a discrete video card for lots of people whose desired games will run adequately on GMA 3000 but would not have on GMA 900/950 (or dare I say it...Extreme II).

There are lots of games that require more rendering power and features (e.g. hardware T&L) than GMA 900/950 but would run fine on something with the feature set and performance of a Radeon 9550/9600 or FX5500. I see this all the time, where a person can't run Sims 2, Zoo Tycoon 2, or any number of other fairly low-end 3D games on their integrated Intel or VIA graphics, but upgrading to a lowly Radeon 9550/9600 or FX5500 AGP does the trick.

"IF" GMA 3000 can at least bring the performance of these cards, it should be a significant improvement that 'could' translate into reduced sales at the low-end graphic card segment for ATI and NVIDIA.

As mentioned, before we know anything for sure, we need to see some competitive benchmarks.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: tcsenter
Is this supposed to negate the need for a Radeon X1300/Geforce 7300? Or does the old law of "even the cheapest discrete video card is better than IGP" still apply?
Well...it at least will negate the need for upgrading to a discrete video card for lots of people whose desired games will run adequately on GMA 3000 but would not have on GMA 900/950 (or dare I say it...Extreme II).

There are lots of games that require more rendering power and features (e.g. hardware T&L) than GMA 900/950 but would run fine on something with the feature set and performance of a Radeon 9550/9600 or FX5500. I see this all the time, where a person can't run Sims 2, Zoo Tycoon 2, or any number of other fairly low-end 3D games on their integrated Intel or VIA graphics, but upgrading to a lowly Radeon 9550/9600 or FX5500 AGP does the trick.

"IF" GMA 3000 can at least bring the performance of these cards, it should be a significant improvement that 'could' translate into reduced sales at the low-end graphic card segment for ATI and NVIDIA.

As mentioned, before we know anything for sure, we need to see some competitive benchmarks.

I have no idea why people perceive Sims 2 as a relatively easy game hardware-wise... it puts the fubar on my 6600GT @ 585/1145... I guess you can run it almost bearably on an MX440 at 1024, if you don't mind it looking so jaggy it pokes out your eyes (my gf seems to have fun with it...).

I'm sure this will easily exceed a 9600; at least it had better. You can already get an integrated X300 on ATI boards, can't you? And that came out years ago...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: MercenaryForHire
Intel Exec: You know, it's been about ten years. We might want to get on board with that "Hardware TnL" thing everyone's been talking about.



Important tidbit missing from the OP, though. Instead of:
Programmable pipelines with full 32-bit precision
How about
Eight Programmable pipelines with full 32-bit precision
?

- M4H

One of the posted points though read as follows:
- Pixel Shader 3.0 (partial hardware)
So, IMO you'll be lucky if it performs as well as 4 real pipelines.

Kind of ironic how the pixels are CPU-assisted and the vertices full hardware - exactly the opposite of previous Intel integrated chips.
 

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
Originally posted by: dug777
I have no idea why people perceive Sims 2 as a relatively easy game hardware-wise... it puts the fubar on my 6600GT @ 585/1145... I guess you can run it almost bearably on an MX440 at 1024, if you don't mind it looking so jaggy it pokes out your eyes (my gf seems to have fun with it...).
Actually, Sims 2 will run with semi-playable frame rates on Intel Extreme II and GMA 900/950, but at low detail and resolution. When I said I had seen a number of people complain they 'can't run' Sims 2 on Intel integrated graphics, they usually meant 'can't run with playable frame rates at their desired detail and resolution'.
Is this all part of the 965G chipset that is having so many problems? (Headline: Intel G965 is pretty awful)
In all fairness, G965 is only just shipping to OEMs; I would expect immature graphics BIOS and drivers at this stage.

When 865G with Extreme II graphics was just released, Intel's game compatibility list for 865G looked like this:

15 titles listed (only 9 of which have no issues)

Now, it looks like this:

Many more titles listed (and many more supported)
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Seems that there are OpenGL problems, so no Doom3 or Quake4, lol...

Hmm, says it can play FEAR... that's a WOW in my head
 

ltcommanderdata

Junior Member
Oct 28, 2005
4
0
61
Originally posted by: Lord Banshee
Seems that there are OpenGL problems, so no Doom3 or Quake4, lol...

Hmm, says it can play FEAR... that's a WOW in my head
I wouldn't worry about it. The drivers they are using right now are 14.21. Full hardware PS3.0 support isn't added until 14.24, and hardware VS and T&L aren't added until 14.26.

http://www.hkepc.com/bbs/itnews.php?tid=638462&starttime=0&endtime=0

It's probably running so badly because it's in GMA950 mode. At 667MHz and with 8 unified shaders it should easily surpass ATI's X700-based IGP and nVidia's 7300-based IGP, both of which look to only have 4PS+2VS. Hopefully with proper drivers it'll give an X1300HM, if not a vanilla X1300, a run for its money. If it is released in August it'll also be the first DirectX 10-compatible graphics chip, beating the G80, and the first PC chip with unified shaders, beating the R600. If it isn't a flop, it could certainly give AMD reason to buy ATI.
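
Since the exposed feature set apparently tracks the driver build, here is a rough sketch (mine, not from HKEPC or Intel's docs) of how a program can read which driver build is installed, via Direct3D 9's adapter identifier:

// Hypothetical sketch: report the installed display-driver build so you can
// tell whether you are on 14.21, 14.24, 14.26, etc. Link against d3d9.lib.
#include <windows.h>
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
    {
        // DriverVersion packs four 16-bit fields: product.version.subversion.build
        printf("%s\ndriver %u.%u.%u.%u\n", id.Description,
               (unsigned)HIWORD(id.DriverVersion.HighPart), (unsigned)LOWORD(id.DriverVersion.HighPart),
               (unsigned)HIWORD(id.DriverVersion.LowPart),  (unsigned)LOWORD(id.DriverVersion.LowPart));
    }
    d3d->Release();
    return 0;
}

Pair that with a caps check like the one earlier in the thread and you can see exactly which features a given driver build actually exposes.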
 

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
I saw something when visiting Intel's P965 graphics support pages but didn't realize what it meant until reading the information from HKEPC:

"Although the chipset hardware itself supports T&L, future graphics drivers will be required to utilize this feature."
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
*bump*

- Pixel Shader 3.0 (partial hardware)

I don't see information ANYWHERE that suggests the PS is anything but hardware. Besides, some sites have said that once the DX10 spec is finalized, DX10 and PS/VS 4.0 will be supported.

The "partial" is nothing but an attempt to discredit it. I know it can't be better than an X300 or whatever in performance, but full compatibility is almost certain. Most don't realize how much software VS 3.0 and improved drivers made the GMA950 superior in support/performance to the GMA900. Of course, the performance is still crap, but they at least fixed the compatibility problem for the most part.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Originally posted by: Gstanfor
So, IMO you'll be lucky if it performs as well as 4 real pipelines.

Kind of ironic how the pixels are CPU-assisted and the vertices full hardware - exactly the opposite of previous Intel integrated chips.

Again I would repeat: the information from the first poster is wrong. NOTHING suggests it has only "partial" hardware support for PS.

Claiming that full programmability + unified shaders + future DX10/SM4 support = partial PS support really does not make sense, especially when even the GMA900/950 has full hardware PS support.

The performance of the part depends on the implementation details: how many vertex shaders there are, how well it handles the limited memory bandwidth, how the functional units interact with each other, and how powerful the T&L units are. Will GMA X3000's T&L even beat the first hardware T&L video card, the GeForce 256? Just because it's in hardware doesn't mean performance is automatically good. Plus, Intel's 3D drivers seem to suck compared to NVIDIA's and ATI's, so there's quite a lot of ground to make up, at least on the performance side.

(BTW, the hardware geometry and hardware VS are only for the X3000; the 3000 is functionally identical to the GMA950.)

I expect performance equal to the integrated Radeon Xpress 200, along with excellent support for later games.
 

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
Oh wow, out of all the specs listed, I got one half-wrong based on the numerous conflicting specs reported by various tech sites, all of which were using insider sources or rumors. Intel had not yet released or confirmed most of the specs at the time of my post.

Don't get your panties in such a bunch. In fact, you have not offered any verifiable support that I was wrong in the first place.
 

R3MF

Senior member
Oct 19, 2004
656
0
0
Sounds fantastic for ultra-portable laptops using Merom CPUs.

Something with an 11.1" screen and a 1366x768 resolution will be very nice early next year using the above components.
 

gxsaurav

Member
Nov 30, 2003
170
0
0
Guys, you're forgetting what the GMA X3000 is made for. It's not supposed to be a graphics card replacement; it's meant for the low-cost PCs, office PCs and HTPCs out there.

Last I heard, with a simple driver update it will support full DirectX 10 in hardware. One thing we should keep in mind is that the GMA X3000 is made with Windows Vista in mind; it's practically tailor-made for Vista. Even with a 4x1 architecture, it is enough to run Windows Vista with full hardware acceleration and quality. I tried Vista Beta 2 on an FX 5200, a 5900XT and a 6600GT, and there was no difference in UI performance.

Seriously, no one who specifically wants to play games on their computer will stick with this solution; they will get a PCIe card anyway. But for office PCs, low-cost HTPCs with HDMI and HDCP, or low-cost school and embedded computers running Vista, this IGP is enough.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
Originally posted by: ltcommanderdata
I wouldn't worry about it. The drivers they are using right now are 14.21. Full hardware PS3.0 support isn't added until 14.24, and hardware VS and T&L aren't added until 14.26.

Actually, if you take a look at earlier HKEPC articles suggesting the same thing, you'll notice Intel's document with the changes. Intel's document reports hardware PS 3.0 post-14.24, but HKEPC says 14.24. Just a minor point of errata.

Originally posted by: tcsenter
Oh wow, out of all the specs listed, I got one half-wrong based on the numerous conflicting specs reported by various tech sites, all of which were using insider sources or rumors. Intel had not yet released or confirmed most of the specs at the time of my post.

Don't get your panties in such a bunch. In fact, you have not offered any verifiable support that I was wrong in the first place.

If you read the whitepaper (http://download.intel.com/design/chipsets/applnots/31334302.pdf), you'll realize they did confirm hardware PS 3.0 support. Actually there is more than that:

http://www.beyond3d.com/forum/showthread.php?t=32643

Intel released open-source drivers for G965 a month or so ago, and people read the drivers to find out the hardware details.

-All programmable shading is handled in unified execution units, codenamed "GEN4 EU". Fixed function subsystems "call" those units.
-All programmable ALUs are scalar to maximize usage efficiency. That means they work on a single component at a time, not vectors.
-Triangle setup and related operations are also done in the EUs. In traditional architectures, a special-purpose unit would exist for it.
-Fog and alpha testing are implemented as parts of the pixel shader, which is expected of all DX10 architectures.
-Math functions (EXP, LOG, SIN, etc.) are implemented in a 16-way "Mathbox" external unit with both full and partial precision.
-Taylor expansions are sometimes faster than the Mathbox, because they don't require values to be in such specific bounds.
-The geometry shader is already used to implement some OpenGL functionality, including wireframe rendering.
-The pixel shader works on blocks of 16 pixels. It is unknown whether that is also the case for vertices and primitives.


According to the info dug up by Beyond3D members, it exceeds the DX10 requirement of "unified" shaders, as a lot more than just the shaders is unified. The video functions use the same units as the shaders.

The "partial" support for hardware PS 3.0 might have came from the fact that all units are unified, rather than special purpose. Of course people know that doesn't make it partial, just... different. You don't seem to be reacting much less aggressively than I do.

 

tcsenter

Lifer
Sep 7, 2001
18,815
484
126
Originally posted by: IntelUser2000
Actually there is more than that: Intel released open-source drivers for G965 a month or so ago, and people read the drivers to find out the hardware details.
The discussion is dated Aug. 11th, which cites a release date of Aug. 8th for the drivers, a full six days after my post.

My time machine was in the shop being repaired, else I could have known in advance details released or confirmed not less than one week after my post.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: ltcommanderdata
Originally posted by: Lord Banshee
Seems that there are OpenGL problems, so no Doom3 or Quake4, lol...

Hmm, says it can play FEAR... that's a WOW in my head
I wouldn't worry about it. The drivers they are using right now are 14.21. Full hardware PS3.0 support isn't added until 14.24, and hardware VS and T&L aren't added until 14.26.

http://www.hkepc.com/bbs/itnews.php?tid=638462&starttime=0&endtime=0

It's probably running so badly because it's in GMA950 mode. At 667MHz and with 8 unified shaders it should easily surpass ATI's X700-based IGP and nVidia's 7300-based IGP, both of which look to only have 4PS+2VS. Hopefully with proper drivers it'll give an X1300HM, if not a vanilla X1300, a run for its money. If it is released in August it'll also be the first DirectX 10-compatible graphics chip, beating the G80, and the first PC chip with unified shaders, beating the R600. If it isn't a flop, it could certainly give AMD reason to buy ATI.


Wrong, the X700 is an 8-pixel-pipeline part with 6 vertex shaders; I really doubt that the new Intel Graphics Media Decelerator can outperform it.
 