Anand has Lynnfield Preview Up

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: SickBeast
Intel was recruiting graphics people en masse a year or so ago but I haven't heard anything since then.

I don't see why they can't produce the drivers with the right tools, for the most part. The hardware is essentially software to begin with.

I agree...but that would mean getting the software developers to use those tools. A good example of how tough that is would be the ill-fated Itanium.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Viditor
Originally posted by: Nemesis 1
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

True to form :Q Yeah, but Viditor, Larrabee will still do well with only NV to compete with, since AMD is about to get the boot from the CPU business :lips: But not to worry, Viditor. I understand Apple is snooping around the watering hole.

Does this happen before or after the giant aliens land?

Before. The watchers are already here; they never left. You ever look at your money?

Those funny symbols and pretty artwork. You're an intelligent man. Learn what that writing says, because that's what it is. You won't find proper spelling or grammar as we understand it, but if ya try hard I am sure you can figure it out. People in Europe understand this stuff.

 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: taltamir
Originally posted by: Spicedaddy
Originally posted by: taltamir
With 633 MHz turbo mode and a $100 or lower mobo, no, you are not better off getting an i7.

Turbo mode is for people who don't overclock, and good P55 mobos won't be $100 or lower when they come out, while X58 boards keep getting cheaper every month.

If you're on Core 2: wait and see.
If you're on older stuff and want to upgrade: not worth the 3-4 month wait IMO.

If you are on something older than Core 2, then wait a few months and buy a used C2Q system for peanuts.

Anyways, the X58 is just so WASTEFUL; the X58 does NOTHING except connect to the CPU, southbridge, and video cards... moving the video card interconnect to the CPU and having the CPU connect to the southbridge directly just makes a whole lot of sense.

The problem with disabling turbo mode and overclocking the chip is that you end up using a lot more power.

I kinda got to say I agree, with one exception: I get to change my mind after we see where Larrabee plays best. Cough cough, X58.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Nemesis, this is not the correct forum to address what you have just said to me. Let me just say, I would have made that exact same comment whether you had responded to my PM or not.

I don't want you to go away. I think you are an interesting member of the forum here.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Viditor
Originally posted by: Idontcare
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

I thought Intel had some 400 or 500 engineers/programmers dedicated to just writing the drivers for Larrabee? Or is that an internet urban legend that has no basis in reality?

Firstly, (as you know) it's not always quantity...quality in engineers is at least as important. The rumours I have heard are that the best engineers are still working for ATI and Nvidia.
Also, I believe that the quantity is a bit short as well...IIRC, Nvidia has over 1,800 software engineers on the books.

Of course this is all rumour and speculation...

Oh yeah, quality for sure, but I haven't met an Intel engineer that made me question whether quality was present. Management empowering the engineers versus the marketing dept (NetBurst vs. AMD64), now that is a whole other thing.

At any rate I thought they nabbed some high-profile guys?...not that high-profile actually equates to quality, I've also met plenty of high-quality "nobodies" in the org chart as well.

Regarding NV's engineering headcount...that includes the hardware design side too, doesn't it? I was just speaking about people hired to work full-time on developing drivers and software infrastructure to support the GP side of the GPGPU business. If NV has 1800 headcount for that then hot-damn, they are serious about "the best defense is a good offense"!

At any rate I have yet to get my head into the GPU game so I bring with me very few facts to this (or any other) GPU gun-fight. If my recollection of 400 software driver developers isn't ringing any bells with you then I'd take that as proof positive my active imagination is making up memories of past threads that likely never existed.

I'm happy to know that; time to retire that data point.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: SunnyD
Let the whining about no full 2x PCIe x16 slots begin.

Yes... really... how many people need TWO 4870X2 cards for a total of 4 GPUs? And even if you do... how much performance do they lose from being 2x8 PCIe v2?
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...


Says the person who said Larrabee isn't a GPU.
Not the most informed opinion, I would say.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Phynaz
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...


Says the person who said Larrabee isn't a GPU.
Not the most informed opinion, I would say.

Felix, Oscar, you guys just make me LOL :laugh:
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Originally posted by: Idontcare
Originally posted by: Viditor
Originally posted by: Idontcare
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...

I thought Intel had some 400 or 500 engineers/programmers dedicated to just writing the drivers for Larrabee? Or is that an internet urban legend that has no basis in reality?

Firstly, (as you know) it's not always quantity...quality in engineers is at least as important. The rumours I have heard are that the best engineers are still working for ATI and Nvidia.
Also, I believe that the quantity is a bit short as well...IIRC, Nvidia has over 1,800 software engineers on the books.

Of course this is all rumour and speculation...

Oh yeah, quality for sure, but I haven't met an Intel engineer that made me question whether quality was present. Management empowering the engineers versus the marketing dept (NetBurst vs. AMD64), now that is a whole other thing.

At any rate I thought they nabbed some high-profile guys?...not that high-profile actually equates to quality, I've also met plenty of high-quality "nobodies" in the org chart as well.

Regarding NV's engineering headcount...that includes the hardware design side too, doesn't it? I was just speaking about people hired to work full-time on developing drivers and software infrastructure to support the GP side of the GPGPU business. If NV has 1800 headcount for that then hot-damn, they are serious about "the best defense is a good offense"!

At any rate I have yet to get my head into the GPU game so I bring with me very few facts to this (or any other) GPU gun-fight. If my recollection of 400 software driver developers isn't ringing any bells with you then I'd take that as proof positive my active imagination is making up memories of past threads that likely never existed.

I'm happy to know that; time to retire that data point.

That's just the software guys...
Actually, they have twice as many software engineers as they do hardware engineers.
They inherited a lot of people when they took over SGI's graphics division.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: taltamir
Originally posted by: Spicedaddy
Originally posted by: taltamir
With 633 MHz turbo mode and a $100 or lower mobo, no, you are not better off getting an i7.

Turbo mode is for people who don't overclock, and good P55 mobos won't be $100 or lower when they come out, while X58 boards keep getting cheaper every month.

If you're on Core 2: wait and see.
If you're on older stuff and want to upgrade: not worth the 3-4 month wait IMO.

If you are on something older than Core 2, then wait a few months and buy a used C2Q system for peanuts.

Anyways, the X58 is just so WASTEFUL; the X58 does NOTHING except connect to the CPU, southbridge, and video cards... moving the video card interconnect to the CPU and having the CPU connect to the southbridge directly just makes a whole lot of sense.

The problem with disabling turbo mode and overclocking the chip is that you end up using a lot more power.

That's not how Intel works. Intel follows the Apple model: the old stuff is discontinued, and they just create new low-end stuff to fill the bottom price points. They almost never let stuff filter down.
Now AMD, on the other hand...
Really, at this point, Core 2 systems are probably about as cheap as they'll ever be (the quad cores at least); if you're planning an upgrade that isn't i7 or i5, it should probably be Phenom II.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: taltamir
Originally posted by: SunnyD
Let the whining about no full 2x PCIe x16 slots begin.

Yes... really... how many people need TWO 4870X2 cards for a total of 4 GPUs? And even if you do... how much performance do they lose from being 2x8 PCIe v2?

What about for the next 2-3 years if you keep the motherboard? Most people upgrade CPU/Mobo less often than video cards.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Any single video card gets the SAME bandwidth from an X58 and a P55... only when you use TWO slots do you lose bandwidth.
So in the next 2-3 years? Well, how many people are gonna buy a budget P55 today and then try to run two HD7870X2 cards on it several years down the road for quad GPU at... what... $1000 worth of video cards, with no CPU or RAM upgrade?
By then we would all be using PCIe v3 or v4.

How many people are buying 2 x 4870X2 cards now and trying to run them on a 2x8 PCIe v1 mobo with antique RAM and CPU, wishing they had spent a lot more money on a 2x16 PCIe v1 mobo? Most run them on a single x16 PCIe v2 slot.
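For reference, here's a minimal sketch of the per-direction bandwidth arithmetic behind these slot configurations, assuming the commonly cited figures of ~250 MB/s per lane for PCIe 1.x (2.5 GT/s with 8b/10b encoding) and ~500 MB/s per lane for PCIe 2.0:

```python
# Approximate one-way PCIe bandwidth per slot (assumed figures:
# 8b/10b encoding gives ~250 MB/s/lane for v1 and ~500 MB/s/lane for v2).
MB_PER_LANE = {1: 250, 2: 500}  # MB/s per lane, per direction

def slot_bandwidth_gb(gen: int, lanes: int) -> float:
    """Rough one-way bandwidth in GB/s for a PCIe slot."""
    return MB_PER_LANE[gen] * lanes / 1000.0

for gen, lanes in [(1, 8), (1, 16), (2, 8), (2, 16)]:
    print(f"PCIe v{gen} x{lanes}: ~{slot_bandwidth_gb(gen, lanes):.1f} GB/s each way")
# v1 x8: ~2.0, v1 x16: ~4.0, v2 x8: ~4.0, v2 x16: ~8.0
```

Note that a v2 x8 slot works out to the same ~4 GB/s as a v1 x16 slot, which is the comparison being made about dual-card setups.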
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Fox5
Originally posted by: taltamir
Originally posted by: Spicedaddy
Originally posted by: taltamir
With 633 MHz turbo mode and a $100 or lower mobo, no, you are not better off getting an i7.

Turbo mode is for people who don't overclock, and good P55 mobos won't be $100 or lower when they come out, while X58 boards keep getting cheaper every month.

If you're on Core 2: wait and see.
If you're on older stuff and want to upgrade: not worth the 3-4 month wait IMO.

If you are on something older than Core 2, then wait a few months and buy a used C2Q system for peanuts.

Anyways, the X58 is just so WASTEFUL; the X58 does NOTHING except connect to the CPU, southbridge, and video cards... moving the video card interconnect to the CPU and having the CPU connect to the southbridge directly just makes a whole lot of sense.

The problem with disabling turbo mode and overclocking the chip is that you end up using a lot more power.

That's not how Intel works. Intel follows the Apple model: the old stuff is discontinued, and they just create new low-end stuff to fill the bottom price points. They almost never let stuff filter down.
Now AMD, on the other hand...
Really, at this point, Core 2 systems are probably about as cheap as they'll ever be (the quad cores at least); if you're planning an upgrade that isn't i7 or i5, it should probably be Phenom II.

I meant buy used OR buy it on sale (stock clearing for newer, better parts). I know Intel doesn't reduce the cost of older parts, but that doesn't mean that the older parts (now of lower value) just evaporate overnight. Stores still have them in stock, people still own them... etc.
 

imported_Shaq

Senior member
Sep 24, 2004
731
0
0
Originally posted by: taltamir
Any single video card gets the SAME bandwidth from an X58 and a P55... only when you use TWO slots do you lose bandwidth.
So in the next 2-3 years? Well, how many people are gonna buy a budget P55 today and then try to run two HD7870X2 cards on it several years down the road for quad GPU at... what... $1000 worth of video cards, with no CPU or RAM upgrade?
By then we would all be using PCIe v3 or v4.

How many people are buying 2 x 4870X2 cards now and trying to run them on a 2x8 PCIe v1 mobo with antique RAM and CPU, wishing they had spent a lot more money on a 2x16 PCIe v1 mobo? Most run them on a single x16 PCIe v2 slot.

A 4870X2 seems like a lot now, but it won't be anything special by next fall, much less a year or two after that. And they are bottlenecked on a PCIe 1.0 slot, which was the norm last year when the card came out. It is even worse for Nvidia cards. I bet that a 3.0 slot will show some gains when the new cards are released this fall. Those 2 x8 slots were also considered pretty good a year ago. You'd probably be surprised what video cards people will buy and use in old computers, since they are a pretty hassle-free upgrade. And it won't have to be $1000 video cards to show a bottleneck, as those cards (4870X2) will be on eBay for $100 next year.

I usually upgrade CPUs every 2 years, and the X58 is very inexpensive for an "extreme edition" motherboard; I see the $40 extra as a very good deal. For XFire/SLI it is a no-brainer, as it can almost double performance in some cases. If you don't think it's worth it, that's fine.
 

Jabbernyx

Senior member
Feb 2, 2009
350
0
0
Originally posted by: taltamir
I meant buy used OR buy it on sale (stock clearing for newer, better parts). I know Intel doesn't reduce the cost of older parts, but that doesn't mean that the older parts (now of lower value) just evaporate overnight. Stores still have them in stock, people still own them... etc.
It just plain amazes me how much power you can get for <$150 nowadays:

Used 775 mobo that OCs - $40
New E5200 - $70
4GB RAM - $25

$135 for a ~3.5GHz system that's about equivalent to a ~3.8GHz older C2D. Wow...
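As a quick sanity check on that build, here's a sketch of the Core 2 overclocking arithmetic. The 12.5x multiplier and 200 MHz stock FSB are the E5200's published specs; the 280 MHz FSB is a hypothetical overclock chosen to land at ~3.5 GHz:

```python
# Core 2 core clock = FSB (MHz) x multiplier.
E5200_MULT = 12.5
STOCK_FSB_MHZ = 200   # 800 MT/s quad-pumped bus
OC_FSB_MHZ = 280      # hypothetical overclock on a decent 775 board

def core_clock_ghz(fsb_mhz: float, mult: float = E5200_MULT) -> float:
    return fsb_mhz * mult / 1000.0

print(core_clock_ghz(STOCK_FSB_MHZ))  # 2.5 GHz stock
print(core_clock_ghz(OC_FSB_MHZ))     # 3.5 GHz overclocked
```

And the parts total checks out: $40 + $70 + $25 = $135, as quoted.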
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: taltamir
Any single video card gets the SAME bandwidth from an X58 and a P55... only when you use TWO slots do you lose bandwidth.
So in the next 2-3 years? Well, how many people are gonna buy a budget P55 today and then try to run two HD7870X2 cards on it several years down the road for quad GPU at... what... $1000 worth of video cards, with no CPU or RAM upgrade?
By then we would all be using PCIe v3 or v4.

How many people are buying 2 x 4870X2 cards now and trying to run them on a 2x8 PCIe v1 mobo with antique RAM and CPU, wishing they had spent a lot more money on a 2x16 PCIe v1 mobo? Most run them on a single x16 PCIe v2 slot.

Currently a GTX280 needs a full x16 PCIe slot to perform properly. The performance dropoff is quite dramatic in a slower slot (approximately 20%).

IMO it's unacceptable to release an entire platform that bottlenecks ANY of the graphics cards currently on the market. Hopefully we are speculating incorrectly here, but I have a feeling that just may happen. At the very least, it will be a factor in the long run (which Intel never seems to care about when they design their platforms - they want you to upgrade every 2 years so you keep them in business).
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Originally posted by: Idontcare
Originally posted by: Phynaz
Originally posted by: Viditor
My own OPINION is that Larrabee will be a flop...I have heard that none of the major driver designers have yet been enticed over to the project, and without some massive and completely new drivers for Larrabee, it will be relegated to the low end. At least for the first few years...


Says the person who said Larrabee isn't a GPU.
Not the most informed opinion, I would say.

Felix, Oscar, you guys just make me LOL :laugh:

:laugh:
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Jabbernyx
Originally posted by: taltamir
I meant buy used OR buy it on sale (stock clearing for newer, better parts). I know Intel doesn't reduce the cost of older parts, but that doesn't mean that the older parts (now of lower value) just evaporate overnight. Stores still have them in stock, people still own them... etc.
It just plain amazes me how much power you can get for <$150 nowadays:

Used 775 mobo that OCs - $40
New E5200 - $70
4GB RAM - $25

$135 for a ~3.5GHz system that's about equivalent to a ~3.8GHz older C2D. Wow...

Yep... buying used FTW.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: SickBeast
Originally posted by: taltamir
Any single video card gets the SAME bandwidth from an X58 and a P55... only when you use TWO slots do you lose bandwidth.
So in the next 2-3 years? Well, how many people are gonna buy a budget P55 today and then try to run two HD7870X2 cards on it several years down the road for quad GPU at... what... $1000 worth of video cards, with no CPU or RAM upgrade?
By then we would all be using PCIe v3 or v4.

How many people are buying 2 x 4870X2 cards now and trying to run them on a 2x8 PCIe v1 mobo with antique RAM and CPU, wishing they had spent a lot more money on a 2x16 PCIe v1 mobo? Most run them on a single x16 PCIe v2 slot.

Currently a GTX280 needs a full x16 PCIe slot to perform properly. The performance dropoff is quite dramatic in a slower slot (approximately 20%).

IMO it's unacceptable to release an entire platform that bottlenecks ANY of the graphics cards currently on the market. Hopefully we are speculating incorrectly here, but I have a feeling that just may happen. At the very least, it will be a factor in the long run (which Intel never seems to care about when they design their platforms - they want you to upgrade every 2 years so you keep them in business).

But it ONLY HAPPENS IN SLI! Why do you keep saying it bottlenecks single cards? It doesn't! It offers the exact same bandwidth for a single card!

I agree that it WILL be an issue if you decide to run SLI or xFire a few years down the road... but I just don't find that a realistic scenario... what I see as a realistic scenario is just using one card, which gets the same bandwidth.
 

Scotteq

Diamond Member
Apr 10, 2008
5,276
5
0
Originally posted by: SickBeast
Originally posted by: taltamir
Any single video card gets the SAME bandwidth from an X58 and a P55... only when you use TWO slots do you lose bandwidth.
So in the next 2-3 years? Well, how many people are gonna buy a budget P55 today and then try to run two HD7870X2 cards on it several years down the road for quad GPU at... what... $1000 worth of video cards, with no CPU or RAM upgrade?
By then we would all be using PCIe v3 or v4.

How many people are buying 2 x 4870X2 cards now and trying to run them on a 2x8 PCIe v1 mobo with antique RAM and CPU, wishing they had spent a lot more money on a 2x16 PCIe v1 mobo? Most run them on a single x16 PCIe v2 slot.

Currently a GTX280 needs a full x16 PCIe slot to perform properly. The performance dropoff is quite dramatic in a slower slot (approximately 20%).

IMO it's unacceptable to release an entire platform that bottlenecks ANY of the graphics cards currently on the market. Hopefully we are speculating incorrectly here, but I have a feeling that just may happen. At the very least, it will be a factor in the long run (which Intel never seems to care about when they design their platforms - they want you to upgrade every 2 years so you keep them in business).



There's something I don't understand here, so maybe you guys can help me out...


We know that:

(1) Any single GPU on the market today runs perfectly well in a single x16 slot.

(2) Older chipsets have to split bandwidth when running multiple x16 slots.
(2a) This didn't use to be an issue, because older-generation cards weren't capable of saturating an x8 slot anyhow. But it is now with the newer GPUs.

(3) This limitation is 'fixed' with the new X58/ICH10, which can support 3 x16 slots.



Now, the two issues here I'm having trouble understanding:


The first is where's the part about "The User/Builder Is Responsible For Purchasing The Parts To Support His/Her Intended Usage"?

- - Simply put, if I want to build/upgrade a system, then it is my responsibility to suss out the details. At that point, it's on me to acknowledge some limitation and either replace the offending part or choose to live with it.


The second is: "How is it fair to hold a manufacturer - whose product was perfectly adequate at the time of its release - retroactively responsible because progression has created the *potential* for a bottleneck, should the user not pay attention to component details?"


- - Should be self-explanatory.


Because (to be perfectly blunt), this sounds like the same jack~a$$~hattery that gets us the endless tirades against Microsoft because their brand-new OS doesn't run on the 15-year-old sh*tbox upon which "The Moron Of The Day" chooses to inflict it.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I'm not talking about X58 OR P55; I'm talking about Lynnfield, as I figure that's what this thread is about.

With its on-die PCIe controller, Lynnfield is 1/2 speed PCIe for all single cards, and I assume 1/4 speed for dual cards (if that is even an option on the platform).
OK, now we know where the confusion is.
P55 is Lynnfield, and Lynnfield is full speed for a single card and 1/2 speed for dual cards, while X58 is full speed for a single card and full speed for dual cards.
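To make that lane split concrete, here's a minimal sketch under assumed lane totals (not stated in the thread): Lynnfield/P55 exposes 16 PCIe 2.0 lanes from the CPU (one x16, or x8/x8 for two cards), while X58 has 36 chipset lanes and can feed two full x16 slots:

```python
# Sketch of per-card lane allocation. Lane totals are assumptions:
# 16 CPU lanes on Lynnfield/P55, 36 chipset lanes on X58.
def lanes_per_card(total_lanes: int, cards: int, max_per_slot: int = 16) -> int:
    """Split the platform's lanes evenly across cards, capped at x16 per slot."""
    return min(total_lanes // cards, max_per_slot)

for platform, lanes in [("P55/Lynnfield", 16), ("X58", 36)]:
    for cards in (1, 2):
        print(f"{platform}, {cards} card(s): x{lanes_per_card(lanes, cards)} per card")
# P55/Lynnfield: x16 single, x8/x8 dual; X58: x16 single, x16/x16 dual.
```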
 