First of all, I said it's a professional card, so the cost is doubled simply because companies will pay it and can write it off as a business expense.
They could make a 48 GB 7900 XTX if they really wanted to, but they'd rather sell you a $4,000 W7900 than a $1,500 (or probably less) 7900...
Since both AMD and NVidia seem to be allergic to releasing anything for under $200, Intel could actually gain some market share and goodwill by making smaller GPUs targeted at the lower end of the market. Even some people who'd normally stick with AMD/NV might grab a low-cost Intel card just to...
So the rest are bad, and the 7800 XT, the one exception, isn't exactly good either. Like you, I would greatly prefer that it cost only $350 and that the rest of the market shift right along with it, but I don't see that coming to pass anytime soon. If anything we'll be lucky to see the new price tiers...
Yeah, it's the same person; apparently he changed his account name. There are a few people who have done it for whatever reason. Unless there's some other way to tell that I'm not aware of, you'd have to go back and look at an old post where someone replied to them, since the old name shows up in any quoted text from a...
I'm waiting for the 13" model that I ordered to ship, but after reading some of the reviews I'm definitely excited for it to arrive.
I'm somewhat curious to see whether improvements in the media engines mean that it could actually encode any of my Final Cut projects faster than my M1 MBP. I...
Maybe there's a market there for that, but anyone doing anything for an actual job is probably better off spending the money to get an H100 or MI300 that has even more memory, far more resources, and the bandwidth to keep all of the hardware fed.
Running LLMs is going to be done by an NPU...
I think that it'll be most popular with CAD users or people dealing with 3D graphics.
For serious AI work you'll want a top-end workstation if you're doing the work locally. If you aren't, a $200 Chromebook that can SSH into a workstation is about as good as anything else.
I disagree. If someone says they'll make something that's 3x better than anything else on the market, but they only deliver something that turns out to be 2x as good, is it not a good product? It's still 2x better than anything on the market. What if they never told you it was supposed to be 3x better...
The M3 Max has a lot of the die devoted to the GPU, but for encoding video/audio it'll use dedicated hardware blocks, assuming it's a codec Apple supports. That's what makes it so fast. There are certainly users who need all of those GPU cores, but for a lot of Apple users they're seldom going to be...
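If you're curious what that looks like in practice, here's a minimal Swift sketch (assuming macOS and the VideoToolbox framework; the 4K/HEVC parameters are just illustrative) of how an app explicitly asks for one of those dedicated encoder blocks:

```swift
import VideoToolbox

// Request a compression session that *requires* the hardware
// (media engine) encoder. If the codec isn't one Apple's hardware
// supports, creation fails and the app would fall back to software.
var session: VTCompressionSession?
let spec = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder as String: true
] as CFDictionary

let status = VTCompressionSessionCreate(
    allocator: kCFAllocatorDefault,
    width: 3840,                       // illustrative 4K frame size
    height: 2160,
    codecType: kCMVideoCodecType_HEVC, // a codec the media engine handles
    encoderSpecification: spec,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,               // frames would be fed later via
    refcon: nil,                       // VTCompressionSessionEncodeFrame
    compressionSessionOut: &session
)

print(status == noErr
    ? "Hardware HEVC encoder available"
    : "No hardware encoder for this codec; software fallback needed")
```

Presumably Final Cut does something like this under the hood, which is why export speed tracks the media engines rather than GPU core count.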
I wouldn't read too much into anything related to what the market wants based on purchases of Apple hardware. The big APU is just half of the equation and a lot of that big APU is the dedicated hardware acceleration for the sort of things Mac users tend to do with their Macs. The average PC...
Probably improvements in some of the dedicated logic Apple bakes into the hardware. They probably found a faster implementation that uses more transistors, which the node shrink gave them room for. It might be something that was in M3.
Didn't really matter as long as it was profitable for the miner, which it was.
I think it was mostly supply constraints at the time. TSMC didn't have spare wafers, everyone was trying to get substrate, which was apparently in very short supply for a few months, AMD had to fulfill agreements...
I wonder if I can create a time paradox by making up a rumor and attributing it to an MLID video only for him to use that rumor as the basis for the video.
Yeah, it's a bunch of random guys in their basements strategic command bunkers!
They'd probably be best off breaking into the data center and professional markets first. The margins there are high enough to support the costs of development. They'll still need GPU tech for all of their desktop/laptop CPUs, and they can spend the time getting drivers ironed out there before...
The demise of Intel is vastly overstated. They might be taken out behind the woodshed next generation due to a confluence of AMD having a great new product and Intel being unable to juice theirs quite as much, but it's nowhere near as bad as what AMD crawled through during the Bulldozer era when...