The AMD Mantle Thread

Page 142
Status
Not open for further replies.

desprado

Golden Member
Jul 16, 2013
1,645
0
0
Marketing slides? I'll believe it when I see it.


While there's no doubt that the DirectX API has limitations that should be addressed, that "marketing slide" makes it appear much worse than it really is.

Look at BF4. On PC, you get far superior frame rate, graphics and overall experience compared to the PS4 and Xbox One. The PS4 and Xbox one cannot even do 1080p with scaled back visual fidelity, and they still regularly fall far below 60 FPS in large multiplayer battles despite having higher draw call capacity and supposedly greater efficiency.
That is what I am talking about. Even a 7850 can run BF4 at 45-50 FPS on high settings at 1080p, while the PS4 barely runs BF4 at 900p, 60 FPS, and high-to-mid settings.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You can only get those very high frame rates when you have a heavily overclocked recent Intel CPU. Even 4.5GHz Haswell chips run at 90-95% CPU usage. For the majority of gamers who do not have this setup, lower CPU usage is a big deal.

I think you're exaggerating a bit. Yes, BF4 multiplayer is very CPU bound during large battles due to the extra work the CPU has to put in, not just for draw call submission, but for physics processing for destruction and game state tracking as well.

But I don't think it "requires" an overclocked CPU to get great frame rates. According to GameGPU, you only see really high CPU usage on dual-core and non-hyperthreaded quad-core CPUs.
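The arithmetic behind this kind of CPU bottleneck can be sketched with a toy model. The per-call overheads, draw counts, and "other work" figures below are made-up illustrative numbers, not measured DX11 or Mantle data:

```python
# Illustrative back-of-envelope model of CPU-side frame cost:
# API submission overhead per draw call, plus everything else the
# CPU does (physics, game state). All numbers are assumptions.

def cpu_frame_ms(draw_calls, overhead_us_per_call, other_work_ms):
    """Total CPU frame time in milliseconds."""
    return draw_calls * overhead_us_per_call / 1000.0 + other_work_ms

BUDGET_60FPS_MS = 1000.0 / 60.0   # ~16.7 ms per frame

# Same scene, same 6 ms of physics/game-state work; only the
# assumed API overhead per draw call differs.
thick_api = cpu_frame_ms(10_000, overhead_us_per_call=1.5, other_work_ms=6.0)
thin_api = cpu_frame_ms(10_000, overhead_us_per_call=0.2, other_work_ms=6.0)

print(f"thick API: {thick_api:.1f} ms (over 60 fps budget: {thick_api > BUDGET_60FPS_MS})")
print(f"thin API:  {thin_api:.1f} ms (over 60 fps budget: {thin_api > BUDGET_60FPS_MS})")
```

With these made-up numbers, the same scene blows the 16.7 ms budget under the heavier API but fits comfortably under the thinner one, which is the shape of the argument for lower-overhead APIs.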



 

Spjut

Senior member
Apr 9, 2011
931
160
106
When DICE was creating BF3, NVidia hadn't yet released multithreaded drivers, so that probably had a lot to do with why they couldn't implement it properly. It was also quite a difficult thing for NVidia to do. Since then, the drivers' multithreaded capabilities have certainly improved significantly.

As for BF4, I think DICE could have done it if they wanted to, but since it was a G.E (Gaming Evolved) game, they didn't.

Not necessarily. After all this time, AMD still hasn't added support for multithreaded rendering in their drivers. Like I said before, it's a difficult undertaking, and to be frank, I think their driver team lacks either the resources or the competence to do so.

This recent PClab.pl review shows that NVidia's drivers are FAR more optimized for multithreading than AMD's. That has helped NVidia big time, as their hardware has a significant advantage on multithreaded game engines; and the next generation of game engines will all be multithreaded.

BF3 was released in October 2011, and Nvidia's drivers were publicly known to support multithreaded rendering since April 2011 (the time of Ryan Smith's post). And that was for us normal users; DICE has continuous contact with AMD's and Nvidia's driver teams throughout development.

Come on now, Johan Andersson at DICE is even one of the driving forces behind Mantle. Do you really think he would go through all the extra work of adding Mantle if BF4 could simply have used DX11 multithreaded rendering instead?
DICE requiring DX11 multithreaded rendering for BF3 and BF4 would be hard for AMD to say no to, and DICE has every reason to keep development costs down by just using DX11.

And here is where you lose me completely. If you think AMD's driver team lacks the competence or resources to support multithreaded rendering, then how can they be good enough to create a whole new lower level API?

And why would AMD, DICE, Oxide and Nixxes all spend resources on Mantle, which has the added negative of working only on GCN-based cards, if the solution was simply to use multithreaded rendering or OpenGL instead?
If everyone in the business wanted multithreaded rendering, and especially if big players like EA/DICE were pushing for it, AMD wouldn't be in a position to say no.

And in the link you posted, don't mistake those differences for DX11 multithreaded performance at work. The only games where the GTX 780 has a clear lead are The Witcher 2 (which is also DX9 only) and BF4 when using quad cores. Crysis 3 performs better too when comparing the quad-core results, but the R9 290X on the dual core actually beats the 780 on both the dual and the quad.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Why are there such horrible LOD and draw distance issues in the game then? DICE have truly optimized the crap out of the game to get it to run and look as good as it does, but it's anything but perfect.

Nothing made by man is perfect, but what are these horrible LOD and draw distance issues you're talking about?

I've played the single player campaign several times, plus I've dabbled in the multiplayer as well, and I've never noticed draw distance issues.

Is there pop-in? Yes, any engine that streams data (as all modern game engines do) is going to have some level of pop-in. But it's not very noticeable in my opinion.

If you have really bad pop-in, it's probably because you don't have enough VRAM and system memory. Textures and objects are cached in VRAM for quick deployment when needed. If you don't have enough VRAM, they have to be stored in main memory, which takes longer to load from. And if you don't have enough system memory, they have to be loaded from the HDD or SSD, which is slower still.

On a 2GB card, you will get notably worse pop-in than on a card with 3 or 4GB. On my 4GB GTX 770s, I've noticed BF4 routinely uses over 2GB.
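The fallback chain described above (VRAM, then system RAM, then disk) behaves like a tiered cache. Here is a toy sketch of that idea with made-up latencies; it is an illustration, not how Frostbite actually manages memory:

```python
# Toy tiered asset cache: serve from VRAM if resident, else system
# RAM, else "disk", with progressively worse invented latencies.

TIER_LATENCY_MS = {"vram": 0.1, "ram": 2.0, "disk": 20.0}

class AssetCache:
    def __init__(self, vram_capacity):
        self.vram = {}                  # fast, small
        self.ram = {}                   # slower, larger
        self.vram_capacity = vram_capacity

    def load(self, asset_id):
        """Return (tier served from, simulated latency in ms)."""
        if asset_id in self.vram:
            return "vram", TIER_LATENCY_MS["vram"]
        if asset_id in self.ram:
            self._promote(asset_id, self.ram.pop(asset_id))
            return "ram", TIER_LATENCY_MS["ram"]
        # Miss everywhere: "read from disk", then cache it.
        self._promote(asset_id, data=object())
        return "disk", TIER_LATENCY_MS["disk"]

    def _promote(self, asset_id, data):
        # If VRAM is full, spill the most recently added asset to RAM.
        if len(self.vram) >= self.vram_capacity:
            evicted, evicted_data = self.vram.popitem()
            self.ram[evicted] = evicted_data
        self.vram[asset_id] = data
```

With a larger `vram_capacity`, more loads are served from the fast tier, which is the mechanism behind cards with more VRAM showing less pop-in.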

What if Mantle lets them draw everything in the world completely while still getting great performance? No more pop-in, no more LOD switching. That would be a killing blow to Nvidia if that's the case. I don't know if that is the type of thing Mantle can make possible, but if it is, no one who is serious about gaming will use Nvidia hardware anymore.
Mantle can't do that, because draw distance is limited by the hardware as well; plus, as I mentioned earlier, data streaming is a function of the engine, so there will ALWAYS be some level of pop-in. A well-designed native DX11 engine will hide it well though, and BF4 seems to do that provided you have the hardware.

If you have a 1GB video card and only 4GB of system memory, however, you're going to have really bad object and texture pop-in, and no API is going to help you.
 
Last edited:

Stewox

Senior member
Dec 10, 2013
528
0
0
"Percieved effect" is not a metric I care for...no should any technical person.

Customers do. That includes me.

"Percieved" might not be the correct term here.

Well if rumors are to be believed, AMD paid a nice sum of money to EA to implement Mantle in their Frostbite 3 engine.

:|


And here is where you lose me completely. If you think AMD's driver team lacks the competence or resources to support multithreaded rendering, then how can they be good enough to create a whole new lower level API?

You lack understanding of what driver development is like. DX and OGL drivers are hacks, mediocre hacks. The driver developer has no access to the game's source code, so they guess, and guess some more, and hopefully they guess right and it works. Add the back-and-forth clusterpick of communication and it's an extremely inefficient way to optimize games, which are heavily impacted by each day a released game does not work.

It boggles the mind why people would defend this trainwreck. Let the failtrain crash already. Mantle is coming! With all its glory and all its powah.



So is this now a 2014 thing?


I don't care anymore, no more time for this bleh. I'm jogging (faster pace), reading a book (human race get off your knees), baking holiday pastry, even more exercising downstairs in the basement while listening to the Alex Jones radio show on an old Nokia cellphone via a DD-WRT flashed and modified Linksys WRT320N router with a DIY external antenna on a 3 meter H1000A extension cable ...:awe:
Right now: morning, recovering data from a friend's half-broken HDD, listening to Bob Marley - Get Up, Stand Up; drinking warm chamomile tea with raw lemon juice and Beyond Tangy Tangerine 2.0, scanning the HDD with chkdsk for bad sectors, oh and I just turned on that Himalayan salt crystal lamp to clean the air in my room, with a nice anti-electrosmog effect ^_^

There is no way I'm going to care about Mantle at all until like the 3rd of January or something. What the heck were they thinking? Better not release it now, it can only make me mad.

... oh, looks like the WMP playlist switched to AC/DC - For Those About To Rock :whistle:
 
Last edited:

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
You lack understanding of what driver development is like. DX and OGL drivers are hacks, mediocre hacks. The driver developer has no access to the game's source code, so they guess, and guess some more, and hopefully they guess right and it works. Add the back-and-forth clusterpick of communication and it's an extremely inefficient way to optimize games, which are heavily impacted by each day a released game does not work.
Indeed, this is one of the things developers have been praising about Mantle.
If something doesn't work, you know it's most likely in your own code. That's a huge step forward and a time saver.

You can actually code your game and not worry about features that simply do not work with current drivers, or that a new driver will 'optimize' how you access data and suddenly break everything.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
BF3 was released in October 2011, and Nvidia's drivers were publicly known to support multithreaded rendering since April 2011 (the time of Ryan Smith's post). And that was for us normal users; DICE has continuous contact with AMD's and Nvidia's driver teams throughout development.

Yeah, but the game (and particularly the engine) had been in development for two years. Anyway, the reason I said what I said is that DICE had some slides where Repi lamented that they had waited two years for multithreaded drivers and were still waiting:

Look at slide number 34.

Come on now, Johan Andersson at DICE is even one of the driving forces behind Mantle. Do you really think he would go through all the extra work of adding Mantle if BF4 could simply have used DX11 multithreaded rendering instead?
DX11 multithreading isn't on the same level as Mantle. Mantle would be much more effective at increasing draw calls and reducing overhead, as it doesn't carry the legacy baggage that Direct3D does.

But I think it was also a political decision. AMD doesn't have multithreaded drivers, so using that feature would have given NVidia a huge advantage until Mantle shipped.

DICE requiring DX11 multithreaded rendering for BF3 and BF4 would be hard for AMD to say no to, and DICE has every reason to keep development costs down by just using DX11.
Development costs don't matter as AMD is footing the bill.

And here is where you lose me completely. If you think AMD's driver team lacks the competence or resources to support multithreaded rendering, then how can they be good enough to create a whole new lower level API?
Making multithreaded drivers is apparently pretty damn complicated, and it took NVidia quite some time to implement themselves. I know for a fact that AMD has developed or toyed with multithreaded drivers, but for them it resulted in either no gains or smaller gains. If I remember correctly, I saw a post from an AMD employee on the Rage3D forums that stated as much.

But for NVidia, multithreaded drivers resulted in very significant performance gains. What am I to make of that?

And why would AMD, DICE, Oxide and Nixxes all spend resources on Mantle, which has the added negative of working only on GCN-based cards, if the solution was simply to use multithreaded rendering or OpenGL instead?
If everyone in the business wanted multithreaded rendering, and especially if big players like EA/DICE were pushing for it, AMD wouldn't be in a position to say no.
If AMD is footing the bill for development, then there's no financial risk involved. And Mantle would definitely be superior to DX11 multithreaded rendering as it's a thinner and more efficient API.

And in the link you posted, don't mistake those differences for DX11 multithreaded performance at work.
The link had nothing to do with DX11 multithreading specifically (which uses deferred context rendering and command lists), but with general driver multithreading.
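The DX11 feature being distinguished here can be sketched as follows: worker threads record command lists in parallel, and a single render thread replays them in order. This is a Python stand-in for the pattern (in D3D11 terms, recording on deferred contexts via FinishCommandList, then replaying with ExecuteCommandList on the immediate context), not real graphics code:

```python
# Sketch of the deferred-context pattern: parallel recording,
# serialized submission.

from concurrent.futures import ThreadPoolExecutor

def record_command_list(chunk_id, draws):
    """Worker: build a command list for one chunk of the scene
    (analogous to recording on a deferred context)."""
    return [f"draw(chunk={chunk_id}, mesh={d})" for d in range(draws)]

def execute(command_lists):
    """Render thread: replay every recorded list in submission order
    (analogous to ExecuteCommandList on the immediate context)."""
    submitted = []
    for cl in command_lists:
        submitted.extend(cl)
    return submitted

with ThreadPoolExecutor(max_workers=4) as pool:
    # Four threads record four command lists concurrently...
    lists = list(pool.map(record_command_list, range(4), [3, 3, 3, 3]))

# ...but submission stays serialized and deterministic.
frame = execute(lists)
print(len(frame))  # 12 draws total
```

The point of the pattern is that the expensive per-draw CPU work happens on the workers, while the final submission order stays deterministic regardless of which thread finished first.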

The only games where the GTX 780 has a clear lead are The Witcher 2 (which is also DX9 only) and BF4 when using quad cores.
Assassin's Creed IV and Crysis 3 as well.

Also, did you notice that the quad core was underclocked to 2.3GHz, and the dual core was overclocked to 4.6GHz? Clock speed definitely makes a big difference.

Anyway, the point I was making was that NVidia's drivers rely much more on multicore processors than AMD's, because they have a greater level of optimization. AMD has the edge with two threads, while NVidia gains the edge with more than two threads.

The games that performed best on NVidia were the ones that are strongly multithreaded: BF4, Crysis 3, The Witcher 2 and Assassin's Creed IV being the best examples.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
You are the one making assertions without any evidence, but I'll indulge you.

http://www.youtube.com/watch?v=fFkLvopNKRk#t=238

EVE Online runs well at max with ~100 ships and however many drones flying about, and that's on DX9. They just recently added DX11 support, so draw calls are even less of a bottleneck now.

Better, but still way behind Mantle. I don't see 100 ships, the camera distance is pretty far off, and the FPS doesn't look fantastic either. Playable, yes; jaw-dropping, no. I'd assume this is on a heavily overclocked CPU, while Mantle performance remains constant even with a 2GHz CPU and will continue to scale with cores.

Mantle will be a godsend for Eve, I'd be surprised if they weren't already working on it.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Better, but still way behind Mantle. I don't see 100 ships, the camera distance is pretty far off, and the FPS doesn't look fantastic either. Playable, yes; jaw-dropping, no. I'd assume this is on a heavily overclocked CPU, while Mantle performance remains constant even with a 2GHz CPU and will continue to scale with cores.

Mantle will be a godsend for Eve, I'd be surprised if they weren't already working on it.

I think you should read up on the UI of EVE...or just play the game before you say any more about it.
If you create an account I'll toss 10 million ISK your way...just to be helpful.
 

jj109

Senior member
Dec 17, 2013
391
59
91
Better, but still way behind Mantle. I don't see 100 ships, the camera distance is pretty far off, and the FPS doesn't look fantastic either. Playable, yes; jaw-dropping, no. I'd assume this is on a heavily overclocked CPU, while Mantle performance remains constant even with a 2GHz CPU and will continue to scale with cores.

Mantle will be a godsend for Eve, I'd be surprised if they weren't already working on it.

This is on DX9 with full shaders and effects on some random person's computer running FRAPS, not a stripped down game engine demo.

Yes, it is 100+ ships. You can see the 'battle report' which lists all participants. Also, you can see that all the ships are rendered within the pips even when zoomed out, but you clearly just scrubbed dismissively through the video without paying attention.

I'm done arguing this. It's a complete waste of time when you just make up stuff as you go.
 

Stewox

Senior member
Dec 10, 2013
528
0
0
Well if rumors are to be believed, AMD paid a nice sum of money to EA to implement Mantle in their Frostbite 3 engine.

Again.

Those aren't rumors; that's BS that was spread by fanboys, kids and other trolls across the web in every forum thread imaginable.

Actually, I didn't hear it when I listened to the whole talk, so I went and found a video where they wrote the question down ...

Actually these PR conspiracy theories are really big; it seems like a lot of people think it's too good to be true.


http://www.youtube.com/watch?v=3exPJu_F8xk&t=8m20s
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
http://www.youtube.com/watch?feature=player_detailpage&v=QIWyf8Hyjbg#t=1910

If you think that spreadsheet-like Eve battle at long range is anything like this then you're sadly mistaken I'm afraid.

Thousands of units running at 160+ fps on a single graphics card that is the limiter. With the same stuff on screen the Eve battle would be a complete slideshow.

Seriously check the video at 32:35 - http://www.youtube.com/watch?feature=player_detailpage&v=QIWyf8Hyjbg#t=1954 all those units, still running at 100 fps on a single GPU that hasn't even had any GPU-specific optimizations.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
http://www.youtube.com/watch?feature=player_detailpage&v=QIWyf8Hyjbg#t=1910

If you think that spreadsheet-like Eve battle at long range is anything like this then you're sadly mistaken I'm afraid.

Thousands of units running at 160+ fps on a single graphics card that is the limiter. With the same stuff on screen the Eve battle would be a complete slideshow.

If you think the graphics in EVE are rendered in Excel, then you are sadly mistaken.
Add servers, network, latency...and you will find out that mocking EVE is really stupid.
Just googling stuff about EVE will make you look uninformed to those of us who have played EVE for years, actually read the dev blogs and understand the engine...so please stop.

And sadly for you, I linked you to a video of a +3000 player EVE battle...and the game did NOT run at 1 FPS...but at 30+ FPS...real world...not a canned tech demo.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Found this amusing link from last year.

http://www.pcgamer.com/2012/03/30/valve-actively-seeking-linux-devs/

We are running into a bunch of performance issues in Linux drivers (e.g. 50 millisecond draw calls because the driver is compiling a shader).

We'd like to hire someone to work on these performance issues. If you know of anyone we should be talking to, I'd appreciate getting connected with them.

Gabe Newell
Valve, Bellevue​
He really should have been talking to AMD, I guess.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
If you'd been following the discussion, you'd know that this is the main reason for DX stalls, Lonbjerg. You know, that drop to about 5 fps in your precious ARMA 3, and why you won't see stuff like that with Mantle.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
If you'd been following the discussion, you'd know that this is the main reason for DX stalls, Lonbjerg. You know, that drop to about 5 fps in your precious ARMA 3, and why you won't see stuff like that with Mantle.

I run ARMA 3 @ 1600x1200, max settings except draw distance, which I set around ~7 km...and I don't drop under 30 FPS while gaming.

Why do you insist on making false claims about games you have never tried?

This is me, having fun...recording WHILE I play:
http://www.youtube.com/watch?v=SkHQlId8GfI&feature=c4-overview&list=UUb2qvg7CbWkPLE4wc9ULT4g

Show me where ARMA 3 is limited by draw calls in that video.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Because Lonbjerg, because you run the game at those settings, it's perfect, right?

ArmA 3 performance usually depends on the server I play on. Usually it's terrible, unless I play singleplayer. Even then, you can't add too many vehicles or people because the game slows down to a crawl.

Not really the best example.
 
Last edited:

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Because Lonbjerg, because you run the game at those settings, it's perfect, right?

ArmA 3 performance usually depends on the server I play on. Usually it's terrible, unless I play singleplayer. Even then, you can't add too many vehicles or people because the game slows down to a crawl.

Not really the best example.

Any online game depends on the server you log on to.
But are we again moving away from the API...and onto stuff like server performance, latency etc.?

If the API were the "bottleneck"...it would show in singleplayer mode too...
 
Feb 19, 2009
10,457
10
76
This is on DX9 with full shaders and effects on some random person's computer running FRAPS, not a stripped down game engine demo.

Yes, it is 100+ ships. You can see the 'battlereport' which lists all participants. Also, you can see that all the ships are rendered within the pips even when zoomed out, but you clearly dismissively scrubbed through the video without any attention.

I'm done arguing this. It's a complete waste of time when you just make up stuff as you go.

I've played a lot of EVE; it took them many years of optimization to be able to handle a 200-ship fleet battle above 15 FPS. Many years. Even in recent times, a 600-ship slugfest will bog down everyone's computer. They had to resort to displaying ships as icons/text, disabling independent turret tracking, making missiles merely decorative objects without physics/interactions, etc.

It was a good effort. But still extremely bottlenecked by current technology.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I've played a lot of EVE; it took them many years of optimization to be able to handle a 200-ship fleet battle above 15 FPS. Many years. Even in recent times, a 600-ship slugfest will bog down everyone's computer. They had to resort to displaying ships as icons/text, disabling independent turret tracking, making missiles merely decorative objects without physics/interactions, etc.

It was a good effort. But still extremely bottlenecked by current technology.

Try looking at the link I provided again...a +3000 ship battle.

What is up with the attempts to ignore facts over hyperbole?

Read some devblogs:
http://community.eveonline.com/news/dev-blogs/battle-for-6vdt-h/
 

Noctifer616

Senior member
Nov 5, 2013
380
0
76
I've played a lot of EVE; it took them many years of optimization to be able to handle a 200-ship fleet battle above 15 FPS. Many years. Even in recent times, a 600-ship slugfest will bog down everyone's computer. They had to resort to displaying ships as icons/text, disabling independent turret tracking, making missiles merely decorative objects without physics/interactions, etc.

It was a good effort. But still extremely bottlenecked by current technology.

I imagine the problem is similar to what happens in WoW. In 25-man Heroic raids, during the initial pull when everyone is using all their cooldowns and pots, you see the game go down to 15 fps or lower even on high-end machines. There were even input lag issues, which were resolved by lowering the number of heals in raids (pets not affected by group heals, some heals had their power increased while the number of ticks was decreased).

The game runs into CPU bottlenecks quite often in 25-man raids; you see the GPU sitting at 20-50% load while the game is between 10-20 fps.

I would imagine that Mantle would help a lot by enabling better multithreading, especially on systems with AMD CPUs.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I imagine the problem is similar to what happens in WoW. In 25-man Heroic raids, during the initial pull when everyone is using all their cooldowns and pots, you see the game go down to 15 fps or lower even on high-end machines. There were even input lag issues, which were resolved by lowering the number of heals in raids (pets not affected by group heals, some heals had their power increased while the number of ticks was decreased).

The game runs into CPU bottlenecks quite often in 25-man raids; you see the GPU sitting at 20-50% load while the game is between 10-20 fps.

I would imagine that Mantle would help a lot by enabling better multithreading, especially on systems with AMD CPUs.

Not for the server side of things...which is CPU land...it's not a GPU limitation, it's a CPU one...not tied to the graphics, FYI.

Is guesswork now the "tool" to promote Mantle?
 
Feb 19, 2009
10,457
10
76
You know what's funny: I played an old space RTS/RPG game, the Space Wolves series...and with only about 20 enemies on the screen with bullets/missiles, etc., it was lagging. GPU usage? Less than 10%.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Try looking at he link I provided again...a +3000 ship battle.

What is up with the attempts to ignore facts over hyperbole?

Read some devblogs:
http://community.eveonline.com/news/dev-blogs/battle-for-6vdt-h/

You do realize how freaking slow that battle was? There was so much time dilation (the EVE servers slow down time to deal with the insane strain) and the framerate was laughable.

It's pretty bottlenecked still. Mantle might solve that, though it obviously needs better servers too.
 