
AMD's next big Mantle Game

We need a new RTS game; I was disappointed with Generals 2 getting scrapped.

There hasn't been anything decent since C&C Generals and Forged Alliance IMO.
 
Yes, I can see what you mean about 'across the board' conflicting with various titles, though I don't see the comparison between the two (the recent Nvidia driver / Mantle) when, as far as I'm aware, Mantle doesn't yet support the titles that you linked?
 
Yes, I can see what you mean about 'across the board' conflicting with various titles, though I don't see the comparison between the two (the recent Nvidia driver / Mantle) when, as far as I'm aware, Mantle doesn't yet support the titles that you linked?

That's right, it doesn't, in the same way this new driver from Nvidia is selective about the games it improves.

This driver is not in competition with Mantle; it certainly does not have similar results to Mantle.

Mantle is what it is, and Nvidia's drivers are what they are.

I was asked what Mantle is and what it does; I gave the requested answer. I said nothing at all about Nvidia's drivers; it has nothing to do with them.
I was not the one making the comparison. Despite this, yet again it was turned into Nvidia vs AMD for completely invented reasons. You cannot turn around in this forum without that crap happening!
 
My curiosity was generated when you wrote: 'I said Nvidia's driver does see improvements "in some games"; andybird said it gives similar performance to Mantle. It doesn't, not even close.'

Hence my question about the comparison. How can it be 'not even close' when there's nothing to compare? (AFAIK.)
 
My curiosity was generated when you wrote: 'I said Nvidia's driver does see improvements "in some games"; andybird said it gives similar performance to Mantle. It doesn't, not even close.'

Hence my question about the comparison. How can it be 'not even close' when there's nothing to compare? (AFAIK.)


If there is nothing to compare, if the driver does not support the same games, then it's not similar performance in that sense, nor is it similar performance in any other sense, as it does not give the same level of CPU scaling in anything it does scale in.
 
That's not what I said.
I said Nvidia's driver does see improvements "in some games"; andybird said it gives similar performance to Mantle. It doesn't, not even close.

Since my quote of you was a direct quote of what you did say, it is what you said.


Nvidia's new driver only gives a boost where the CPU is already very powerful, like a £450 3960K; it does not help with CPUs that do not cost £450. That's why Nvidia only used the 3960K in their PR.

As I guess you may not have an Nvidia 700 series card to try, I'm informing you there are boosts with CPUs less powerful than a 3960K.
 
The Star Swarm demo is a little misleading, as it also supports an option called "deferred contexts" which, when enabled on Nvidia machines, gives a similar performance boost... The most recent Nvidia drivers also give a good boost in this demo.

So far Mantle seems to give around a 5% boost in a lot of situations, and bigger when severely CPU limited, like a low-end CPU with a high-end GPU, or Crossfire.

Basically it is a handy little boost in two games, with a few others announced; in all likelihood it will be superseded by DX12.

Still, Civ games have historically been CPU limited, so it's a good choice for Mantle marketing. It will be interesting to see if they use command lists again (they made a pair of 580s exceed a pair of 7970s in Civ 5).

Dan Baker of Oxide Games helped make DX11, and he was also the graphics lead on Civilization 5. Civ 5's DX11 mode has some of the highest draw call counts of any DX11 game ever made, so he knows a thing or two about how DirectX 11 works. He was also a pioneer and an industry expert in experimenting with DX11 deferred contexts, another technique that doesn't improve performance like anyone thought it should, but NVIDIA still pushes developers to use it. Again, why? Because deferred contexts improve performance for the first five minutes of gameplay, then the accumulated contexts eventually slow the framerate below that of an engine without them. But how long do benchmarks take? ;) Shrewd, no? You should read Dan's comments in this interview. I think you'll find them very enlightening.
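For reference, this is roughly what the "deferred contexts" pattern being discussed looks like at the API level. A minimal C++ sketch of the DX11 flow, stripped of any real rendering (assumes the Windows SDK; error handling kept to the bare minimum):

// Deferred-context pattern in D3D11: record commands on a deferred context
// (normally from a worker thread), bake them into a command list, then
// replay it on the immediate context. No swap chain or drawing; API flow only.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, &immediate);
    if (FAILED(hr)) { std::printf("No D3D11 device\n"); return 1; }

    // A deferred context records commands instead of executing them.
    ID3D11DeviceContext* deferred = nullptr;
    if (SUCCEEDED(device->CreateDeferredContext(0, &deferred))) {
        deferred->ClearState();  // a real engine records state + draw calls here

        // Bake the recorded commands into a command list...
        ID3D11CommandList* cmdList = nullptr;
        if (SUCCEEDED(deferred->FinishCommandList(FALSE, &cmdList))) {
            // ...and replay it on the immediate context (main render thread).
            immediate->ExecuteCommandList(cmdList, FALSE);
            cmdList->Release();
        }
        deferred->Release();
    }

    immediate->Release();
    device->Release();
    return 0;
}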
 
There are boosts with CPUs less powerful than a 3960K.

This. If nothing else, the CPU is no longer constantly required for shader compilation due to the new shader cache. Even if your processor can handle that, it leaves more CPU available for physics, AI, animations...
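To illustrate the shader cache idea, here's a minimal, purely illustrative C++ sketch of the concept: compile once, key the binary by a hash of the source, and serve repeat requests from the cache so the CPU isn't spent recompiling. This is only the general technique, not the actual 337.50 driver implementation:

#include <cstdio>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Stand-in for an expensive driver-side shader compile.
static std::vector<unsigned char> compile_shader(const std::string& source) {
    std::printf("compiling shader (expensive, CPU-bound)...\n");
    return std::vector<unsigned char>(source.begin(), source.end());
}

class ShaderCache {
public:
    const std::vector<unsigned char>& get(const std::string& source) {
        const size_t key = std::hash<std::string>{}(source);
        auto it = cache_.find(key);
        if (it != cache_.end()) return it->second;  // hit: no recompilation
        return cache_.emplace(key, compile_shader(source)).first->second;
    }
private:
    std::unordered_map<size_t, std::vector<unsigned char>> cache_;
};

int main() {
    ShaderCache cache;
    cache.get("float4 main() : SV_Target { return 1; }");  // compiles once
    cache.get("float4 main() : SV_Target { return 1; }");  // served from cache
    return 0;
}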

You either see a boost with 337.50 - whether it's 5% or 50% is another matter entirely - or you're doing it wrong.

Any gains I've mentioned in other threads are on my stock 1090T (had to return my Z87 board, faulty)
 
Some people have recorded some MASSIVE gains from the new Nvidia driver...
I'm also pretty sure you're wrong about the driver too.

And let's not get into blanket statements when you're calling Mantle the only low-level API in existence.

It is the only low-level API for PC ATM. Glide was the last one, and that was over a decade ago.

Whether DX11 will match Mantle due to improvements is one thing, but DX11 is not low level AFAIK.

Edit!!

You might be right. There is actually at least one other in development, and technically X3D is meant to be low level.

As I guess you may not have an Nvidia 700 series card to try, I'm informing you there are boosts with CPUs less powerful than a 3960K.

There are boosts with the newer NV drivers (I saw a few percent), but what about £70 CPUs with £70 cards?

Mantle is about AMD making their own CPUs more competitive.

Some of the gains can be massive with lower-end setups.

The only problems I can see are that of course it is locked to AMD, and that Thief seems a meh game.

TH tests Thief with Mantle:

http://www.tomshardware.com/reviews/thief-mantle-benchmarks,3773.html

An FX4170 just about matches a Core i7 4770K!

[Three benchmark charts from the linked Tom's Hardware article.]


Now, if those are the kinds of gains we are seeing on cheap setups, then what about games like Star Citizen and the one in the OP?

It could be the difference between needing a new system and just buying a new graphics card.

Thief seems to be quite lightly threaded, and with Mantle anyone with an AMD CPU or an older Intel one is going to see decent gains, even with lower-end cards.

There are plenty of people using Core 2 Quads and older Core i3 CPUs (Core i3 530 and the like), for example, who are going to see a gain.

DX12, Mantle, etc. are going to show the biggest gains with older systems and budget systems - it means they can get another round of graphics card upgrades.

Remember, the vast majority of gamers worldwide are not hardware enthusiasts, and do not upgrade hardware as much as people on UK-based computer forums like OcUK.
 
Are you being serious? It's an open forum.

Tell you what Humbug, I'll let you crack on with it.

You have not been following the Nvidia threads though, have you? Look at page 4 of the 337 thread, at the deleted posts and why they came along.

It was because one poor chap posted a link to an article. I bet he wishes he had not. The worst thing is that instead of looking at the articles linked to in the report, he was just jumped on and the mods needed to delete the posts.


Anyone who does not see magical massive gains with the drivers is instantly castigated as "doing it wrong" and so forth. Yet the same people repeatedly ignore it when big gains are seen with AMD's own whatsit on cheap, non-high-end setups.

This website tested a GTX760:

http://blackholetec.com/drupal7/article/review-nvidia-geforce-33750-beta-driver

It was in the article he linked to.

The chap used an A10 and a stock and underclocked Core i7 2600.

The GTX760 can be had for as low as £150.

It is the kind of graphics card someone might buy when upgrading an older system.

The A8 7600 approximates well to an older Core2 Quad/Phenom II X4.

The only decent boost was seen with Skyrim (20%) with the Core i7 at 1.9GHz, but nothing was seen with the A8 7600. All the other games he tested did not really show any gains.

This is why I will try to bench Skyrim with my setup, even though I didn't see many gains with other games, even after upgrading from a five-month-old driver to the latest one (I got jumped on too). There are a few others who did not see massive gains either and were ignored too.

But the gains from Thief were hilariously huge with £70 to £130 graphics cards and older/weaker CPUs with the newer AMD drivers.

With the FX4170, you are seeing up to 40% improvements in average framerates and 60% improvements in minimums with an R9 270, which is like £120 to £130.

Even a poxy HD7770 saw improvements.

That is with a CHEAP card.

It blows the whole "no one will use an R9 290/GTX780 with a Pentium" argument out of the water. Yet with a £70 to £80 CPU equivalent and a £70 to £130 graphics card we see decent gains. Of course it's deliberately ignored by many, repeatedly, so they can keep saying "no one will use an R9 290/GTX780 with a Pentium" and so forth.

Unless of course people think the few websites showing these kinds of gains are lying toads paid by AMD (convenient).

What if DX12 shows these kinds of improvements? It could be the start of a golden age for PC gaming, as people will need to worry less about their hardware and more about the games.

Unfortunately, this is kind of lost in the high-end benchmark war.

Anyway, that's me done with the thread (hopefully).
 
That's not what I said.
I said Nvidia's driver does see improvements "in some games"; andybird said it gives similar performance to Mantle. It doesn't, not even close.

I said deferred contexts give similar gains in STAR SWARM, which they do: 100% gains in average FPS, which is what your slide shows.
 
I love how so far all of the Mantle titles are games I want to play; I've been addicted to Civ 5 and I'll be addicted to the next one.

Well done AMD for, by pure chance, backing everything I want :-D
 
When this game hits...

1. Why is Mantle delayed in this game?
2. When will it be patched?
3. Why is it not working?
4. Will it support my card?
 
More than Mantle at present, though. So it's hard to compare the two, surely?

It's not hard. Bench Thief, bench BF4 where some people said they saw gains. In the latter, go into an MP server, to a CPU-limited zone (near point B on the road, for instance, or D in the corner) and see what FPS you get.
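For the measurement itself, a small self-contained C++ sketch of how an average-FPS figure falls out of logged frame times; the sample numbers are made up for illustration:

#include <cstdio>
#include <vector>

int main() {
    // Hypothetical per-frame times (ms) logged while standing in the CPU-limited spot.
    std::vector<double> frame_ms = {16.7, 18.2, 22.4, 19.9, 17.1, 25.3};

    double total_ms = 0.0;
    for (double ms : frame_ms) total_ms += ms;

    // Average FPS = frames rendered / total seconds elapsed.
    std::printf("Average FPS over %zu frames: %.1f\n",
                frame_ms.size(), frame_ms.size() / (total_ms / 1000.0));
    return 0;
}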
 
It's not hard. Bench Thief, bench BF4 where some people said they saw gains. In the latter, go into an MP server, to a CPU-limited zone (near point B on the road, for instance, or D in the corner) and see what FPS you get.

It's not the procedure, it's the number of comparisons, which at present are very few. Of course, that will change when Mantle gets more titles under its belt, which can only be a good thing. I suppose having another API is the next best thing to us having a third player in the graphics market.
 
When this game hits...

1. Why is Mantle delayed in this game?
2. When will it be patched?
3. Why is it not working?
4. Will it support my card?

Pretty much what I'm expecting, which is why Mantle excites me very little. The day they ship a game with full support is the day I get excited.
 
Star Swarm test on rig in sig, 1080p, extreme.

Default:

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 13234

Average FPS: 36.76
Average Unit Count: 4038
Maximum Unit Count: 5474
Average Batches/MS: 714.64
Maximum Batches/MS: 1631.72
Average Batch Count: 19742
Maximum Batch Count: 122244
===========================================================

Turning Deferred Contexts on:

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 17090

Average FPS: 47.47
Average Unit Count: 4291
Maximum Unit Count: 5641
Average Batches/MS: 971.05
Maximum Batches/MS: 2874.18
Average Batch Count: 21390
Maximum Batch Count: 112223
===========================================================

That's a 30% improvement.

Anyway, it's about time somebody kicked Microsoft's arse into a thorough reworking of DX and eliminating the bloat, and I'm glad AMD have picked up the gauntlet.
 
Dan Baker of Oxide Games helped make DX11, and he was also the graphics lead on Civilization 5. Civ 5's DX11 mode has some of the highest draw call counts of any DX11 game ever made, so he knows a thing or two about how DirectX 11 works. He was also a pioneer and an industry expert in experimenting with DX11 deferred contexts, another technique that doesn't improve performance like anyone thought it should, but NVIDIA still pushes developers to use it. Again, why? Because deferred contexts improve performance for the first five minutes of gameplay, then the accumulated contexts eventually slow the framerate below that of an engine without them. But how long do benchmarks take? ;) Shrewd, no? You should read Dan's comments in this interview. I think you'll find them very enlightening.

Like everything else in this, it's all smoke and mirrors on the NV side.

I said deferred contexts give similar gains in STAR SWARM, which they do: 100% gains in average FPS, which is what your slide shows.

Which it doesn't.

Star Swarm test on rig in sig, 1080p, extreme.

Default:

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 13234

Average FPS: 36.76
Average Unit Count: 4038
Maximum Unit Count: 5474
Average Batches/MS: 714.64
Maximum Batches/MS: 1631.72
Average Batch Count: 19742
Maximum Batch Count: 122244
===========================================================

Turning Deferred Contexts on:

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 17090

Average FPS: 47.47
Average Unit Count: 4291
Maximum Unit Count: 5641
Average Batches/MS: 971.05
Maximum Batches/MS: 2874.18
Average Batch Count: 21390
Maximum Batch Count: 112223
===========================================================

That's a 30% improvement.

Anyway, it's about time somebody kicked Microsoft's arse into a thorough reworking of DX and eliminating the bloat, and I'm glad AMD have picked up the gauntlet.

DirectX

===========================================================
Oxide Games
Star Swarm Stress Test - ©2013
C:\Users\########\Documents\Star Swarm\Output_14_04_11_1437.txt
Version 1.10
04/11/2014 14:37
===========================================================

== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: AuthenticAMD
AMD Phenom(tm) II X6 1090T Processor
Physical Cores: 6
Logical Cores: 6
Physical Memory: 8549400576
Allocatable Memory: 140737488224256
===========================================================


== Configuration ==========================================
API: DirectX
Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: C:\Users\#####\Documents\Star Swarm\FrameDump_14_04_11_1437.csv
===========================================================

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 10007

Average FPS: 27.78
Average Unit Count: 4033
Maximum Unit Count: 5460
Average Batches/MS: 463.11
Maximum Batches/MS: 933.67
Average Batch Count: 18124
Maximum Batch Count: 110446
===========================================================

Mantle

===========================================================
Oxide Games
Star Swarm Stress Test - ©2013
C:\Users\####\Documents\Star Swarm\Output_14_04_11_1444.txt
Version 1.10
04/11/2014 14:44
===========================================================

== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: AuthenticAMD
AMD Phenom(tm) II X6 1090T Processor
Physical Cores: 6
Logical Cores: 6
Physical Memory: 8549400576
Allocatable Memory: 140737488224256
===========================================================


== Configuration ==========================================
API: Mantle
Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: C:\Users\#####\Documents\Star Swarm\FrameDump_14_04_11_1444.csv
===========================================================

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 20453

Average FPS: 56.81
Average Unit Count: 4635
Maximum Unit Count: 5596
Average Batches/MS: 941.75
Maximum Batches/MS: 3639.21
Average Batch Count: 20102
Maximum Batch Count: 146378
===========================================================


100% improvement.
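For anyone checking the arithmetic, a quick C++ sketch reproducing both improvement figures from the numbers posted in this exchange (roughly 30% for deferred contexts and roughly 100% for Mantle over DirectX):

#include <cstdio>

int main() {
    const double dc_off = 36.76, dc_on = 47.47;  // Gregster's run, DC off vs on
    const double dx = 27.78, mantle = 56.81;     // 1090T run, DirectX vs Mantle

    std::printf("Deferred contexts: +%.1f%%\n", (dc_on - dc_off) / dc_off * 100.0);  // ~29.1%
    std::printf("Mantle vs DirectX: +%.1f%%\n", (mantle - dx) / dx * 100.0);         // ~104.5%
    return 0;
}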
 
I get 30s with it off and 60s with DC and the new drivers, and Gregster is still top of the score table with 70-odd.
And if I turn off the pointless motion blur (which I would in every game anyway) I get 125+ FPS.

I'm not sure what you are really trying to prove with Star Swarm, other than that it is an awful example to try to make any kind of point with, as it is all over the show.

Last time you ran the Star Swarm demo and got 50-ish you were telling us it was a 38% improvement, so I'm not sure why your DX score has tanked in the meantime.
 