
Nvidia’s ace in the hole against AMD: Maxwell is the first GPU with full DirectX 12 support

By the time DX12 games launch, we'll have 14-20nm GPUs, rendering ALL current GPUs (Titan X included) mid-range cards at best.

In other words, it doesn't matter at all what current cards support.
 
Isn't it obvious? If your GPU can't run at full speed, it's bottlenecked by the CPU or by the API which sits between them.

Or, far more likely, the driver and its various components. The likes of pCARS are not pushing anywhere near enough geometry to be the problem at hand. Unless you are saying that AMD's DX11 drivers are even worse than we thought.
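A quick way to see why API/driver overhead matters: frame time is roughly the larger of CPU-side submission time and GPU render time, so a fast GPU just sits idle once the driver path saturates a core. A minimal Python sketch with purely illustrative per-call costs (assumed numbers, not measurements):

```python
def fps(draw_calls, cpu_us_per_call, gpu_ms):
    """Approximate frame rate when CPU submission and GPU rendering
    overlap: whichever side is slower sets the frame time."""
    cpu_ms = draw_calls * cpu_us_per_call / 1000.0
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

# Assumed for illustration only: a heavy DX11 driver path at ~40 us
# per draw call vs a thin low-level path at ~5 us, on a GPU that
# needs 10 ms to render the frame.
print(fps(2000, 40, 10))  # CPU-bound: 80 ms of submission -> 12.5 fps
print(fps(2000, 5, 10))   # GPU-bound: 10 ms -> 100.0 fps
```

The same GPU goes from 12.5 to 100 fps purely by cutting per-call CPU cost, which is the whole pitch of Mantle/DX12-style APIs.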
 
You have no idea ^^^


DX12 is gonna be epic. Right now devs can only do about 20% of what they really want to do; they are constantly battling the limitations of DX11.

This is why game maps always look so sparse and barren compared with real life, especially vegetation, with clumps of it randomly dotted around. Massive amounts of thick, lush vegetation like a proper meadow or a proper forest is a dream that will become reality in DX12; one of many examples of how we can expect games to look.
 

Don't forget texture diversity. Every time a different texture/material is used, a state change is required, and each new state change means a new batch. Shaders and dynamic lighting/shadows break batching as well.
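To make the batching point concrete, here is a toy Python sketch (hypothetical data, nothing engine-specific): whenever the material/shader state differs from the previous draw, a new batch starts, so unsorted, texture-diverse scenes explode the batch count.

```python
def count_batches(draws):
    """Count draw batches: a new batch starts whenever the material
    or shader differs from the previous draw (a state change)."""
    batches = 0
    prev = None
    for draw in draws:
        state = (draw["material"], draw["shader"])
        if state != prev:
            batches += 1
            prev = state
    return batches

# Ten draws cycling through five materials in scene order: every
# consecutive pair differs, so every draw becomes its own batch.
varied = [{"material": f"mat{i % 5}", "shader": "lit"} for i in range(10)]
# The same draws sorted by material share state and batch together.
by_material = sorted(varied, key=lambda d: d["material"])

print(count_batches(varied))       # 10 batches
print(count_batches(by_material))  # 5 batches
```

Engines sort draws by state for exactly this reason; lower per-batch overhead in DX12 relaxes how aggressively they have to do it.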
 
I'd be surprised if we see a radical change with any early DX12 game.
Tbh, I'd be surprised if we see any radical improvements in IQ while this console generation exists.

The new Fable is DX12; I can't see it being any kind of graphical flagship.
 
A lot of posters say that DX12 performance on current-gen cards won't matter; by the time there's software, the hardware will have moved on...

I wonder whether this is true for people who keep GPUs for 3+ years and only pay more when they need to. Presumably, with the right/wrong level of DX12 support, a 980 for 1080p gaming could remain relevant for significantly longer, or not. It's not like console performance of this generation hasn't got a very hard ceiling...

I'd be interested to know from, e.g., developers which DX features might be emulated fine and which will be more night-and-day if not provided by the GPU in hardware. Then a more meaningful appreciation of value against feature support could be drawn.
 

Yeah, right now textures are often reused across different geometries; it avoids having to rebatch.
So a wider range of textures would make maps visually more dynamic, and higher res with less tiling.



I'll have a visual comparison and benchmark out the moment CryEngine gets DX12 :D
 
Yep, I'm that biased; I sold up and moved to Nvidia. ;)

Doesn't change the fact that Mantle > current DX.

That doesn't make you less biased, as seen by your screenshot. Our bench thread for DA:I shows a completely different story to your screenshot, and the 970 on DX11 is beating the 290X on Mantle. I don't put much stock in a dodgy-looking screenshot anyway.

As for Humbug's video, he really should have shown the settings used, as there is no way Mantle is going to give double the frames over DX11 (even if DX11 is that bad on AMD). The DX11 run was a stuttering mess, and that isn't something I have seen; having run both Mantle and DX11 myself, I can honestly say there was no noticeable difference for me. I know he is running on a weak CPU, but double the performance is never going to happen, and anyone with common sense knows this.

DX12 will be the same, as seen here:


As you can see, there's a 3-4 fps difference in that video between DX11 and DX12. While that is a nice boost, it isn't double the frames or a magic wand for frame rates, but the latency times are much better, and for people with sharp reactions, in games like BF4, that is a godsend.
 

The benefit will be larger in MMORPG and RTS games.

WoW can be bottlenecked by an i5 at 4.2GHz.
 

The frame rate isn't what you really should be looking at, like I have said in the past whenever benchmarks compare Nvidia DX11 vs Mantle.
60fps on DX11 vs 60fps on DX12: the DX12 side will feel much better because of the lower frame latency.

You also need to put yourself in a situation where DirectX 11 is the bottleneck to see the big gains we are talking about. The BF4 map Siege of Shanghai, around the D flag, is a very good example. Another bottleneck appears when you add in more GPUs; again you see a much better performance increase because the poor CPU threading of DX11 is gone.
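The "same fps, better feel" point is about frame-time consistency, not averages. A small Python sketch with made-up traces: two runs both average roughly 60 fps over a second, but one hides 40 ms stutter frames that an average fps counter never shows.

```python
def avg_fps(frame_times_ms):
    """Average fps over a trace of per-frame times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# Made-up traces: both cover ~1 second over 60 frames, so the average
# fps counter reads ~60 either way, but the spiky trace stutters.
smooth = [16.7] * 60               # steady ~16.7 ms per frame
spiky = [12.0] * 50 + [40.0] * 10  # mostly fast, with 40 ms hitches

print(round(avg_fps(smooth), 1), "fps, worst frame", max(smooth), "ms")
print(round(avg_fps(spiky), 1), "fps, worst frame", max(spiky), "ms")
```

An fps counter hides the 40 ms frames entirely; frame-time graphs or percentile numbers are what reveal the stutter that a lower-overhead API can remove.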
 

All Mantle users are telling you they experience what my video shows; the only one disagreeing with me is the one who doesn't have Mantle: you.

Says it all.
 

You are aware he did have Mantle on his 290X?
He went off his experiences, you off yours. With different setups I would expect different results; both are valid results, imo.
 

Yes, he had an overclocked 12-thread i7 with one 290. I don't know if he tested for a bottleneck on Shanghai, but with that sort of setup he isn't going to find much, though possibly some.

From that he thinks he has the right to call me a liar. Normally people wouldn't do that given that sort of system and a lack of actually testing for it; what's more, I provided the best possible proof of it, a video where you can actually see it happening live.

Given those circumstances, normal people would just say "OK, point taken", but this is Greg; he's like that Iraqi Information Minister...

You have to wonder how it is that he's getting free Nvidia games.
 
I am sure some people don't even bother to read what I type and instead go full rage because I own nVidia as well as AMD. As much as I prefer nVidia, I have always remained honest about both vendors.

Anyway, I agree with Skeeter; this thread is about Maxwell and DX12.
 

I read; I just take it with a pinch of salt. From what I recall, the issues you had with the 290X were all because of the Nvidia monitor you have. That would not apply to the majority.

The G-Sync/FreeSync mess is annoying; it should be one standard, not a monitor that locks you into one camp or the other. Nvidia always try to do that, which annoys me personally. Look at HairWorks: it's essentially TressFX locked down to attempt to make Nvidia cards look better. Does it do anything better than TressFX? From what I can see, no. As much as I like Nvidia cards and driver support, I don't like the direction they try to take PC gaming in. It makes it hard for me to want to buy their products. I never had these issues with them in the past, hence why I have owned and enjoyed using many of their cards.
 