
*** AMD "Zen" thread (inc AM4/APU discussion) ***

True regarding the 1000 series.

We ran some benchmarks with Kaapstad 6-7 months ago in TW: Warhammer and posted them here.
The Nano was losing 1FPS between DX11 and DX12.

My GTX1080 @2190 was losing 20% of its performance, and the same applied to Kaap's TXP.

So it would stand to reason that Nvidia are holding DX12 back as much as they can because they are not ready, neither hardware-wise nor software-wise. However, once they are ready with both a new GPU and drivers, I bet we will suddenly start seeing lots of good-performing DX12 games.
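
The FPS deltas being compared here are just percentage-change arithmetic. A minimal sketch in Python - the numbers below are illustrative, not the actual benchmark figures:

```python
def pct_change(before_fps: float, after_fps: float) -> float:
    """Percentage performance change between two runs (negative = slower)."""
    return (after_fps - before_fps) / before_fps * 100.0

# Illustrative numbers only: a card losing 20% of its DX11 performance
# under DX12, versus one losing ~1FPS out of 60.
print(round(pct_change(100.0, 80.0)))    # -20
print(round(pct_change(60.0, 59.0), 1))  # -1.7
```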

My issue though with going AMD GPU again is the loss of GameWorks. For example The Witcher 3 - such a great game - was GameWorks. I know it's probably irrational, but it would bug me to have to turn off stuff in the game just because I had an AMD card.

But then on the flip side, neutral games with DX12 and Vulkan will probably work better on an AMD card than on an Nvidia one.

---

AMD have always had companies they use almost as tech demos... for example, Ashes of the Singularity is almost like a studio which is an extension of AMD, showcasing features of their tech stack.

Same with Oxide Games.

They did it back in the day with Mantle and DICE.
 
For the latest games, yeah.

I'd like to see how the dual Fury card and an overclocked 1800 with 3400MHz memory would do in non-canned benchmarks.
 
So it would stand to reason that Nvidia are holding DX12 back as much as they can because they are not ready, neither hardware-wise nor software-wise. However, once they are ready with both a new GPU and drivers, I bet we will suddenly start seeing lots of good-performing DX12 games.

By the time Nvidia gets its act together, AMD will be 4 or 5 generations ahead and will probably be pumping cash from Ryzen into graphics. It might be a question of how many people will still be using Nvidia by then?
 
My issue though with going AMD GPU again is the loss of GameWorks. For example The Witcher 3 - such a great game - was GameWorks. I know it's probably irrational, but it would bug me to have to turn off stuff in the game just because I had an AMD card.

GameWorks is now going open source, so you won't be missing much at all. It'll stop being a black box, and like AMD's GPUOpen, developers can work with the source code.

http://www.game-debate.com/news/224...en-source-can-now-be-optimised-for-amd-radeon

Rise of the Tomb Raider with PureHair is a prime example: better hair physics and detail than HairWorks, and a total loss of 4 FPS in performance.

Imagine how good HairWorks would have been in The Witcher 3 if it wasn't an ad hoc solution, where the developer couldn't fine-tune or customise it.
 
Really starting to like his channel; he goes into much more detail than other so-called big channels.


480 CF @ 1.3GHz beating an overclocked TXP by a big margin, and consequently the GTX1080Ti, on a GameWorks game in DX12?
Costing half the money of the 1080Ti and a third of the money of the TXP?
That made my evening :)
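
The value argument here is just performance per unit of currency. The prices and frame rates below are hypothetical placeholders, purely to illustrate the comparison:

```python
def fps_per_currency(avg_fps: float, price: float) -> float:
    """Frames per second per unit of currency - a crude value metric."""
    return avg_fps / price

# Hypothetical figures to illustrate the "half the money" point:
# two cheaper cards in CrossFire versus one flagship at ~2x the cost.
cf_pair  = fps_per_currency(70.0, 460.0)
flagship = fps_per_currency(75.0, 920.0)
print(cf_pair > flagship)  # True: similar FPS at half the price wins on value
```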
 
480 CF @ 1.3GHz beating an overclocked TXP by a big margin, and consequently the GTX1080Ti, on a GameWorks game in DX12?
Costing half the money of the 1080Ti and a third of the money of the TXP?
That made my evening :)

I did a similar run a few weeks ago with a GTX1080 using the new DX12 performance driver.

So I was running a GTX1080 FE on an IB Xeon E3 1230 V2/Core i7 3770.

That is the same Geothermal Valley sequence he tested, except I am running a much older CPU than most reviews use.

[benchmark screenshots]

Average was 51.32FPS or thereabouts.
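
For what it's worth, an average FPS figure like that is frame count over total elapsed time. If a capture tool logs per-frame times in milliseconds, the computation looks like this (the frame times below are made up):

```python
def average_fps(frametimes_ms: list[float]) -> float:
    """Average FPS over a run: frame count divided by total elapsed time.

    Note this is NOT the mean of per-frame instantaneous FPS values,
    which would over-weight the fast frames.
    """
    total_seconds = sum(frametimes_ms) / 1000.0
    return len(frametimes_ms) / total_seconds

# Made-up frame times: mostly ~19.5 ms with a couple of slow frames.
times = [19.5] * 98 + [40.0, 40.0]
print(round(average_fps(times), 2))  # 50.23
```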

The biggest CPU bottleneck with my old CPU was the part where you walk into the area with loads of animals.

Moved to DX12 - it was blooming weird.

[benchmark screenshot]

I ran it loads of times, and you would see runs look very decent and then it would just drop out.

If you dropped the resolution, it seemed not to manifest as badly.

[benchmark screenshots]

The second area was less CPU-intensive, so it wasn't so variable.

Now look at the RX470 4GB (bottom set) and GTX1080 (top set).

[benchmark screenshot]

Some variability, but not as bad.

So if XFire scales properly in the game, even a pair of RX470 8GB cards might be decent enough at QHD.
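
Whether CrossFire "scales properly" is the whole question. A rough estimator for multi-GPU throughput, where the scaling-efficiency figures are assumptions rather than measured CrossFire numbers:

```python
def multi_gpu_fps(single_fps: float, n_gpus: int, efficiency: float) -> float:
    """Estimate multi-GPU FPS: each extra GPU adds `efficiency` of a full card.

    efficiency=1.0 would be perfect scaling; real AFR scaling is usually
    well below that, and some titles don't scale at all.
    """
    return single_fps * (1 + (n_gpus - 1) * efficiency)

# Assumed numbers: a single RX 470 at 35 FPS, with 80% vs 50% scaling.
print(round(multi_gpu_fps(35.0, 2, 0.8), 1))  # 63.0
print(round(multi_gpu_fps(35.0, 2, 0.5), 1))  # 52.5
```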
 
GameWorks is now going open source, so you won't be missing much at all. It'll stop being a black box, and like AMD's GPUOpen, developers can work with the source code.

http://www.game-debate.com/news/224...en-source-can-now-be-optimised-for-amd-radeon

Rise of the Tomb Raider with PureHair is a prime example: better hair physics and detail than HairWorks, and a total loss of 4 FPS in performance.

Imagine how good HairWorks would have been in The Witcher 3 if it wasn't an ad hoc solution, where the developer couldn't fine-tune or customise it.

The Witcher 3 is fine on Polaris though - I tried it with the same ancient CPU two weeks ago.


[benchmark screenshots]

I was getting around 45FPS with an RX470 4GB at those settings, with tessellation and Hairworks both at max (the GTX1080 gets 80-ish FPS). An RX480 8GB should be somewhat faster due to the 15% extra shaders, more VRAM and higher-speed RAM.

Remember, this is not with a very new CPU either.

The Polaris cards definitely seem to handle tessellation and Gameworks effects better than the older ones.
 
Nice video. Vega + Ryzen will be interesting.

Wondering if I should try to sell my 1070, but I don't know how much Vega is expected to cost.
 
CAT: looks like something was throttling on those benches.

It didn't happen under DX11 at all. Under DX12 it wasn't throttling - I actually monitored what was happening. So if I started from cold, it could start with the weirdness from the get-go. Switch the system off for a few hours and it would be OK initially, then get a tad weird. Tried a different PSU - no change.

Basically, at one point it could randomly just stutter and not recover, which was worst in the part where the CPU was being kiboshed the most. There was actually no pattern to it.

Deus Ex: MD is even harder on the card than ROTTR, and it didn't show the same behaviour even during a big boss battle.

PS2 is even heavier on the CPU and I never saw anything like that.

It was totally random. The CPU was fine, GPU clockspeeds seemed fine, but if you dropped the resolution it happened less.

Now, the AMD card didn't show this, even though it was not consuming much less power.

What I did notice, though, was that VRAM usage was much higher under DX12 - under DX11 it was well under 8GB, but under DX12 it seemed pegged at 8GB, so I think it literally ran out of VRAM. The RX470 seemed pegged at just under 4GB.

When I dropped the texture setting to High, the DX12 issue disappeared.

Yet in the second area, Very High was perfectly fine. There is something very weird going on with the Nvidia drivers under DX12 in that part, and AMD seems to be relatively better.

It's why I can believe the RX480 setup doing quite well on a Ryzen system under DX12 in that part of the game.
 
Oopsies! Nice graph they have there lol.
[graph screenshot]



https://www.reddit.com/r/Amd/comments/62o6ug/kill_me_please_from_totallysilencedtech/

Confused as to the FPS though - surely a 1080 would get higher than that?
My RX480 flits between 110-160.
 
You should read the Reddit thread - apparently the chap has deleted the video and his Twitter account. Also, lol at the chap making a 2FPS difference look 10 times bigger.
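
Making a small gap "look 10 times bigger" is exactly what a truncated y-axis does: the drawn bar-height ratio depends on where the axis starts, not just on the data. A quick illustration with made-up FPS values:

```python
def visual_ratio(a: float, b: float, axis_start: float) -> float:
    """Ratio of bar heights as drawn when the y-axis starts at axis_start."""
    return (a - axis_start) / (b - axis_start)

# Made-up numbers: a 2 FPS gap (62 vs 60).
print(round(visual_ratio(62, 60, 0.0), 3))   # 1.033 - honest axis, tiny gap
print(round(visual_ratio(62, 60, 59.8), 1))  # 11.0 - truncated axis, huge gap
```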
 
Ryzen 5 1400 vs i5 7400 vs G4560 vs i3 6100

Disappointing, performs like an i3.


Now here is an obvious Intel shill: the same GPU usage on the same part of the game, and yet the performance is different. That can only happen if the graphics settings are different - the i5 is running on lower graphics settings.

How much work do you put into finding all these nonsense Ryzen reviews, Raven? You do seem to have a constant supply of them.

Oh, and I watched it. You have to be quick and not blink to catch the rare occasions where the i3 matches it; 90% of the time the 4-core Ryzen is way faster, predictably.

[benchmark screenshot]
 
Now here is an obvious Intel shill: the same GPU usage on the same part of the game, and yet the performance is different. That can only happen if the graphics settings are different - the i5 is running on lower graphics settings.

How much work do you put into finding all these nonsense Ryzen reviews, Raven? You do seem to have a constant supply of them.

Oh, and I watched it. You have to be quick and not blink to catch the rare occasions where the i3 matches it; 90% of the time the 4-core Ryzen is way faster, predictably.

[benchmark screenshot]


Go away. He is just a guy benching on YouTube. If you had ever watched any of his vids - and I have, I've been subbed for months - you'd know. Typical AMD fanboy response: if the results don't look good then he must be an Intel shill lol.
 