Fable Legends: AMD and Nvidia go head-to-head in latest DirectX 12 benchmark

The other reviews seem to corroborate PCPer's results; ExtremeTech are the odd one out right now, which, looking at Joel's history of articles, raises some questions.

Yeah just seen the others, Nvidia ahead then. Maybe Shrout isn't as biased as I have been led to believe :P

Looking forward to seeing DX12 in a game I actually want to play :D
 
A good thread, although I have to agree with Stanners - you knew what you were doing, Shankly, and unsurprisingly AMDMatt and Tommybhoy are all singing off the same sheet.

Anyway, it's good to see AMD doing well in this one, as they're not doing so well in AOTS, but it would be good to see what's what from users' own runs. Hopefully we can get hold of the benchmark soon and see whether the results hold up.

Whatever floats your boat.
 
Very good results all-round. No need to be at each others throats. :p

Do us consumers ever get access to this benchmark?

You can sign up for the closed beta now, and the game is going to be free to play anyway (don't know what the charging structure is going to be like yet - no mention of whether it's a paywall, cosmetics or a mix).
 
AMD always rush a beta driver out so that they can get the jump in early benchmarks, it's a shame they are not so quick to fix bugs which can ruin the actual gaming experience on their cards.

I bet they are not even doing full tessellation if the game supports it either, now that their cheats are widely accepted they can just lower tessellation factors to the lowest point with every game.


One thread, AMD drivers are so slow; the next, they rush out a beta driver to get in early with benchmarks but apparently don't fix bugs.

Let's go back to Witcher 3, shall we... AMD: I ran the game perfectly fine on a WHQL driver from about two months before Witcher 3 came out - no bugs, no need to rush anything out.

Nvidia, however, rushed out half a dozen beta drivers for an Nvidia game and all of them were buggy as hell.

Then we have your tessellation BS: if AMD don't over-tessellate to a degree YOU PHYSICALLY CAN'T SEE and which brings no improvement in IQ, they are offering 'lower' tessellation and it's somehow bad. How you haven't been banned yet I don't know. Your argument changes by the thread, is always a lie and is always trolling.
 
One thread, AMD drivers are so slow; the next, they rush out a beta driver to get in early with benchmarks but apparently don't fix bugs.

Let's go back to Witcher 3, shall we... AMD: I ran the game perfectly fine on a WHQL driver from about two months before Witcher 3 came out - no bugs, no need to rush anything out.

Nvidia, however, rushed out half a dozen beta drivers for an Nvidia game and all of them were buggy as hell.

Were they? :confused: Must have missed that when I played through Witcher 3.
 
AMD always rush a beta driver out so that they can get the jump in early benchmarks, it's a shame they are not so quick to fix bugs which can ruin the actual gaming experience on their cards.

This is how I see it and to be fair Nvidia do the exact same. It's all about getting one over on each other, releasing a beta driver to get a few higher numbers prior to launch (makes no actual difference in game). It's PR/Advertising from both companies to try and hook in potential customers who're deciding on that GPU purchase, for that one new game coming out, in this case - Fable Legends.

Both companies are as bad as each other :)
 
It's strange to read about AMD 390 crashes. I have my 390 on W10 and not a single hiccup so far. Also, that's the first I've heard of any problem - never seen it mentioned before.
 
Wouldn't get too excited - GPUs designed with DX12 in mind will crush anything out now for DX12 performance (I mean utterly annihilate it) - and devs, once up to speed on DX12, won't leave that untapped, whether through laziness (using the extra performance potential as a buffer against sloppy coding) or actual feature usage.

There is a reason I've not bought into Maxwell (and am holding out for next gen), and it certainly isn't the financial side (EDIT: well, I guess it is a bit, in that I'm not just throwing the money around).

Yeah, but the problem is that the GTX970 and GTX980 came out a year after the R9 290 series, so I don't understand why the R9 290/390 series seem to be besting them in the two or three DX12 tests so far.

At least the GTX980TI seems to be doing well, so is there something lacking hardware-wise in the GTX970 and GTX980 that the GTX980TI has?
 
Yeah, but the problem is that the GTX970 and GTX980 came out a year after the R9 290 series, so I don't understand why the R9 290/390 series seem to be besting them in the two or three DX12 tests so far.

At least the GTX980TI seems to be doing well, so is there something lacking hardware-wise in the GTX970 and GTX980 that the GTX980TI has?

Because the 290 series was a high-end card and the 980 a mid-range card; it's hardly surprising that when settings get turned up the high-end card pulls ahead a little.
 
One thread, AMD drivers are so slow; the next, they rush out a beta driver to get in early with benchmarks but apparently don't fix bugs.

Let's go back to Witcher 3, shall we... AMD: I ran the game perfectly fine on a WHQL driver from about two months before Witcher 3 came out - no bugs, no need to rush anything out.

Nvidia, however, rushed out half a dozen beta drivers for an Nvidia game and all of them were buggy as hell.

Then we have your tessellation BS: if AMD don't over-tessellate to a degree YOU PHYSICALLY CAN'T SEE and which brings no improvement in IQ, they are offering 'lower' tessellation and it's somehow bad. How you haven't been banned yet I don't know. Your argument changes by the thread, is always a lie and is always trolling.

Not sure where you got your info, but I was playing The Witcher 3 heavily from launch and it ran perfectly well on my GTX 970, my Titan X and my Fury X. Great game and no problems on any driver/platform.

As for the rest, I have no interest in calling for bans and wish others would stop it. Very childish behaviour.
 
Because the 290 series was a high-end card and the 980 a mid-range card; it's hardly surprising that when settings get turned up the high-end card pulls ahead a little.

The thing is that both GPUs are within 10% of each other die-area-wise; the GM204 does have less memory bandwidth, but uses better memory compression, which Hawaii lacks, and also has better tessellation performance.

It will be quite interesting to see how the Kepler Titan Black and GTX780TI do in this too!
 
Yeah, but the problem is that the GTX970 and GTX980 came out a year after the R9 290 series, so I don't understand why the R9 290/390 series seem to be besting them in the two or three DX12 tests so far.

At least the GTX980TI seems to be doing well, so is there something lacking hardware-wise in the GTX970 and GTX980 that the GTX980TI has?

We've already seen with AOTS that game patches are making big changes to frame rate (e.g. 38 > 45fps going from 0.50 to 0.51), and add in that some people have shown async compute does work on Maxwell cards, and it looks like a game/driver problem in both titles, which are still pre-beta.

Similarly, with the 290s being so close to a Fury X, there could be more changes to come in that comparison.

Basically, it's too early to call anything definitive for either side.
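For a sense of scale, that AOTS example (38 fps on 0.50, 45 fps on 0.51) works out to roughly an 18% swing from a single patch - bigger than the gap between many of the cards being argued over. A quick sketch of the arithmetic (the helper name `pct_uplift` is just for illustration):

```python
def pct_uplift(before_fps: float, after_fps: float) -> float:
    """Percentage frame-rate change between two benchmark runs."""
    return (after_fps - before_fps) / before_fps * 100.0

# AOTS figures quoted above: patch 0.50 -> 0.51
print(f"{pct_uplift(38, 45):.1f}% uplift")  # ~18.4%
```

When one patch moves the numbers by that much, small leads between cards in a pre-beta build don't mean a lot.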
 
[image: 3GBFi7F.png]
 
Lol, just to clarify: that's not real - it's a spoof of what he said ages ago about APU performance, when he misread the charts in the article. Entirely my own effort :D.

The fact you might think it is though is a testament to the man :p
 