
Fury X Vs Titan X - Performance compared

I shall be sticking to a set path of recording and I certainly shan't be downclocking the TX, as that would be purposefully crippling it and I would then be jumped on (and rightly so). I have been pretty fair in my bench runs: my TX will clock at 1430MHz (stock volts) with ease for benching, but my Fury X does well to cope at 1120MHz (stock volts). I can do a bench with max reachable overclocks if people want to see that, but for now I'm just doing "out of the box" bench runs.


Edit:

Just to add - how I am doing this is apples-to-apples testing. Both have nothing changed, both have the same settings, both have the same resolution and both are recorded with the same hardware. That is what apples to apples is. You want me to do apples to grapes, fs123 :D
 
I don't agree with downclocking lol
but running it without good airflow is a good test
not everyone has a huge case

Linus said he wants to do all his tests in a smaller case now and not on a bench
Fury might do a bit better
 
I shall be sticking to a set path of recording and I certainly shan't be downclocking the TX, as that would be purposefully crippling it and I would then be jumped on (and rightly so). I have been pretty fair in my bench runs: my TX will clock at 1430MHz (stock volts) with ease for benching, but my Fury X does well to cope at 1120MHz (stock volts). I can do a bench with max reachable overclocks if people want to see that, but for now I'm just doing "out of the box" bench runs.

It wouldn't be crippling it; essentially what you are doing is comparing a stock Fury X to a factory-overclocked Titan X.

I don't think you should downclock it, but you could overlay your videos with the GPU name; it would be very easy to put "Factory OC" and "Stock" underneath.

I think that would be fair.
 
Since you are redoing the vids, why not use the dynamic scaling feature on both cards and run at 4K to see the performance difference.

To compare architectural performance, you could also do a test where both cards run at, for example, 1100MHz.

I don't see the point of him spending his time doing that. The fair thing to do is run both cards at stock reference settings. Then later he can do one with max stable overclocks, once voltage is unlocked on the Fury X.
 
Indeed, out of the box is the way to go. Same as the review sites do.

I agree with this: if you buy them at a certain speed, then test them at that speed, whether they are factory overclocked or not, as that is how we buy them.

But you need to say what speeds they are.
 
I guess no one wants to think out of the box and compare architectural performance at identical clocks (it must be hard work moving the clockspeed slider a bit in MSI AB).

Review sites do standard comparisons because that's their job. Users could and should be doing tests that are out of the norm, ones that stand out.
That's what makes it interesting.

edit: There is no need to make a big deal out of my request/suggestion. It was just an idea.
 
Do apples to cider and grapes to wine. If it's remotely drinkable, I'll buy it. :D

Me and you both Rob :D

I have everything stated at the start of the videos, I have the MSI AB overlay running showing what is happening, and I don't really see what else I can do or say in my vids, in truth. People are saying do this and do that, but I bought the cards as they were and used the cards as they were... Anything else would bring questions into the results, and it looks like some want me to cripple the TX to make it fairer on the Fury X and sorry, but that isn't happening.
 
Well, I am glad I got to the bottom of it and will redo the vids, as they are not up to a good standard. I am glad people mentioned the colours; I wasn't happy with them either and it all looked wrong, so I was quite happy for that to be brought up. Not sure on the conspiracy side, but it did make me chuckle :D

Andy - you are quite correct, and I have said several times that this card gets better as the res gets higher. Even with games like SoM, where I was certain that VRAM would be an issue at 1440p or 200% scaling, it has chewed through it at very good frame rates. I would love to do some higher-res recordings, but the capture card I want is over £400 and I don't think the Mrs would be too happy with me spending that much :D

Anyways, I am now redoing the vids, so for those that are interested, keep an eye out. And thanks to all for the kind words. I am learning as I go and this is something new for me, so I don't beat myself up when I do make mistakes (you lot do it for me :p), but all settings are correct and have been since I did the first BF4 vid, and now all the colours are correct too, so everything should be tickity boo from here on out :cool:

(hope I haven't just jinxed myself :D)

It won't end there either dude. You will need savage hard drive speed to be able to capture the video. PCPER use a 6 drive SSD array IIRC just for 1080p.

And that's another one of your issues with doing it on video. The game will slow down if the hard drive can't keep up.
 
It won't end there either dude. You will need savage hard drive speed to be able to capture the video. PCPER use a 6 drive SSD array IIRC just for 1080p.

And that's another one of your issues with doing it on video. The game will slow down if the hard drive can't keep up.

I use an EVO 850 250GB to record the games and it fills up in minutes. I then have to move them across to a 2TB HDD (which takes ages) and when encoding, it is a nightmare, as the files are so damned big it takes Vegas forever just to read them....AND quite a lot of times it then crashes lol. People don't realise the chore to get this done lol :D
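For anyone wondering why the drive fills so quickly, here is a back-of-the-envelope sketch (my own assumed numbers for raw 1080p60 capture, not the poster's exact settings or codec):

```python
# Rough capture-storage arithmetic: uncompressed 1080p60 at 24 bits per
# pixel really does fill a 250 GB SSD in minutes. Numbers are assumptions.

def capture_bitrate_bps(width: int, height: int, fps: int, bits_per_px: int = 24) -> float:
    """Raw (uncompressed) video bitrate in bits per second."""
    return width * height * fps * bits_per_px

def minutes_until_full(drive_gb: float, bitrate_bps: float) -> float:
    """Minutes of footage a drive holds (decimal GB, as drives are sold)."""
    return drive_gb * 1e9 * 8 / bitrate_bps / 60

rate = capture_bitrate_bps(1920, 1080, 60)
print(round(rate / 1e9, 2))                  # ~2.99 Gbit/s raw
print(round(minutes_until_full(250, rate)))  # ~11 minutes on a 250 GB SSD
```

Real capture cards compress, so actual figures sit somewhere between this worst case and a few hundred Mbit/s, but the order of magnitude explains the constant shuffling to the 2TB HDD.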
 
If you read what I wrote: architectural performance. An apples-to-apples test. That would give a true idea of how each architecture performs and how much overclocking would be needed for the Fury X to match a Titan X, if and when voltage is unlocked.

BS.

Clock speed is part of the architectural design, so you absolutely can't start changing clock speeds. You can design a chip to have a high clock speed and do less work per clock, or you can design a chip with low clock speeds that does more work per clock, e.g. RISC vs CISC CPUs.



If you want to make some kind of equalising change, then probably the only thing that would make sense is adjusting clocks so that the power draw of the cards is the same. This way you get a performance-per-watt metric at normalised power levels. The best way to do this would be to overclock the Titan a little, since the Fury X uses about 40W more at load.
Both AMD's and Nvidia's ultimate aim is to get excellent performance per watt, so this can make some sense, but ultimately it has almost no effect on the user.
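To illustrate the metric being proposed, here is a minimal sketch. All figures below are hypothetical placeholders, not measured results from either card:

```python
# Performance per watt at equalized board power: the idea is you only
# compare fps/W once both cards draw the same power. Numbers are made up.

def perf_per_watt(avg_fps: float, board_watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_watts

# Hypothetical example: two cards tuned to the same ~290 W draw.
print(round(perf_per_watt(42.0, 290.0), 4))  # card A: 0.1448 fps/W
print(round(perf_per_watt(44.0, 290.0), 4))  # card B: 0.1517 fps/W
```

The point of pinning power first is that the fps/W numbers then differ only because of architecture, not because one card was simply allowed to burn more watts.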

One might think that temperature could be used to normalise, but that is also an architectural design decision. The Fury X has been designed to run cool in order to reduce electron leakage; its power consumption would explode if it were left to run hot. The 980 Ti was designed to run at its operating temperature: electron leakage and power consumption at those temperatures were part of the design. Temperatures are also easily changed by the cooler.



At the end of the day, all cards should be compared stock. One could also compare cards at max likely overclocks, or factory-overclocked cards, but the Fury X doesn't have these yet, so there is no fair comparison.
 
It is funny that a lot of the prospective AMD buyers were disappointed with the performance and the reviews, yet I think it actually does plenty well enough at 4K to force some good competition between Nvidia and AMD.

The Fury X is a pixel-shading monster, and that really helps it gain ground as the resolution increases, because other bottlenecks such as tessellation and geometry become less important. The design just has some efficiency flaws at 1080p: partly drivers perhaps, but likely the pixel shaders simply don't get worked enough and the HBM bandwidth is not really needed.

They got themselves all hyped up for a fanboy slaughterfest, only to find out it merely "matches", and in places both wins and loses.

I personally think AMD nailed it. I don't want one, as I don't game at 4K yet, but it looks great, it's cool, it's quiet and it performs well in the extreme area, which is where 980 Ti users should be using their hardware.

Many reviewers slated it, for unknown reasons, for not having 8GB, which still boggles me as you can clearly see it handles 4K just fine...

Anyway as long as those who bought them are happy :)
 
I use an EVO 850 250GB to record the games and it fills up in minutes. I then have to move them across to a 2TB HDD (which takes ages) and when encoding, it is a nightmare, as the files are so damned big it takes Vegas forever just to read them....AND quite a lot of times it then crashes lol. People don't realise the chore to get this done lol :D

I understand, as I attempted such a 1080p recording once and promptly gave up. That is why I think you should really just record frame rates and use screenshots to compare images. If there is some aspect that one of the cards seems to suffer from, like stutter, then a video would make sense to show it; otherwise it is a lot of work and a huge PITA, and personally I find it very hard to watch side-by-side comparisons and get any clear idea of the differences. Small changes in synchronisation could cause large differences in FPS at any particular moment.
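The "just record frame rates" route can be sketched in a few lines. Assume a frametime log of milliseconds per frame, one value per line, of the sort benchmarking overlays can export (the exact file format here is an assumption, not any specific tool's output):

```python
# Summarise a list of frametimes (ms per frame) into the two numbers most
# comparisons need: average FPS and "1% low" FPS (the worst 1% of frames).

def summarise(frametimes_ms: list[float]) -> dict[str, float]:
    """Return average FPS and 1%-low FPS from per-frame times in ms."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # Take the slowest 1% of frames (at least one frame) for the lows.
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]
    one_pct_low_fps = 1000.0 * len(worst) / sum(worst)
    return {"avg_fps": round(avg_fps, 1), "1%_low_fps": round(one_pct_low_fps, 1)}

# 98 smooth frames at ~60fps plus two stutters: average looks fine,
# the 1% low exposes the hitching that a video would have to show.
print(summarise([16.7] * 98 + [33.3, 40.0]))  # {'avg_fps': 58.5, '1%_low_fps': 25.0}
```

Two numbers per card and a handful of screenshots convey more than hours of encoding, and the 1% low captures exactly the stutter case mentioned above.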
 