Sorry, it was howling that said that
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
It started because Lokken said that a FuryX was faster than a 980ti at 4K and that they scaled better, meaning that FuryX Xfire was consistently faster than 980ti SLI... he didn't say "only using reference cards" or "only using cherry-picked AMD-supporting games".
OC FuryX cards are not going to be available, so comparing a non-existent product to ones that actually exist is going to be difficult. That review was "apples for apples" because it compared two real products that you can buy. The 980ti's in that review were not manually overclocked, so saying "it's not fair because you can overclock the FuryX" is disingenuous, because you can also overclock the 980ti's that were used.
Yes, I would love to find out how far the FuryX will overclock with voltage, which is why I've ordered one; I would also love to know how a 1535 980ti will compare to whatever the FuryX will manually overclock to. However, possible future outcomes aren't directly relevant to an absolute statement like "FuryX is faster than 980ti at 4K" or "980ti SLI cannot beat FuryX crossfire", both of which are objectively not true in absolute terms.
Don't quote me then.
Actually, I think I'm the one that said the FuryX was better at 4K and had better scaling. The second statement is true in general; I'll concede it is game-dependent, but the overall trend supports it. The first statement is true when comparing stock for stock, and could be true or false comparing max OC to max OC, but that comparison is not available to make and I'm not one to simply guess. I don't see OC vs non-OC as a good means of comparison, but, like I've said, you can make an argument for it.
So yeah, neither statement is true in the absolute, as there are always outliers - yet as general statements they are both undeniably true.
They're off again!!
Some of you guys realllllllllllllly need to get laid
Still waiting for your answer as to why HDMI 2 is gimped on my 980ti
And yes, a bold and cunning decision to make the Fury range HDMI 1.4, supporting only 30Hz at 4K. Total genius!!! How Nvidia must envy such marketing strategies by AMD.
True, or play some games.
What planet are you from?
Really does amaze me how everyone and their granny suddenly had this need of hdmi 2.0 when fury didn't support it. Funny how that works. Must have been a mass run of sales on hdmi 2.0 monitors or something i missed.....
Same thing every time AMD release a new card. The trolls pore over the specs trying to find something they can attack (no matter how insignificant) and then hammer it endlessly.
They go on nvidia forums when they want to complain about 3.5GB, gimped chroma, HDCP, drivers etc etc so it looks like NV is squeaky clean here.
Not to mention the rich ones who buy the enemy's card just so they have a defence against being called a troll.
As mentioned before, using a 4K TV with a PC has roughly the same uptake as using a 4K monitor, as a good few people prefer a larger physical screen size to make the most of that resolution. As things stand, to get 60Hz at 4K, 99% of those TVs can only achieve it over HDMI 2.0 - the selection with DisplayPort is extremely limited, though starting to grow a little now.
For 4K, people would be looking at top-of-the-line GPUs, so HDMI 2.0 support would be something they'd expect to find.
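For what it's worth, the bandwidth maths bears this out. Here's a rough back-of-the-envelope sketch in Python (assuming standard CTA-861 pixel clocks and 8b/10b TMDS encoding; the figures are approximate) showing why HDMI 1.4 tops out at 30Hz for 4K while HDMI 2.0 manages 60Hz:

```python
# Rough check: effective video bandwidth per HDMI version vs what 4K needs.
# Assumes 24-bit colour, 3 TMDS lanes, 8b/10b encoding (8 data bits per
# 10-bit symbol), and standard CTA-861 pixel clocks - all approximate.

# Effective (post-8b/10b) video bandwidth in Gbit/s.
HDMI_1_4_GBPS = 340 * 3 * 8 / 1000  # 340 MHz TMDS clock -> 8.16 Gbit/s
HDMI_2_0_GBPS = 600 * 3 * 8 / 1000  # 600 MHz TMDS clock -> 14.4 Gbit/s

def video_data_rate_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Video data rate in Gbit/s for a given pixel clock (includes blanking)."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# CTA-861 pixel clocks: 4K@30Hz uses 297 MHz, 4K@60Hz uses 594 MHz.
for mode, clock_mhz in [("4K@30Hz", 297), ("4K@60Hz", 594)]:
    need = video_data_rate_gbps(clock_mhz)
    fits_14 = "yes" if need <= HDMI_1_4_GBPS else "no"
    fits_20 = "yes" if need <= HDMI_2_0_GBPS else "no"
    print(f"{mode}: needs {need:.2f} Gbit/s | HDMI 1.4: {fits_14} | HDMI 2.0: {fits_20}")
```

4K@30Hz needs about 7.13 Gbit/s, which fits in HDMI 1.4's ~8.16 Gbit/s; 4K@60Hz needs about 14.26 Gbit/s, which only HDMI 2.0's ~14.4 Gbit/s can carry.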