Haha probably, but then again, it would switch itself off or set itself on fire if it was overheating and faulty.
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
What's the deal with HDMI 2.0 anyway... I hear it's considered "legacy" already and uptake seems to have been very slow? From what I can see the new 4K Blu-rays coming later this year will have HDR, and you need the new HDMI 2.0a to get that - is that why?
You're missing the point, which was to show the CrossFire setup scaling very well in the review I showed.
Also, the review you showed didn't even say what the Fury X was overclocked to! And by your logic, OC to OC is the only fair comparison. In that case a 15% OC on the Fury X is the fairest comparison, which means the Fury needs to be running @ 1207MHz, which I bet my back teeth it wasn't.
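(Quick sanity check on that 15% figure - a rough sketch assuming the Fury X's stock boost clock is 1050MHz, which is where the ~1207 number comes from:)

```python
# Rough check of the 15% OC figure, assuming a Fury X stock boost clock of 1050 MHz
stock_clock_mhz = 1050
oc_fraction = 0.15
print(stock_clock_mhz * (1 + oc_fraction))  # 1207.5 MHz, i.e. roughly the 1207 MHz quoted above
```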
Oh the irony.
Or you've got an eco mode that's broken.
Yes, because everyone is out there running 1200+ OCs on their Fury Xs, aren't they?
He should get a power meter for the socket the PC plugs into (don't include monitors etc). I had one, and it showed my L3600 was only pulling a max of around 50W when running at full pelt.
I dare say no one is because we don't have the tools to yet. I can manage just over 1150 on stock volts so I'm pretty keen on seeing what I can get once the new Afterburner releases.
But he's right - you think comparing stock to stock is not fair (I don't understand that myself) but think highly OC'd vs stock is a fair comparison?
I don't think the adapter is a "fix" so much as catering to people who use TVs instead of monitors for their PC. I had to buy a DP to DVI adapter to connect my 2nd monitor - is that AMD's fault too?
it IS stock for stock - they bought that card and ran it at the default settings
it isn't Nvidia's fault that AMD aren't allowing aftermarket Fury Xs
You know it's your fault for buying an HDMI 2.0 TV, right?
Hardly the same thing. Why not simply include HDMI 2.0 support in the first place? No one I have asked has an answer, including AMD reps, probably because the answer would be pure incompetence by AMD.
Most 4K TVs support 4K at 60Hz over HDMI 2.0, but the numpties at AMD thought it better to go with the 2009 HDMI 1.4 standard that only supports 30Hz at 4K (rough numbers below show why). Whichever way you look at it, that's a really stupid decision for a manufacturer releasing a £500 graphics card in 2015.
Then to come out and say "Oh an adapter will be out soon that will fix it" confirms they know they cocked up.
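(Ballpark figures behind that 30Hz limit - a sketch assuming the usual CTA total timings for 3840x2160 and the commonly quoted ~340MHz / ~600MHz pixel clock ceilings for HDMI 1.4 and 2.0:)

```python
# Sketch of why HDMI 1.4 tops out at 4K30: assumed CTA-861 total timings for
# 3840x2160 (4400 x 2250 pixels including blanking) vs the usual pixel clock limits.
h_total, v_total = 4400, 2250      # active 3840x2160 plus blanking intervals
hdmi_14_limit_mhz = 340            # approximate HDMI 1.4 TMDS/pixel clock ceiling
hdmi_20_limit_mhz = 600            # approximate HDMI 2.0 ceiling

for refresh_hz in (30, 60):
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    fits_14 = pixel_clock_mhz <= hdmi_14_limit_mhz
    print(f"4K@{refresh_hz}Hz needs ~{pixel_clock_mhz:.0f} MHz "
          f"({'fine on HDMI 1.4' if fits_14 else 'needs HDMI 2.0'})")
```

4K30 works out to roughly 297MHz, which fits under the 1.4 limit; 4K60 needs roughly 594MHz, which only HDMI 2.0 can carry.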
AMD have full ungimped HDMI 2.0 with this adapter.
Nvidia's HDMI 2.0 sampling is gimped. Get your facts straight.
how is it gimped?
Really? An OC by default does not mean it's at stock clocks.
Then the G1 is a stock 980 Ti and blitzes the Fury X. Why don't you just use that for your argument?
I didn't realise many people gamed on 4K TVs? I know I would MUCH MUCH prefer an actual gaming monitor vs a TV!
If I had to guess, I would say that AMD want people to stop using HDMI so they can stop paying the HDMI tax. It's certainly something I would like to see happen.