
AMD Radeon Fury X & Vulkan?

And the only game that really needs a DX12 patch still doesn't have it... Planetside 2.

I would love for Blizzard to implement DX12 or Vulkan for World of Warcraft.

Their OS X team has gotten Apple's Metal API working in the beta, but there's nothing for Windows, and poor Linux is still left to use wrappers.
 

It would be especially good for WoW, since MMOs are so CPU-heavy.

Even with an overclocked 6700K and a 1080 you'd see severe frame drops in large raids or town/battleground PvP (basically any time there are lots of players and/or enemies with effects going on).
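To see why the low-level APIs help with exactly that, here's a minimal sketch in C++ of the draw-submission bottleneck. The per-draw costs, draw counts, and thread counts are invented numbers for illustration, not measurements of any real driver or engine; the point is only that older APIs funnel all submission through one thread, while Vulkan/DX12 let an engine record work on several threads in parallel.

```cpp
// Illustrative sketch: why CPU-bound draw submission hurts crowded MMO scenes.
// All costs below are made-up numbers, not measurements of any real driver.
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Simulate the CPU-side cost of submitting one draw call by spinning
// for a fixed duration (a stand-in for driver/state-validation overhead).
static void submit_draw(std::chrono::nanoseconds cost) {
    auto end = std::chrono::steady_clock::now() + cost;
    while (std::chrono::steady_clock::now() < end) { /* busy-wait */ }
}

// Record one frame's draws across `threads` worker threads and
// return the wall-clock time spent in milliseconds.
static double record_frame(int draws, int threads, std::chrono::nanoseconds per_draw) {
    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> workers;
    for (int t = 0; t < threads; ++t) {
        workers.emplace_back([=] {
            for (int i = 0; i < draws / threads; ++i) submit_draw(per_draw);
        });
    }
    for (auto& w : workers) w.join();
    std::chrono::duration<double, std::milli> ms =
        std::chrono::steady_clock::now() - start;
    return ms.count();
}

int main() {
    const int draws = 5000;                               // a busy raid or big battle scene
    const auto old_cost = std::chrono::microseconds(10);  // assumed per-draw cost, old API
    const auto new_cost = std::chrono::microseconds(2);   // assumed per-draw cost, Vulkan/DX12
    std::printf("old API,   1 thread:  %6.1f ms/frame\n", record_frame(draws, 1, old_cost));
    std::printf("low-level, 4 threads: %6.1f ms/frame\n", record_frame(draws, 4, new_cost));
}
```

With those (made-up) numbers, one thread at ~10 µs per draw spends 50 ms a frame on submission alone, while four threads at ~2 µs each get it under 3 ms; roughly the difference between a slideshow and a smooth raid.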
 

I am so with you there on Planetside 2. I still have no freaking idea why the hell they went with DX9. It runs surprisingly well now, all things considered, but big battles were a nightmare at times with so much CPU bottlenecking going on.

Loads of them... not sure why you removed some from the list of released DX12 games, though?
 

Because it suited an agenda?
 
You can force Nightmare settings on a Fury and it runs just fine. At least I know it does at 1440p.

There's literally zero difference anyway, just some extra shadow detail that's virtually unnoticeable. It's a setting that sucks up a lot of VRAM for no visible gain.
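For a rough sense of how a single shadow option can eat VRAM while barely changing the image, here's a back-of-the-envelope sketch. The atlas resolutions and the 32-bit depth format are assumptions for illustration, not id Tech 6's actual Nightmare values:

```cpp
// Back-of-the-envelope VRAM cost of a shadow-map atlas at different sizes.
// Resolutions and texel format are hypothetical, not taken from Doom itself.
#include <cstdio>

int main() {
    const int bytes_per_texel = 4;            // e.g. a 32-bit depth format
    const int sizes[] = {4096, 8192, 16384};  // hypothetical atlas resolutions
    for (int s : sizes) {
        double mib = static_cast<double>(s) * s * bytes_per_texel / (1024.0 * 1024.0);
        std::printf("%5d x %5d shadow atlas: %6.0f MiB\n", s, s, mib);
    }
    return 0;
}
```

Each doubling of the atlas side quadruples the memory (64 MiB, 256 MiB, 1024 MiB here), while the extra shadow detail is mostly sub-pixel at normal viewing distances, which fits the "lots of VRAM, no visible difference" observation.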

I was able to run Nightmare at 4K on OpenGL, and aside from costing a couple of FPS there was very little difference in image quality. If you're making a "4GB is not enough" argument, Doom is not a good example.
 

Doom may not be, granted, but on one side we're told that if you want to future-proof your setup you need to buy an AMD card, since all games will supposedly implement the features AMD cards have. It's also fair to expect games to use more and more memory as time goes by; that always happens.
Are we actually future-proofing anything? Debatable, since many people swap cards as often as others swap dirty shirts.

We're told not to buy a 970 even if it's on par with a 480 because... memory and future-proofing. A lot of people seem to be giving advice based on the weather, or whatever suits their immediate purpose; some consistency would be nice for a change :D
 