Let's not get too excited about Mantle itself, too little, too late? A few extra fps and washed-out graphics does not a DirectX replacement make.
Say what?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
Interesting quotes there from the original article, and hopefully they are close to full swing. Not sure if this would be a DX12 exclusive to W8 or even further ahead with W9, but it is the right thing to do and I trust D3D to do the right thing.
Windows 9 and DirectX 12 make a lot of sense: ditch the stuff people hated about W8, keep the bits people liked, add a better-performing API in DirectX 12, and Windows 9 could be extremely popular.
Yep, we've been asking them for nearly 20 years and got nothing for a good while now. We're still asking, so hopefully they deliver.
All we have seen so far from Mantle is a few extra fps and washed-out graphics in BF4, so Microsoft have plenty of time to make a response. I assumed Flopper's post was a joke but I'm not sure? Lol.
Flopper said
"to little to late.
I love Mantle."
Agreed. It is a shame when you need a very expensive CPU to power a couple of high-end cards, and even then the CPU needs a massive OC to cope.
Games like Hitman, Sleeping Dogs, BF3/4 and even Tomb Raider require the 3930K to be at over 4.6GHz to allow a pair of Titans to work at full speed. It would be great to have the option to buy a cheap CPU that could still power a couple of decent cards. I remember the 2500K not being that old and not having the oomph to power a pair of 680s in BF3.
Anyway, it is an open standard that doesn't favour one side or the other, and hopefully MS can deliver. As much as I want OGL to be in the mix, sadly I can't see that happening (not without a massive cash injection).
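To put rough numbers on that CPU bound, here's a minimal back-of-envelope sketch in Python. Every figure in it (draw calls per frame, cost per call) is an assumption purely for illustration, not a measurement from any of these games:

```python
# Back-of-envelope: how per-draw-call CPU cost caps the frame rate.
# Every number here is an assumption for illustration, not a measurement.

DRAW_CALLS_PER_FRAME = 5000     # assumed draw calls in a busy scene
COST_HIGH_OVERHEAD_US = 3.0     # assumed CPU cost per call, high-overhead API (microseconds)
COST_LOW_OVERHEAD_US = 0.5      # assumed CPU cost per call, Mantle/DX12-style API

def cpu_limited_fps(calls: int, cost_us: float) -> float:
    """Max fps if the CPU frame time were nothing but draw-call submission."""
    frame_time_ms = calls * cost_us / 1000.0
    return 1000.0 / frame_time_ms

print(f"High-overhead API: ~{cpu_limited_fps(DRAW_CALLS_PER_FRAME, COST_HIGH_OVERHEAD_US):.0f} fps CPU-limited")
print(f"Low-overhead API:  ~{cpu_limited_fps(DRAW_CALLS_PER_FRAME, COST_LOW_OVERHEAD_US):.0f} fps CPU-limited")
```

With those assumed figures the CPU alone caps you at roughly 67 fps under the high-overhead API versus around 400 fps under the low-overhead one, which is why a cheaper or stock-clocked CPU struggles to feed two high-end cards today.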
I get 90+% scaling over three Tis at 1440p. I know that's not 1080 though.
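For reference, "scaling" here just means observed fps as a fraction of what n perfectly-scaling cards would give. A quick sketch with hypothetical numbers (not the actual 1440p results quoted here):

```python
# Multi-GPU scaling efficiency: observed fps vs. perfect n-card scaling.
# The fps figures below are hypothetical, just to show the calculation.

def scaling_efficiency(multi_gpu_fps: float, single_gpu_fps: float, num_cards: int) -> float:
    """Observed fps as a fraction of ideal linear scaling."""
    return multi_gpu_fps / (num_cards * single_gpu_fps)

single_card_fps = 65.0   # assumed single-card fps at 1440p
tri_sli_fps = 178.0      # assumed tri-SLI fps in the same scene

print(f"Scaling: {scaling_efficiency(tri_sli_fps, single_card_fps, 3):.0%}")  # ~91%
```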
50fps is a lot mate lol.
Bang for buck is something I'm really not a fair judge of. I want to disable some cores but I don't know if I can be bothered at the moment.
110 fps? It depends on a lot of things: graphics settings, CPU bottlenecks... 110 fps @ 1440p on Ultra certainly doesn't seem bad, not at all.
It's not just about bang for buck; your overall system is just much more powerful in every way. The only real way to test would be to bang in Matt's cards, or for Matt to use two 780 Tis.
If I threw in another card I'd beat your 175 fps though, even on PCI-E 2.0 x8/x8 and with old reference blowers.