Yer, I should have had a play really, but it was just too damned hot. Hopefully back on soon. Patch 1.7 is shaping up nicely imo, some great new stuff in there. Classified gear is gonna be super-rare but SO good!

Talking about DX12, I happened to find this link over at the AMD reddit: a post from 6 years ago on APIs.
https://forums.tripwireinteractive....nsight-into-the-issue-of-draw-calls-on-the-pc
Partial Quote
I've yet to read the whole thing, but from what I've read it does seem legit.
Considering how low-power and cheap Ryzen CPUs are, I think we might see some custom Ryzen CPUs in the next gen/next refresh of consoles; at that point PC gaming would lose its advantage.
The entry requirements are not at all ridiculous. Are you talking about the used hardware section, the one that has a ridiculously high requirement to get access to?
Threads like this are a gold mine for that!
I also backed up your claims with my own Mantle vs DX11 benchmarks, shankly (in fact we had a number of people doing the same, and even video comparisons), but nah, let's not bring "facts" with hard evidence into this sub-forum; instead let's just spout any old rubbish.
Yup
And remember Rise of the Tomb Raider: it launched with just DX11, eventually got a broken DX12 patch which didn't bring much improvement for any card, and the Nvidia lot were saying that async was included. Then a few months later a refinement patch came with async compute actually added, and we saw some nice gains for AMD yet again. Funnily enough, the Nvidia defence league went very quiet after that.
What makes the above even funnier is that the Xbox One version had async on day 1, so it was purposely removed for the PC version... oh, and this: even though PureHair was using AMD's work!
But that is what "sponsorship" grants you!
I'm curious as to how your thought process goes when it comes to deciding when to use an apostrophe.
Nvidia purposely got involved with the PC version of Rise of the Tomb Raider late in its development so they could ensure any tech they struggled with, such as async compute, was removed. On PC it became a DX11 title that ran in ways that favour Nvidia hardware. AMD dropped the ball with that one.
At the time AMD, and more so the fanboys, were blowing hot air about how 4 GB of HBM was like 5 or 6 GB of GDDR5. Nvidia knew this was rubbish and made sure the RAM usage exceeded 4 GB at all resolutions, so that the 4 GB Fiji cards would be unable to use the highest texture setting. Which in turn made the 6 GB 980 Ti and 12 GB Titan X look like the next level up of GPU (which they were anyway, but Nvidia just wanted to take advantage of every opportunity to highlight it).
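To make the VRAM argument concrete, here's a back-of-the-envelope sketch in Python. The texture pool and render-target sizes are made-up assumptions for illustration, not figures from the game:

```python
# Back-of-the-envelope sketch of the VRAM argument above. All numbers
# here are illustrative assumptions, not measurements from the game.

GIB = 1024 ** 3  # one gibibyte in bytes

def fits_in_vram(texture_pool_bytes, other_bytes, vram_bytes):
    """True if the whole working set fits in the card's VRAM."""
    return texture_pool_bytes + other_bytes <= vram_bytes

# A hypothetical top texture pool of 4.5 GiB plus ~0.5 GiB of render
# targets exceeds a 4 GiB Fiji card but fits a 6 GiB 980 Ti.
pool = int(4.5 * GIB)
other = int(0.5 * GIB)

print(fits_in_vram(pool, other, 4 * GIB))  # False
print(fits_in_vram(pool, other, 6 * GIB))  # True
```

Push the working set even slightly past 4 GiB and the texture setting is effectively locked out on a 4 GiB card, regardless of how clever the memory controller is.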
I've experienced DX12 once, on my 290X running Mankind Divided.
It was crap, so I used DX11 and haven't used DX12 since.
It's been a known issue for years. Nvidia got around the draw call issue a few years ago when they launched Maxwell, which had a dramatically different front end and scheduler that effectively allowed them to spread driver work across multiple CPU threads, when they released this driver:
Just goes to show that you CAN expect large increases in performance from driver updates, and just how important the software is. I would even go as far as to say that when reviewing a video card you're not only reviewing the hardware, you're effectively reviewing the software support as well.
It's a shame AMD aren't able to do the same with their hardware. I was wondering if AMD had changed anything in Vega that might allow them to fix these overhead issues through software optimisations. Remains to be seen, I guess, but I don't know how different Vega is from GCN.
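The draw-call overhead point can be shown with a toy cost model. This is a sketch only: the per-draw cost and thread counts are invented numbers, and real drivers don't scale anywhere near this cleanly:

```python
# Toy cost model of CPU-side draw-call submission. The per-draw cost
# and thread counts are invented numbers purely for illustration;
# real drivers do not scale linearly across threads.

def frame_cpu_time_ms(draw_calls, cost_per_draw_us, threads=1):
    """CPU time (in ms) to submit one frame, assuming submission work
    splits evenly across `threads` (the idealised multi-threaded case)."""
    return (draw_calls * cost_per_draw_us / threads) / 1000.0

# Single-threaded: 10,000 draws at 5 us each is 50 ms of CPU time per
# frame (hopelessly CPU-bound). Split over 4 threads: 12.5 ms.
print(frame_cpu_time_ms(10_000, 5))             # 50.0
print(frame_cpu_time_ms(10_000, 5, threads=4))  # 12.5
```

Which is why a driver (or API) that spreads submission over more cores can turn a CPU-bound scene into a GPU-bound one without touching the hardware.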
DX12 on Mankind Divided went through a lot of testing and patching. The first couple of DX12 patches didn't really help, but it's a lot better now.
That wonder driver was just marketing; it didn't make any difference in most games. The 71% figure was in Rome Total War, and you know how they got that? Quite simple: Rome Total War didn't have SLI support before that driver, and when it got support, game performance improved by... yes, 71%.
The other problem with the driver is that it didn't actually reduce the CPU bottlenecks. You still needed a high-end CPU to see any gains from it.
Other than those two outliers it was just a pretty standard driver update, but well marketed to deflect attention away from Mantle.
Has anyone seen this?
yeah you're a couple of days late with that
Has anyone seen this?
32,000 GPU score in 3DMark11.
For comparison, my 1070 with a casual overclock (and by casual overclock I mean I can't even be bothered to unlock volts; it's just +100MHz on the core and +400MHz on the memory... meh, that'll do). I know if I bump the volts up I get another 60MHz on the core, and the memory is perfectly happy to run at +500MHz or more.
26,600 GPU score http://www.3dmark.com/3dm11/12261141
I mean, 32,000 is not to be sniffed at in its own right; it's a seriously quick card. But it's a 500mm^2, 300 Watt GPU with AIB 1080 performance, at a time when that 1080 is not that far off being replaced by Volta. What if the 1170/1180 are a step up from the 1070/1080 like they were from the 970/980? AMD would be two and a half generations behind.
Right now, with a 300 Watt 500mm^2 card, AMD need to be on par with the 1080 Ti AT LEAST.
In 8 months time my 26,600 scoring GTX 1070 becomes the £250 GTX 1160.
Not good enough, nothing like good enough.
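For what it's worth, the gap those two scores imply is simple arithmetic (using the scores as quoted above):

```python
# Relative performance implied by the two 3DMark11 GPU scores quoted
# above: 32,000 for the AMD card under discussion vs 26,600 for the
# casually overclocked GTX 1070.

def percent_faster(score_a, score_b):
    """How much faster score_a is than score_b, as a percentage."""
    return (score_a / score_b - 1) * 100

print(f"{percent_faster(32_000, 26_600):.1f}% faster")  # 20.3% faster
```

So roughly a 20% lead over an overclocked 1070 in this one synthetic benchmark, which is about where an AIB 1080 sits, matching the point above.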