^^^ Was joking mostly, mate,
but if it means games make use of more cores it's a win for pretty much everyone now.

Anandtech: AMD cards go from 7 fps to 43 fps running the Nitrous DX12 engine.
- ~600% performance increase on AMD cards running a full DX12 game engine.
- CPU cores and their performance will matter more in future games.
- DirectX 12 on Xbox in November.
- DirectX 12 runs better than Mantle on AMD cards.
- DX9 tops out around 6,000 draw calls vs 600,000 on DX12.
- Naysayers are eating crow!
- "Marketing chose the 40% increase slogan as real gains were too unbelievable for the masses to digest."
- Limitless light sources in games.
- 1,000+ AI characters on screen.
- Toy Story / Lord of the Rings graphics with no deferred rendering.
- Future CPU-bound Xbox exclusive DX12 games will see these gains; however, DX12 games will initially see ~30% gains as they transition from DX11 to fully new DX12 game engines using ESRAM etc.
- Not so much for cross-platform games.
- Fable Legends is a DX12 game.
- AMD/MS have mega DX12 news at GDC.
- Stardock developed the Star Swarm DX12 demo in 2 months. Star Swarm would have similar performance gains on Xbox One, i.e. ~600%. They have something major at GDC in the Microsoft booth.
- Stardock are developing the DX12 game engine "Nitrous". The Star Control game will make sense on console. They will license it to 3rd parties after they release 2 games on the Nitrous engine.
- Phil Spencer is managing expectations: with DX12 on Xbox in November, the games released then won't look much different until the new DX12 game engines arrive.
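To put the 6,000 vs 600,000 draw-call claim in the list above into perspective, here's a toy back-of-envelope sketch. The per-call CPU overheads are purely assumed for illustration (not measured figures): if batched submission cuts per-draw CPU cost by ~100x, the API can push ~100x the draw calls inside the same CPU frame budget.

```python
# Toy model: CPU-side frame time = per-draw overhead * number of draw calls.
# The overhead figures below are illustrative assumptions, not benchmarks.
def max_fps(draw_calls, overhead_us):
    """Max achievable FPS if the CPU is the only limit."""
    frame_time_s = draw_calls * overhead_us / 1_000_000
    return 1.0 / frame_time_s

# Assume ~25 us of CPU time per call in the DX9 era, vs ~0.25 us with
# DX12-style batched submission (assumed 100x cheaper).
dx9_fps = max_fps(6_000, 25)        # at the ~6,000-call ceiling
dx12_fps = max_fps(600_000, 0.25)   # at 600,000 calls
print(dx9_fps, dx12_fps)  # same frame rate, 100x the draw calls
```

Same frame budget either way; the difference is how much scene you can submit inside it.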
You do realise that is system power draw and not GPU power draw?
Did you have a point somewhere? GPU load goes up while CPU load actually drops drastically. Compare the numbers: the system supposedly using a 100W-higher-TDP GPU draws 14W more under DX12 and 19W more under DX11.
DX11 vs DX12: CPU load goes down and GPU usage goes up, so the power increase is down to the GPU, not the CPU, and it's actually likely to be larger than the 49W, as CPU power usage probably decreased at the same time.
The massively efficient Maxwell is barely using less than a supposedly awfully inefficient 290X. But the main point is that DX12 drastically increased power usage on the GTX 980, something I speculated would happen months ago, making a bit of a mockery of its efficiency.
Its 'efficiency' comes from it not being fully utilised. Give it a sustained load and the power saving of the architecture disappears.
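The "likely to be larger than 49W" point is simple arithmetic on wall-socket readings. A minimal sketch, assuming (purely for illustration, the real CPU figure isn't in the article) that the lower DX12 CPU load is worth about a 15W drop in CPU power:

```python
# System power = CPU + GPU + roughly-constant rest of the platform, so any
# CPU drop hides part of the GPU's rise in the wall-socket reading.
# 49 W is the DX11 -> DX12 system-level rise discussed in the thread;
# the 15 W CPU drop is an assumed, illustrative number.
system_delta = 49    # W, measured at the wall
cpu_delta = -15      # W, assumed: CPU does less work under DX12
gpu_delta = system_delta - cpu_delta
print(gpu_delta)     # the GPU's own rise exceeds the wall-socket delta
```

Under that assumption the GPU alone would have gone up by 64W, more than the 49W the wall meter shows.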
No one noticed or mentioned that Anandtech say DX12 usage increased the power draw of the 980 by 50W?
What we find is that Star Swarm and DirectX 12 are so efficient that only our most powerful card, the GTX 980, finds itself CPU-bound with just 2 cores. For the AMD cards and other NVIDIA cards we can get GPU bound with the equivalent of an Intel Core i3 processor, showcasing just how effective DirectX 12’s improved batch submission process can be. In fact it’s so efficient that Oxide is running both batch submission and a complete AI simulation over just 2 cores.
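The quoted bit about running batch submission over just 2 cores can be pictured with a toy analogy. This is conceptual only, in Python rather than the actual D3D12 API, and the function names are made up: the idea is simply that command recording is split into batches and farmed out to a small worker pool instead of funnelled through one thread.

```python
# Conceptual analogy for DX12-style multithreaded batch submission.
# Names are illustrative stand-ins, not the real D3D12 API.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(batch):
    """Stand-in for recording one batch of draw commands on a worker."""
    return [("draw", call) for call in batch]

draw_calls = list(range(600_000))
# Chunk the calls into batches and record them on just 2 worker threads,
# mirroring Oxide running batch submission over 2 cores.
batches = [draw_calls[i:i + 10_000] for i in range(0, len(draw_calls), 10_000)]
with ThreadPoolExecutor(max_workers=2) as pool:
    command_lists = list(pool.map(record_command_list, batches))

print(sum(len(cl) for cl in command_lists))  # all 600,000 calls recorded
```

The contrast with DX9/DX11 is that there the equivalent of `record_command_list` effectively had to run on a single thread.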
EDIT: as for the AMD/MS mega news at GDC, there is a rumour going around that Nvidia is due to break some bad news re: DX12 support around/after April this year. I suspect it will be something along the lines of some fairly major features missing on Nvidia hardware; I think the rumour mentioned Kepler as being the main issue.
So what, nvidia and Sony are effectively out of the running when new DX12 games are released?
I have no idea what either of those things means in relation to what I said. Sony? Where in the sweet jesus is Sony involved in this? For one thing, the PS4 has an AMD GCN-based GPU. Second, an old Nvidia architecture not supporting DX12 features isn't particularly big news, although they (and the Nvidia guys on here) made a massive deal about the statements about existing cards supporting it, while AMD weren't quick to announce which cards supported what.
I was only asking!
Thanks for the reply anyway. My 980 is good for DX12, as is my son's PS4. The 670 in the media server, on the other hand... I think that'll be hitting the post office on its way to a new owner soon.
Now, if I held off until the 8GB 980 Ti is released, I could justify shifting the 980 into the media server and....
Erm, no, actually Nvidia were the first to announce that Kepler and Fermi would not have full support for DX12, whereas AMD just said that "all GCN cards will support DX12" without saying which cards might or might not have full support.
The point people were picking up on was that Nvidia said Fermi onwards would have basic support, so the 480 onwards, whereas for AMD users it is GCN only, so not as far back.
Which hardware will be DX12-compatible? AMD said all of its Graphics Core Next-based Radeon GPUs (i.e. Radeon HD 7000 series and newer) will work with the new API. Nvidia pledged support for all Fermi, Kepler, and Maxwell (i.e. GeForce GTX 400 series and newer) parts. The keynote included a demo of Forza 5 running in DirectX 12 mode atop an Nvidia GPU. Finally, Intel said the integrated graphics in its existing Haswell processors will also have DX12 support. All in all, Microsoft estimates that 100% of new desktop GPUs, 80% of new gaming PCs, and 50% of PC gamers will be able to take advantage of the new API.
Most likely because other people bothered to read and understand the entire article before desperately looking for some spin to counter how poorly the 290X compared. If you had done the same, you would see that the 290X is the bottleneck, not the CPU.
So while the 290X is hitting a wall (and letting the CPU idle), the 980 is really pushing the system, pulling more CPU power in order to feed the mighty 980. So even while it is trashing the 290X in framerate and drawing more CPU power, the total system load is still lower.