Will low-level APIs make Intel's stronger per-core performance irrelevant?

If you only look at gaming, then my answer would be yes. The current-gen consoles have a low-clocked 8-core Jaguar CPU, and most PC games nowadays are console ports, which means they will be coded specifically to use 6 to 7 cores. This can only be a good thing for current AMD users.

I think there might even be a case where current Intel CPUs struggle in 2 to 3 years simply because the majority of them are 4 cores, while AMD already offers 8 cores as standard, pretty much. With low-level APIs being used more and more, I believe modern games, at least, will come down to who has more cores rather than who has higher per-core performance.
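To put the core-count argument in concrete terms, here's a minimal sketch (plain C++ threads, every name made up, nothing tied to any real graphics API) of the kind of work-splitting a low-level API encourages: each core records its own batch of draw commands instead of funnelling everything through one driver thread.

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

// Hypothetical stand-in for recording one batch of draw commands.
// With a DX11-style immediate context this work is effectively
// serialised on one thread; with Mantle/DX12-style command lists
// each core can record its own batch independently.
void record_commands(int worker, int first, int last) {
    for (int d = first; d < last; ++d) {
        // ... build the command for draw call d ...
    }
    std::printf("worker %d recorded draws %d..%d\n", worker, first, last - 1);
}

int main() {
    const int total_draws = 8000;
    const int cores = std::max(1u, std::thread::hardware_concurrency());
    const int per_core = total_draws / cores;

    std::vector<std::thread> workers;
    for (int c = 0; c < cores; ++c) {
        int first = c * per_core;
        int last = (c == cores - 1) ? total_draws : first + per_core; // last worker takes the remainder
        workers.emplace_back(record_commands, c, first, last);
    }
    for (auto& w : workers) w.join(); // frame time now scales with core count, not clock speed
}
```

Under a high-overhead API all of that recording happens on one thread, so a fast quad-core wins; once it parallelises, an 8-core chip gets to use all of its silicon even at a lower clock.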

To take my own example: I'm still using a Phenom II X6 1090T that I overclocked slightly to 3.5GHz. I bought it back in 2010 and even today I don't have a single game that truly stresses the CPU to the max on all cores. BF4 multiplayer is the most taxing, at about 80-85% on all cores, but I still maintain a solid 60fps at 1080p on high/medium settings. And that's under DX11; my performance can only improve with Mantle or DX12.

This CPU can probably last me another 3-4 years if I want it to, considering the low performance of the current-gen consoles.
 
While some may see that as a negative thing, since a lack of competition stops the technology being pushed, general users will benefit, as people won't have to buy new components every three years (or so).

It does cement that there is no need to push raw clock speed anymore! With programmers being encouraged to use the multi-threading capabilities of the hardware available, companies like Intel need to bolster their other products, as raw per-core speed may not be as desirable in the future.
 
DX10 was every bit as good as DX11 and did most of the juicy stuff; it just never caught on.

HSA and Mantle, though? Well, they're what will allow coders to exploit the low-powered hardware in both of the new consoles and squeeze out all of that extra performance, so they're very big indeed.

DX12 is just another confirmation that CPUs are about to be left in the dust. It may fail, but no doubt there will be a DX13 of sorts, given the Xbone really needs something like that, especially because the hardware in the Xbone is not as good as the PS4's.

I am currently playing through HL2 with the FakeFactory Cinematic Mod and, in all honesty, DX9 can still kick it. The only thing I could really criticise it for is the lack of tessellation, so the face detail is a little bit crap. Other than that, though? It's all about setting your mind to something.

DX10 was uber powerful and most of the DX11 stuff was doable in 10. I remember there was a hack for Dirt 2 where you could activate all of the DX11 lighting, shadows and so on.

The problem with DX10 was that the Xbox 360 was not really powerful enough to execute it properly. Then came DX11... Both were just test runs for the Xbone.

Nothing M$ have done since Motocross Madness and Midtown Madness has been for PC gamers. They moved over entirely to their consoles because they can charge that fat old licensing fee there. They can't charge it on Windows, so they copped the nark and decided to make everything of theirs Xbox-only and exclusive.

So whatever they're doing, it ain't for us. Which is why I'm quite thankful AMD have managed to merge the technology so that PC gamers benefit from the consoles. These could be the last new consoles ever, really.

From what I remember, DX10 was tied to Vista initially. Vista got a lot of bad press and a lot of people were unwilling to adopt it, so you can see why game developers wouldn't be very quick to use it either.
 
Games will continue to become more and more CPU-demanding as physics etc. advance, so no, not really.

The reason why CPUs are currently largely irrelevant for gaming is that monitor resolutions and the use of graphical effects (e.g. Nvidia GameWorks libraries) have moved at a much faster rate than GPU technology, so most games are heavily GPU-bottlenecked at the settings people use.
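For what it's worth, here's a toy C++ sketch of the bottleneck test implied above; the workloads are faked with sleeps (the millisecond figures are invented), but the structure is the point: time the CPU work and the GPU wait separately each frame, and whichever dominates is your bottleneck.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Schematic frame loop: if the CPU finishes its work and then sits
// waiting on the GPU, the game is GPU-bound and a faster CPU changes
// nothing. Real engines would time actual simulation/render work.
int main() {
    using clock = std::chrono::steady_clock;
    for (int frame = 0; frame < 5; ++frame) {
        auto t0 = clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(4));  // game logic, physics, draw submission (CPU)
        auto t1 = clock::now();
        std::this_thread::sleep_for(std::chrono::milliseconds(12)); // waiting for the GPU to finish the frame
        auto t2 = clock::now();

        auto ms = [](auto d) { return std::chrono::duration<double, std::milli>(d).count(); };
        std::printf("frame %d: cpu %.1f ms, gpu wait %.1f ms -> %s-bound\n",
                    frame, ms(t1 - t0), ms(t2 - t1),
                    ms(t2 - t1) > ms(t1 - t0) ? "GPU" : "CPU");
    }
}
```

In a GPU-bound frame like the one simulated here, a faster CPU just makes the wait longer.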
 
From what I remember, DX10 was tied to Vista initially. Vista got a lot of bad press and a lot of people were unwilling to adopt it, so you can see why game developers wouldn't be very quick to use it either.

Indeed. It was the forced update.

Hilariously, I really liked Vista. Not at launch, but if you install it now with all of the fixes, it's kinda Windows XP meets Windows 7.

But yeah, DX10 had all of the things that we were sold with DX11...

Plus you could hack Dirt 2 to run the DX11 lighting methods in DX10.

But as you say, it was limited to Vista and they point-blank refused to release it for XP (idiots...), so it flopped.

Games will continue to become more and more CPU-demanding as physics etc. advance, so no, not really.

The next big step is ray tracing IMO.
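
Since it came up: the inner loop of any ray tracer is an intersection test like the ray-sphere one below (a textbook quadratic, not taken from any particular engine). Every pixel fires at least one of these per bounce, millions per frame, which is why ray tracing is such a big compute jump.

```cpp
#include <cmath>
#include <cstdio>

// Minimal ray-sphere intersection: solve |o + t*d - c|^2 = r^2,
// a quadratic in t. A ray tracer runs this (or similar tests against
// triangles) for every pixel, every bounce.
struct Vec3 { double x, y, z; };
double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Returns the distance along the ray to the nearest hit, or -1 for a miss.
double hit_sphere(Vec3 origin, Vec3 dir, Vec3 centre, double radius) {
    Vec3 oc = sub(origin, centre);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4 * a * c;
    return disc < 0 ? -1.0 : (-b - std::sqrt(disc)) / (2 * a);
}

int main() {
    // Ray from the origin straight down +Z, unit sphere centred at z = 5.
    double t = hit_sphere({0,0,0}, {0,0,1}, {0,0,5}, 1.0);
    std::printf("hit at t = %.2f\n", t); // expect 4.00
}
```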
 