CPU AI extensions (GNA and DLBoost) and future games

Man of Honour · Joined: 22 Jun 2006 · Posts: 14,601
I was reading almoststew1990's write-up on using an i7-860 and how some games didn't run, likely because of missing instruction-set extensions (AES?), and it got me wondering: do any games use, or are any likely to use in the future, the new AI/machine-learning extensions present since 11th gen?
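For context, this is roughly how software checks for those extensions at launch; a minimal sketch in C, assuming x86 and a GCC/Clang toolchain (the bit positions are the documented CPUID flags, and an i7-860 reports neither AES-NI nor AVX):

    /* Query CPUID at run time to see which instruction-set extensions the CPU
     * offers - roughly what a game or library does before picking a code path. */
    #include <cpuid.h>
    #include <stdio.h>

    int main(void) {
        unsigned int eax, ebx, ecx, edx;

        /* Leaf 1: basic feature flags; ECX holds the AES-NI and AVX bits. */
        if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
            printf("AES-NI: %s\n", (ecx & (1u << 25)) ? "yes" : "no");
            printf("AVX:    %s\n", (ecx & (1u << 28)) ? "yes" : "no");
        }

        /* Leaf 7, sub-leaf 0: extended features; ECX bit 11 is AVX-512 VNNI,
         * the "DL Boost" instruction set on Ice Lake / 11th-gen parts. */
        if (__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx)) {
            printf("AVX-512 VNNI: %s\n", (ecx & (1u << 11)) ? "yes" : "no");
        }
        return 0;
    }

(The GNA itself isn't a CPUID flag as far as I can tell - it's a separate low-power block reached through Intel's driver/SDK rather than through new CPU instructions.)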

This article suggests a few possibilities:

PCWorld said:
"Microsoft’s Photos app, for example, uses AI image analysis to come up with its own assessment of what it’s “seeing,” such as a beach scene, for example, or snow. Microsoft Photos and Google Photos already identify and group the subjects of your photos, recognizing who’s in them."

"Intel showed off other AI examples: stylizing a video in real time as it plays, just like applying a filter in Snapchat; removing unwanted background noise from a call or chat, and accelerating CyberLink PhotoDirector 10’s ability to de-blur photos. Microsoft’s Skype and Teams apps can already pick you out from a video call and blur or even replace the background. AI functionality will make that faster."

"Intel’s secret sauce is what it calls a Gaussian Neural Accelerator, a tuned piece of logic found within the Ice Lake chip package. The two work hand in hand. The CPU architecture accelerates what’s known as DLBoost, which in turn accelerates inferencing technology on Intel’s Ice Lake CPU. (Inferencing applies rules or algorithms to known facts to learn more about them.) The Gaussian Neural Accelerator, meanwhile, can run under very low-power conditions to perform a specialized task, such as real-time translation of an ongoing conversation."


Just wondering if one day these kinds of extensions will be responsible for killing off pre-11th-gen CPUs, say if games start using them for generating weather patterns, NPC behaviour, etc.
 
We'd need serious standardisation of these extensions before software could outright require them (like with AES), and that takes a very long time.
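In the meantime the usual pattern is run-time dispatch rather than a hard requirement - detect the feature, take the fast path if it's there, otherwise fall back. A sketch in C, assuming GCC/Clang on x86 (the function names are made up for illustration):

    /* Pick a code path at run time based on what the CPU supports.
     * Old CPUs keep working until a developer decides to drop the fallback. */
    #include <stdio.h>

    static void infer_fast(void)     { puts("fast path: wide SIMD / VNNI-style kernels"); }
    static void infer_fallback(void) { puts("fallback path: plain C kernels"); }

    int main(void) {
        /* __builtin_cpu_supports() reads CPUID under the hood (GCC/Clang). */
        void (*infer)(void) =
            __builtin_cpu_supports("avx2") ? infer_fast : infer_fallback;
        infer();
        return 0;
    }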

For AI stuff there's no standardisation at all; every company is doing its own thing (Intel, AMD, Apple, Nvidia, ARM, etc.). You never know about the future, but I wouldn't worry about it. If anything does end up making older chips obsolete, it's more likely to be a required security feature (like AES) than a compute extension.
 