Intel demos XeSS super resolution - open, AI-based, and AMD/Nvidia can use it

Top work, Intel, and it's open source so anyone can use it. Hurray!

Nothing against Nvidia per se, but I hate the way everything they do is proprietary - how everything they do tries to lock you into their ecosystem - so I'm always happy when someone else does the same thing just as well and makes it available to everyone. In other words, I like seeing others peeing on Nvidia's proprietary camp-fire.

You're a star, Intel.


Intel is no shining knight though

As GN mentioned, this is Intel recognising it's starting from a position of 0% discrete gaming market share - it can't come in and make grand demands like Nvidia has tried to do, not until it commands significant market share. So if XeSS is to survive for longer than a few months it can't only run on Intel graphics cards; designing it to run on all of the competitors' products is a good move that means up to 100% of the market can use your SDK.


I just love it that we have a 3rd option now; every GPU Intel sells takes food out of Nvidia's and AMD's mouths, and that will light a fire under their asses.
 
Intel is no shining knight though

As GN mentioned, this is Intel recognising it's starting from a position of 0% discrete gaming market share - it can't come in and make grand demands like Nvidia has tried to do, not until it commands significant market share. So if XeSS is to survive for longer than a few months it can't only run on Intel graphics cards; designing it to run on all of the competitors' products is a good move that means up to 100% of the market can use your SDK.


I just love it that we have a 3rd option now; every GPU Intel sells takes food out of Nvidia's and AMD's mouths, and that will light a fire under their asses.
I think you're getting ahead of yourself. It won't light a fire under anyone's ass until they are in front, and the way tech works, Nvidia and AMD are already 5 years ahead of Intel on the roadmap (they usually have the road mapped out 5-8 years ahead). It's good to have more competition, but it's going to take a few years at the very least for people to trust Intel in the GPU scene and for them to be lighting any fires.
 
Yes, but they are already playing catch-up, and so they are likely already behind the curve.

Anything is possible; all Intel needs is a good release.
The number of people on here who said AMD wouldn't ever catch Nvidia in high-end gaming performance... back when Vega 64 was going up against the 2080, I kept saying all it takes is a good release.
RDNA2 drops and AMD is right back in the mix, even beating Nvidia in some aspects.

I think RDNA2 performance surprised a lot of people on here.

We really do not know how far along Intel is; my prediction is that the first release will be at RTX 3060/3070 or RX 6800 level of performance.

Intel will definitely also have lower-end GPUs; that's a market they won't want to miss.
 
Anything is possible; all Intel needs is a good release.
The number of people on here who said AMD wouldn't ever catch Nvidia in high-end gaming performance... back when Vega 64 was going up against the 2080, I kept saying all it takes is a good release.
RDNA2 drops and AMD is right back in the mix, even beating Nvidia in some aspects.

I think RDNA2 performance surprised a lot of people on here.

We really do not know how far along Intel is; my prediction is that the first release will be at RTX 3060/3070 or RX 6800 level of performance.

Intel will definitely also have lower-end GPUs; that's a market they won't want to miss.
But Raja... maybe they will light fires literally :D
 
This is going to kill FSR and DLSS on PC if AMD/Nvidia don't sort themselves out.

XeSS takes the best of AMD and Nvidia and combines it: you get an open-source upscaler that works on any GPU, and it's accelerated by AI cores like DLSS.

For AMD to compete they will quickly need to come up with an FSR 2.0 that is accelerated by AI cores, and Nvidia will need to make DLSS open source and able to run on any GPU.

FSR is supported by very old GPUs and will stay that way, whereas XeSS has hardware requirements which point at min. Pascal on NVIDIA's side of things and RDNA 2 on AMD's. GPUs older than that will not support it, by the looks of things, because they simply lack the required instructions in hardware.

The real issue here is that we'll soon have at least five different upscaling methods (XeSS, FSR, DLSS, TAAU and whatever else is currently supported by other engines or will soon be added), and they have different hardware requirements too. This means a huge headache for developers and gamers alike - what to implement, what to skip, how much it will cost, etc. It will most likely end up with Microsoft sitting them all down at a table and coming up with an actual standard in DX that works on DX12 Ultimate GPUs. That still wouldn't stop innovation in hardware, as such functions can be accelerated and each vendor can come up with their own solution (like with DXR).
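If Microsoft ever did standardise this in DX, the shape would presumably be a thin vendor-neutral interface with the acceleration hidden underneath, exactly as with DXR. A purely hypothetical sketch of what that could look like:

[CODE]
// Hypothetical vendor-neutral upscaler interface - NOT a real DirectX
// API, just an illustration. Temporal upscalers (DLSS, XeSS, TAAU)
// all consume roughly these inputs.
struct UpscaleInputs {
    const void* color;              // low-resolution colour buffer
    const void* depth;              // depth buffer
    const void* motion_vectors;     // per-pixel motion vectors
    float       jitter_x, jitter_y; // sub-pixel camera jitter this frame
};

class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    // Reconstruct a target_w x target_h frame; whether tensor cores,
    // XMX units, dp4a or plain shaders do the work underneath is the
    // vendor's problem.
    virtual void upscale(const UpscaleInputs& in, void* output,
                         int target_w, int target_h) = 0;
};
[/CODE]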
 
"no visible quality loss." What utter tosh. You can clear see the difference between 4K and XeSS 4K in their own damn demo. Really irritated that they chose to use that line.

It looks very good in the slow-mo shots, but don't oversell it, FFS.

It's marketing talk, like "retina screen" from Apple and lots of others - it means that at a certain distance and in certain situations you won't be able to see a difference. From what Intel has shown, the difference is really only visible at 2x and 4x magnification. That's good enough for most use. What I've noticed, though, is that their solution seems to be eating DoF - the original 4K scene has background objects blurred on purpose in places (an artistic choice), and with Intel's upscaling they're as sharp as every other object. That's not ideal, and it happens with DLSS at times too.
 
FSR is supported by very old GPUs and will stay that way, whereas XeSS has hardware requirements which point at min. Pascal on NVIDIA's side of things and RDNA 2 on AMD's. GPUs older than that will not support it, by the looks of things, because they simply lack the required instructions in hardware.

The real issue here is that we'll soon have at least five different upscaling methods (XeSS, FSR, DLSS, TAAU and whatever else is currently supported by other engines or will soon be added), and they have different hardware requirements too. This means a huge headache for developers and gamers alike - what to implement, what to skip, how much it will cost, etc. It will most likely end up with Microsoft sitting them all down at a table and coming up with an actual standard in DX that works on DX12 Ultimate GPUs. That still wouldn't stop innovation in hardware, as such functions can be accelerated and each vendor can come up with their own solution (like with DXR).


XeSS does not have hardware requirements - I don't know why you think it does. It runs on any GPU, including old iGPUs. That's because XeSS is dynamic: if the GPU has AI cores it uses the XMX instruction set, and if it does not, it uses the DP4a instruction set - both achieve the same image quality, with the only difference being performance.

I mean, the bare minimum to run XeSS is a GPU with shader cores, but given that any GPU released in the last 20+ years has those, I wouldn't be concerned.
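
Purely to illustrate that dynamic path selection (the names below are made up, not the actual XeSS SDK API), the backend pick amounts to something like:

[CODE]
// Hypothetical sketch of XeSS-style backend selection. None of these
// names come from the real XeSS SDK; they only illustrate the idea of
// one network with multiple execution paths.
enum class XessPath { XMX, DP4A, SHADER };

struct GpuCaps {
    bool has_xmx;   // Intel Arc matrix (AI) engines
    bool has_dp4a;  // packed int8 dot-product instruction
};

XessPath pick_path(const GpuCaps& caps) {
    if (caps.has_xmx)  return XessPath::XMX;   // fastest: dedicated AI cores
    if (caps.has_dp4a) return XessPath::DP4A;  // slower, same image quality
    return XessPath::SHADER;                   // generic shader fallback,
                                               // assuming one exists
}
[/CODE]

Same network either way - only the throughput changes.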
 
XeSS does not have hardware requirements - I don't know why you think it does. It runs on any GPU, including old iGPUs. That's because XeSS is dynamic: if the GPU has AI cores it uses the XMX instruction set, and if it does not, it uses the DP4a instruction set - both achieve the same image quality, with the only difference being performance.

I mean, the bare minimum to run XeSS is a GPU with shader cores, but given that any GPU released in the last 20+ years has those, I wouldn't be concerned.

You didn't read Intel's info well enough if you think it doesn't have hardware requirements. It requires DP4a instruction support in hardware - that's supported on Pascal and newer NVIDIA GPUs, new Intel GPUs and, I believe, Navi and newer AMD GPUs (not 100% sure about Vega). Anything older than that is NOT supported. Other recent architectures support it too (ARM etc.) - not just PC hardware.
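
For context on what DP4a actually is: a single instruction that computes a dot product of four packed 8-bit integers with a 32-bit accumulate - the core operation of quantised neural-network inference. On NVIDIA hardware it's exposed in CUDA as __dp4a from compute capability 6.1 (Pascal) onwards, which lines up with the cut-off above. A minimal sketch (my own example, not Intel's code):

[CODE]
// Minimal CUDA sketch using dp4a (requires sm_61+, i.e. Pascal or newer).
// Each int packs four signed 8-bit values; __dp4a multiplies them
// pairwise and adds all four products into the accumulator in a single
// hardware instruction.
__global__ void dot_int8(const int* a, const int* b, int n, int* out) {
    int acc = 0;
    for (int i = threadIdx.x; i < n; i += blockDim.x)
        acc = __dp4a(a[i], b[i], acc);  // acc += four int8 products at once
    atomicAdd(out, acc);                // combine per-thread partial sums
}
[/CODE]

Compile with nvcc -arch=sm_61 or newer; on anything older the intrinsic simply isn't there, which is exactly the hardware cut-off being described.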
 
Everyone is releasing this feature because of a lack of performance.

That is sadly the truth, but it's also caused by game devs and what people expect. 4K monitors are still a tiny minority of the market, hence most games are designed with max details (aside from RT stuff) for 1440p at most - modern GPUs can handle that resolution just fine. (For scale: 4K is roughly 8.3 million pixels versus 3.7 million at 1440p, about 2.25x the shading work per frame.) The moment 4K becomes really popular, they will likely start optimising game settings for 4K (like they do on consoles) - but that would also mean less fancy detail (most people wouldn't notice the difference anyway). As it is now, PC gamers expect super ultra details, even though that will never run properly at 4K without upscaling. Hence we now have upscaling to compensate.
 
Intel is no shining knight though

As GN mentioned, this is Intel recognising it's starting from a position of 0% discrete gaming market share - it can't come in and make grand demands like Nvidia has tried to do, not until it commands significant market share. So if XeSS is to survive for longer than a few months it can't only run on Intel graphics cards; designing it to run on all of the competitors' products is a good move that means up to 100% of the market can use your SDK.


I just love it that we have a 3rd option now; every GPU Intel sells takes food out of Nvidia's and AMD's mouths, and that will light a fire under their asses.
I assume this is in reference to @humbug's comment about it being open source.

While I agree they are no knights in shining armour, in the area of image reconstruction tech Intel does have a history of going open source - see Intel Open Image Denoise.
 
You didn't read Intel's info well enough if you think it doesn't have hardware requirements. It requires DP4a instruction support in hardware - that's supported on Pascal and newer NVIDIA GPUs, new Intel GPUs and, I believe, Navi and newer AMD GPUs (not 100% sure about Vega). Anything older than that is NOT supported. Other recent architectures support it too (ARM etc.) - not just PC hardware.

Since it's open source, it may be possible for AMD and Nvidia to adapt some of the DP4a instructions to work on GPUs that don't have the hardware. However, I suspect they won't bother, since they have no financial incentive to do it. Devs may go for the easiest option - at the moment FSR is the easiest to implement - but time will tell which tech is used most.
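
For what it's worth, the maths side is trivially emulatable - what dp4a computes can be done with plain integer ops, as in this illustrative scalar version (hypothetical code, just to show the operation). The catch is that it becomes several instructions instead of one, so the real question is whether the performance hit leaves it worth using:

[CODE]
// Illustrative scalar emulation of a signed dp4a: one packed-int8 dot
// product done by hand. A shader fallback would look much the same,
// just many operations where the native instruction needs one.
int dp4a_emulated(int a, int b, int acc) {
    for (int i = 0; i < 4; ++i) {
        int ai = (signed char)(a >> (8 * i));  // extract signed byte i of a
        int bi = (signed char)(b >> (8 * i));  // extract signed byte i of b
        acc += ai * bi;                        // 4 multiply-adds vs 1 instruction
    }
    return acc;
}
[/CODE]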

Check out the latest DF high res video for XeSS.


If that is 4K upscaled from 1080p (only a quarter of the pixels) then DLSS is dead, imo. AMD will almost certainly catch up with FSR 2.0, so I reckon DLSS will have a very hard time. Nvidia cannot sustain proprietary tech if competitive alternatives are available.
 
Since it's open source, it may be possible for AMD and Nvidia to adapt some of the DP4a instructions to work on GPUs that don't have the hardware. However, I suspect they won't bother, since they have no financial incentive to do it. Devs may go for the easiest option - at the moment FSR is the easiest to implement - but time will tell which tech is used most.

Check out the latest DF high res video for XeSS.


If that is 4K upscaled from 1080p (only a quarter of the pixels) then DLSS is dead, imo. AMD will almost certainly catch up with FSR 2.0, so I reckon DLSS will have a very hard time. Nvidia cannot sustain proprietary tech if competitive alternatives are available.

I think you're right.

There are two options here.

#1 FSR is not quite as good as DLSS 2 but much easier to put into games, and it will work on any GPU: Intel Xe / RX 400 / GTX 900 series and newer, so it's easy to do and works with probably about 90% of the GPUs people own.
#2 XeSS is not so easy to do, but it's as good as DLSS 2 and works at least with the latest GPUs from all vendors.

DLSS is done. That's not so good for Nvidia, but they will survive; it is good, very good, for us consumers, and if any of us have any sense we will rejoice at that.
 
I think you're right.

There are two options here.

#1 FSR is not quite as good as DLSS 2 but much easier to put into games, and it will work on any GPU: Intel Xe / RX 400 / GTX 900 series and newer, so it's easy to do and works with probably about 90% of the GPUs people own.
#2 XeSS is not so easy to do, but it's as good as DLSS 2 and works at least with the latest GPUs from all vendors.

DLSS is done. That's not so good for Nvidia, but they will survive; it is good, very good, for us consumers, and if any of us have any sense we will rejoice at that.

This should be good for everyone, yes. Of course, a bunch of fanboys will try to persuade everyone that DLSS is the only way to go, but in reality we'll have a bunch of these algorithms for a while - just like it was with anti-aliasing in the past, until most devs moved to some form of TAA. Nvidia has the advantage of very good marketing and the fact that most gamers know their brand. But they can't ride on that forever, and chances are they will make DLSS open source as well, just fastest with Tensor cores. Time will tell.
 
Since it's open source, it may be possible for AMD and Nvidia to adapt some of the DP4a instructions to work on GPUs that don't have the hardware. However, I suspect they won't bother, since they have no financial incentive to do it. Devs may go for the easiest option - at the moment FSR is the easiest to implement - but time will tell which tech is used most.

Check out the latest DF high res video for XeSS.


If that is 4K upscaled from 1080p (only a quarter of the pixels) then DLSS is dead, imo. AMD will almost certainly catch up with FSR 2.0, so I reckon DLSS will have a very hard time. Nvidia cannot sustain proprietary tech if competitive alternatives are available.


Well, Intel's CEO has been shouting to the heavens for a few months now that Intel is coming to kick Nvidia off its AI leadership throne very soon, so I'd not be surprised if XeSS beats DLSS. In the couple of benchmarks Intel has released for its new HPC server-class GPUs, they are indeed beating Nvidia's latest in AI workloads.
 
Well, Intel's CEO has been shouting to the heavens for a few months now that Intel is coming to kick Nvidia off its AI leadership throne very soon, so I'd not be surprised if XeSS beats DLSS. In the couple of benchmarks Intel has released for its new HPC server-class GPUs, they are indeed beating Nvidia's latest in AI workloads.

The one thing worth noting here is that what they showed is one demo with a very slowly moving camera and no dynamic changes. If they've run it millions of times through the AI, it could generate pixel-perfect images. But how it will really work in real use and proper games, we don't know - it might be better or might be worse than DLSS. I just wouldn't put too much trust in Intel's marketing; they always oversell their new toys.
 