Nvidia open source image upscaler - NIS - FSR killer?

Interesting stuff from Nvidia today. First up, a video showing how far DLSS has come and an overview of what exactly DLSS is and how it works:


Essentially not the FSR attack PR piece that we thought it was going to be but a very good educational piece.

The biggest point they announced today was an FSR competitor:

The latest Game Ready Driver releasing on November 16th provides an update to our existing NVIDIA Image Scaling feature that boosts performance on ALL games and GeForce GPUs through a best-in-class spatial scaling and sharpening algorithm. NVIDIA Image Scaling is accessible both from the NVIDIA Control Panel and GeForce Experience, and includes a per-game sharpening setting tunable from NVIDIA’s in-game overlay.

NVIDIA is releasing the NVIDIA Image Scaling algorithm as an open source SDK that delivers best-in-class spatial scaling and sharpening and works cross-platform on all GPUs. The SDK will be publicly available on GitHub on November 16th for all developers to integrate into their games.

https://www.kitguru.net/gaming/dominic-moass/nvidia-image-scaling-analysis-versus-fsr-dlss/
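
For anyone wondering what a "spatial scaling and sharpening algorithm" actually boils down to, here's a rough CPU-side C++ sketch of the two stages on a single-channel image. To be clear, this is just my illustration, not the actual NIS code: the real thing is a GPU shader with a fancier edge-adaptive kernel, but the structure is the same, upscale first, then sharpen (the sharpness parameter being the per-game slider they mention).

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Single-channel image, values in [0,1]. Purely illustrative.
struct Image {
    int w, h;
    std::vector<float> px; // row-major
    float at(int x, int y) const { // clamped sampling at the borders
        x = std::clamp(x, 0, w - 1);
        y = std::clamp(y, 0, h - 1);
        return px[y * w + x];
    }
};

// Stage 1: spatial upscale. Bilinear here; NIS/FSR use smarter
// edge-adaptive kernels, but both are "spatial" in the same sense:
// only the current frame is used, no motion vectors, no history.
Image upscale(const Image& src, int dstW, int dstH) {
    Image dst{dstW, dstH, std::vector<float>(dstW * dstH)};
    for (int y = 0; y < dstH; ++y)
        for (int x = 0; x < dstW; ++x) {
            float sx = (x + 0.5f) * src.w / dstW - 0.5f;
            float sy = (y + 0.5f) * src.h / dstH - 0.5f;
            int x0 = (int)std::floor(sx), y0 = (int)std::floor(sy);
            float fx = sx - x0, fy = sy - y0;
            float top = src.at(x0, y0) * (1 - fx) + src.at(x0 + 1, y0) * fx;
            float bot = src.at(x0, y0 + 1) * (1 - fx) + src.at(x0 + 1, y0 + 1) * fx;
            dst.px[y * dstW + x] = top * (1 - fy) + bot * fy;
        }
    return dst;
}

// Stage 2: sharpen (a basic unsharp mask): push each pixel away from its
// local average. Crank 'sharpness' too high and you get the halos and
// crushed detail people complain about with over-sharpened output.
Image sharpen(const Image& src, float sharpness) {
    Image dst = src;
    for (int y = 0; y < src.h; ++y)
        for (int x = 0; x < src.w; ++x) {
            float blur = (src.at(x - 1, y) + src.at(x + 1, y) +
                          src.at(x, y - 1) + src.at(x, y + 1)) * 0.25f;
            float v = src.at(x, y) + sharpness * (src.at(x, y) - blur);
            dst.px[y * src.w + x] = std::clamp(v, 0.0f, 1.0f);
        }
    return dst;
}
```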


Personally I'll still use DLSS over FSR/NIS where possible.
 
Did they get rid of the actual sharpening feature in the driver? This NIS isn't doing the same job as the sharpening option did before.
 
So basically Nvidia now has its own FSR for non-tensor cards and DLSS for tensor cards.

I'm still gonna use DLSS, obviously, as it's the better option, and also because NIS is a driver feature, which means I can't tweak anything in game. They say they've added it to their in-game overlay, but I ain't installing that rubbish software. At least it's open source; I'll try NIS if developers implement it directly into their games, like they have to do with FSR.
 
I'm going to go out on a limb here and say that Nvidia introduced this because they are going to remove tensor cores from certain GPU models. So high end will have tensor cores, low end will not.

Or this could be a contingency if they are unable to get power under control for these next generation of cards. Chop off the tensor cores and then pretend like DLSS was never a thing.

I don't see why a dev would use this over FSR, since FSR also works on consoles. NIS also goes against the whole Nvidia marketing campaign about DLSS being good because of motion vector data or whatever it's called.

Or this could just be the g-sync compatible thing all over again. Regardless I can smell the fear. Competition is back baby.
 
I'm going to go out on a limb here and say that Nvidia introduced this because they are going to remove tensor cores from certain GPU models. So high end will have tensor cores, low end will not.

Or this could be a contingency if they are unable to get power under control for these next generation of cards. Chop off the tensor cores and then pretend like DLSS was never a thing.

I don't see why a dev would use this over FSR, since FSR also works on consoles. NIS also goes against the whole Nvidia marketing campaign about DLSS being good because of motion vector data or whatever it's called.

Or this could just be the g-sync compatible thing all over again. Regardless I can smell the fear. Competition is back baby.

1st point could be very valid.

Given the amount of work they have done for DLSS (both on the technical front and from a PR POV), getting it adopted into game engines, the open source SDK etc., DLSS isn't going anywhere imo. The results/comparisons against your generic upscaler, i.e. FSR and NIS, show why we still need DLSS-style tech.

FSR/NIS are good for what they were designed for, i.e. a vast adoption rate and people with lower end/older GPUs or ones that just don't have the hardware requirements. I have yet to test NIS, but from my experience with FSR so far, the reason people think FSR looks good is the over-sharpening, which initially does look better than a blurry/smeary AA mess, but over time you can see its shortcomings. It kind of reminds me of people who like things like the "vivid" preset mode on their TV/LCD and sweetfx/redux mods, i.e. blown out highlights and shadows/blacks being crushed.

Given NIS is open source and the code is also up on GitHub, there is no reason why developers can't take it and use it; essentially Nvidia have done what AMD did: here you go, use as you please. The main possible advantage with NIS, though, is that Nvidia have improved upon FSR by letting people enable it through the drivers for "any" game. However, if it is anything like the 3rd party mods where FSR is being forcefully injected, chances are it won't be great and will bring its own fair share of problems.

How many console games are using FSR? It seems like most console games would still rather use adaptive resolution techniques, or, for PS5 exclusives, checkerboarding.
 
This all comes across as a bit odd to me. Not long ago Nvidia labelled FSR as poor, claiming they had already been doing this within the driver for years.

I also remember a similar incident with G-Sync and FreeSync. Nvidia hit back with how bad FreeSync was, only to later go on to actually support it. The G-Sync module is now basically dead.

The most important thing for these companies is cost, and RTX hardware definitely adds to the cost of making these GPUs.
I wouldn't be surprised if you start to see fewer GPUs with tensor cores released tbh.

It's more than likely you will see a switch-up to get the cost down.

At the end of the day it's the mainstream that makes the gaming world go round, and high-end GPUs are not that.
 
I'm going to go out on a limb here and say that Nvidia introduced this because they are going to remove tensor cores from certain GPU models. So high end will have tensor cores, low end will not.

Or this could be a contingency if they are unable to get power under control for these next generation of cards. Chop off the tensor cores and then pretend like DLSS was never a thing.

I don't see why a dev would use this over FSR, since FSR also works on consoles. NIS also goes against the whole Nvidia marketing campaign about DLSS being good because of motion vector data or whatever it's called.

Or this could just be the g-sync compatible thing all over again. Regardless I can smell the fear. Competition is back baby.

I think NIS is just there to take the wind out of AMD's FSR sails. The same thing happened with sharpening, integer scaling and Fast Sync.
 
Did they get rid of the actual sharpening feature in the driver? This NIS isn't doing the same job as the sharpening option did before.
The old sharpening settings with the two different sliders are gone. You also can't just enable it on a per-game basis any more from the control panel. You have to toggle it on globally and then can adjust the level of sharpening for each game.
 
So I've just tried NIS in Battlefield 4. I play at 4K 120Hz and was getting a steady 118 fps before. With NIS my fps is jumping around between 135-175, so quite a major boost in performance, and there's no noticeable difference in image quality.

The major downside for me, though, is having to use it in full screen, as I prefer to play in borderless window. Also, sometimes my screen is cropped and I have 2 large black bars down either side. This seems to happen randomly.

However, in Battlefield 1 my fps actually went down slightly, and the game was unplayable due to crashing to desktop every 10 seconds, no matter how many times I restarted it.

NIS still seems to be quite in the beta stage, and even if you intend to use it through GeForce Experience (as opposed to the control panel) you have to download the experimental features update. I suspect Nvidia has rushed this tech out due to AMD's FSR and Intel's upcoming iteration.

I'll try some other non-DLSS games at the weekend.
 
Lossless Scaling on Steam has added it, I think. That uses borderless window to do its thing.

Yes, I noticed it via DSOG's cheeky article - https://www.dsogaming.com/news/amd-gpu-owners-can-now-use-nvidias-image-scaling-tech-nis/

AMD GPU owners can now use NVIDIA’s Image Scaling Tech (NIS)
November 19, 2021, John Papadopoulos

A few days ago, we informed you about the new algorithm for NVIDIA’s Image Scaling tech, NIS. This tech was available to all NVIDIA owners and according to the green team, it produces better results than AMD’s FSR. So, good news everyone as AMD GPU owners can now use NVIDIA’s tech via Lossless Scaling 1.6.

For those unaware, the primary focus of Lossless Scaling is to make a game window borderless and scale it to fullscreen using integer multipliers. As such, the output image maintains its original clarity and integrity. This is ideal for playing emulators. Additionally, players can enable fullscreen anti-aliasing in the app even if the game has no such option.

Its latest version, 1.6.0, adds support for NVIDIA Image Scaling. By using Lossless Scaling, everyone can use NVIDIA's tech, and yes, this includes AMD GPU owners.

The downside here is that Lossless Scaling is not a free program/tool. While it’s not expensive (it’s only $4), we are certain that some AMD users may be put off by its price. After all, most of us want free stuff.

But anyway, those interested can purchase it from Steam.

Have fun!
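
Side note on the "integer multipliers" bit the article mentions: it just means every source pixel becomes an exact k×k block of identical pixels, so nothing gets blended and nothing goes blurry, which is why it suits emulators so well. A minimal sketch of the idea (my own toy code, not the actual Lossless Scaling implementation):

```cpp
#include <vector>

// Integer (nearest-neighbour) scaling: each source pixel is copied into an
// exact k*k block, so the output keeps the source's original clarity.
std::vector<int> integerScale(const std::vector<int>& src, int w, int h, int k) {
    std::vector<int> dst(static_cast<size_t>(w) * k * h * k);
    for (int y = 0; y < h * k; ++y)
        for (int x = 0; x < w * k; ++x)
            dst[y * (w * k) + x] = src[(y / k) * w + (x / k)]; // no blending
    return dst;
}
```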

A little awkward for HUB after their tweet -

Clearly they still have a bone to pick with Nvidia and are targeting DLSS. :cry: It's funny how quickly they forget when they got banned by Nvidia, and then they wonder why and cry wolf. :rolleyes:

 
Nvidia might be a bit ahead with upscaling technology, but they're really lagging behind in customer support by not copying AMD. Sure, NIS might be really good and better than FSR, but who the hell wants to buy an Nvidia card today and not get DLSS/ray tracing? Isn't that the primary selling point of those cards now?

Moreover, I don't see how this improves the market or industry at all. Why not cooperate with AMD and Intel to create a standard upscaler that's good enough, that works across the board on PC and console? Why can't we have that from the "industry leader" in consumer graphics?
 
Driver-side NIS will never be better than in-game FSR support. This also goes for FSR being forced into games.

The reason FSR requires game support is so it can be inserted at the right point in the rendering pipeline. Failing that, you just end up upscaling the whole image, including the UI.
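
To make that concrete, here's a hypothetical sketch of the ordering difference (stub names, not any real engine or driver API). With proper game integration the upscale sits between the 3D pass and the UI composite; a driver-level scaler only ever sees the finished frame:

```cpp
// All types and functions below are stand-ins, just to show the ordering.
struct Frame { int w, h; };

Frame renderScene(int w, int h) { return {w, h}; }         // 3D pass
Frame upscaleFrame(Frame, int w, int h) { return {w, h}; } // spatial upscaler (FSR/NIS)
void drawUI(Frame&) {}                                     // HUD/text composite
void present(const Frame&) {}

// With game support: UI is drawn after the upscale, at native res, so it stays sharp.
void frameWithGameIntegration() {
    Frame f = renderScene(1280, 720);  // render the 3D scene at reduced resolution
    f = upscaleFrame(f, 2560, 1440);   // upscale the 3D image only
    drawUI(f);                         // UI rendered at native resolution
    present(f);
}

// Driver-side (or force-injected): the scaler sees the whole finished frame,
// so the UI gets upscaled (and blurred/sharpened) along with everything else.
void frameDriverLevel() {
    Frame f = renderScene(1280, 720);
    drawUI(f);                         // UI drawn into the low-res frame
    present(f);                        // driver upscales the entire frame, UI included
}
```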
 
Moreover, I don't see how this improves the market or industry at all. Why not cooperate with AMD and Intel to create a standard upscaler that's good enough, that works across the board on PC and console? Why can't we have that from the "industry leader" in consumer graphics?

Only Nvidia cards (with the necessary hardware) can support AI upscaling without impacting rasterization; how much of an impact, I'm not sure, as we can't test anything on the RDNA2 cards. Intel plan on releasing their card with dedicated hardware for AI upscaling, similar to Nvidia, as well as their AI upscaler, which I believe uses DirectML and should hopefully become available on all capable cards.

I understand why they don't support older cards with DLSS, like the 1060, because it doesn't have the internals to do it well enough, similar to the RDNA2 cards (though I'm sure they'd fare better as they're a much newer architecture).
 
I'm going to go out on a limb here and say that Nvidia introduced this because they are going to remove tensor cores from certain GPU models. So high end will have tensor cores, low end will not.

Tensor cores take up room that could go to mining cores - so they'll get rid of them like you say. There's higher demand from miners than from gamers. Miners pay more. Products get manufactured for whoever's the most valuable customer. :D
 
The old sharpening settings with the two different sliders are gone. You also can't just enable it on a per-game basis any more from the control panel. You have to toggle it on globally and then can adjust the level of sharpening for each game.

Apparently it's still there, but you have to go into "Adjust desktop size and position" and enable "Integer scaling". Then "Image sharpening" replaces NIS in the 3D settings menu.
 
Tensor cores take up room that could go to mining cores - so they'll get rid of them like you say. There's higher demand from miners than from gamers. Miners pay more. Products get manufactured for whoever's the most valuable customer. :D
I'm certain some intrepid programmer will come along and figure out how to mine on tensor cores :D
 