
** The AMD Navi Thread **

Discussion in 'Graphics Cards' started by subbytna, Aug 20, 2017.

  1. Troezar

    Mobster

    Joined: Aug 6, 2009

    Posts: 3,163

    I've always been suspicious of the language used. Pixels are physical objects, no amount of software tinkering will give you more of them. Reminds me of downloading more RAM ;) It's more like tuning the settings on your TV to make the image look better.
     
  2. Martini1991

    Caporegime

    Joined: Sep 18, 2009

    Posts: 27,297

    Location: Dormanstown.

    There is *upscaling* going on, as they're running a 2560x1440 image full screen on a 4K screen. That's upscaling.
    But the word upscaling is bandied about as something it's not.

    Opening up a 1920x1080 JPG on a 2560x1440 screen and stretching it to fill the 2560x1440 screen is also upscaling.
     
  3. Grim5

    Hitman

    Joined: Feb 6, 2019

    Posts: 953

    But it will continue

    "Your pixels are just pixels and software can't make pixels out of thin air" is true

    And actually the exact same thing happens with headphones and speakers.

    Everyone and their cow claims they can emulate surround sound - uh no, speakers are physical objects and no amount of software can replace that
     
  4. D.P.

    Caporegime

    Joined: Oct 18, 2002

    Posts: 29,966


    Exactly, you can increase the contrast, saturation, white-balance to make the image look "better" for some personal definition of better. But the pixel-count remains the same.


    Open an image in Photoshop/GIMP, apply a sharpening filter, and hey presto, you have just done what AMD's and Nvidia's sharpening filters do. The image resolution is exactly the same. The edges have more contrast and appear sharper. If you want to increase the resolution then you will have to upscale the image, but AMD's and Nvidia's sharpening doesn't do that. Only Nvidia's DLSS does that, and it does it in a way that is much smarter than simple mathematical transforms like bilinear interpolation.

    AMD is trying to market sharpening as an equivalent of DLSS because AMD have no answer to DLSS. They are trying to fool potential buyers into believing there is an equivalence.
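    The point above can be illustrated with a toy example. This is a minimal, hypothetical NumPy sketch (not AMD's or Nvidia's actual filter): convolving with a classic 3x3 sharpening kernel raises edge contrast while the pixel count stays exactly the same.

```python
import numpy as np

# Stand-in "image" (tiny here; a real frame would be e.g. 2560x1440).
img = np.random.rand(8, 8)

# Classic 3x3 sharpening kernel: boosts edge contrast, nothing else.
kernel = np.array([[ 0, -1,  0],
                   [-1,  5, -1],
                   [ 0, -1,  0]], dtype=float)

def sharpen(image, k):
    """Convolve with a 3x3 kernel; the output has the same pixel count."""
    h, w = image.shape
    pad = np.pad(image, 1, mode="edge")  # replicate borders
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(pad[y:y + 3, x:x + 3] * k)
    return out

sharp = sharpen(img, kernel)
assert sharp.shape == img.shape  # resolution is unchanged
```

    The kernel weights sum to 1, so flat regions are untouched; only edges get extra contrast, which is why a sharpened image "looks" more detailed without gaining a single pixel.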
     
  5. bigmike20vt

    Wise Guy

    Joined: May 23, 2006

    Posts: 1,474

    Why on earth would you want freesync and DP to die off? Makes no sense.
    Gsync... Ok, I guess I can see that (tho personally I am ok with that too as a premium product, tho I believe Freesync or equivalent should always be fully supported as the base/royalty-free one as well). But why should companies be forced to pay the royalties for HDMI? That is just adding expense for the sake of it.
    Are there fees involved with VR port? If not then, due to the size of them, maybe that will take off and we can just have 4 of them on a card. Barring that, I would be ok with all DPs on a GPU myself. Tho whilst the convenience of 1x HDMI is nice I must admit, adapters are cheap and ultimately it is us who are paying the licensing fees. Hardly fair for those who don't need it.

    Btw I ditched a monitor for PC gaming many, many years ago, but you are fooling yourself if you don't think a gaming monitor is better across the board for gaming, apart from size. For me I can make the sacrifice. I love my 65/70 inch gaming TVs with 3D support for the rare time it is supported... But I am not a competitive gamer. If I was, the response time and refresh rates would simply not cut it.
     
    Last edited: Jun 13, 2019
  6. D.P.

    Caporegime

    Joined: Oct 18, 2002

    Posts: 29,966



    When you output say a 1080p image to a 1440p monitor, the monitor upscales to 1440p in a pretty bad way using a basic transform.


    Here are some examples to highlight what DLSS can really do. When you rely on a simple mathematical transform to upscale, which is what monitors do, you end up with an image like the bicubic or bilinear upscaling. If you use machine learning, then statistical properties of image pixel data distributions are used that can intelligently replicate lost detail and give a final image that is far closer to the original native resolution.
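    For reference, the kind of "simple mathematical transform" a monitor's scaler applies can be sketched in a few lines. This is a hypothetical pure-NumPy bilinear upscaler, not any vendor's actual scaler; note that it only interpolates between existing pixels and cannot recover lost detail.

```python
import numpy as np

def bilinear_upscale(image, new_h, new_w):
    """Naive bilinear upscale: blend the four nearest source pixels.
    This is the basic transform a monitor/GPU scaler uses."""
    h, w = image.shape
    ys = np.linspace(0, h - 1, new_h)          # target row coords in source
    xs = np.linspace(0, w - 1, new_w)          # target col coords in source
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]                    # vertical blend weights
    wx = (xs - x0)[None, :]                    # horizontal blend weights
    top = image[np.ix_(y0, x0)] * (1 - wx) + image[np.ix_(y0, x1)] * wx
    bot = image[np.ix_(y1, x0)] * (1 - wx) + image[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# "1080p" stand-in stretched to "1440p" proportions (tiny arrays here).
low = np.arange(16, dtype=float).reshape(4, 4)
up = bilinear_upscale(low, 6, 6)
assert up.shape == (6, 6)
```

    Every output pixel is a weighted average of existing pixels, which is exactly why naive upscaling looks soft: no new information is created, unlike a trained super-resolution model.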

    Nvidia's DLSS is using something similar to the SRGAN models.
    [Image comparisons: bicubic/bilinear upscaling vs machine-learning super-resolution]
     
  7. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,525

    Location: London

    Fool buyers? Pretty much what Nvidia are doing with DLSS then? So far whatever the AI is supposed to be doing is making little to no difference from what I last saw. Lol.
     
  8. Satchfanuk

    Gangster

    Joined: Jun 24, 2016

    Posts: 380

    Location: Norfolk

    So, from this statement, I assume this applies to anything involving DLSS, meaning a 4k DLSS image is not 4k because it wasn't rendered at that resolution. And the same for any other resolution with DLSS on. :D
     
  9. shankly1985

    Capodecina

    Joined: Nov 25, 2011

    Posts: 18,876

    Location: The KOP

    Well maybe AMD have found a way to make this work better? Just like AMD found a better way with adaptive sync?

    I know that might be hard for you to believe at first, but just hang on, July 7th isn't far away.

    And again, for the record, I am not saying this is better than DLSS, I am just echoing what the Internet is saying.

    Just before you take that reply out of context.
     
  10. shankly1985

    Capodecina

    Joined: Nov 25, 2011

    Posts: 18,876

    Location: The KOP

    Exactly lol. Like I said, it's all faked, so why does it matter what method is used if the final outcome is an image that looks better than what you're running natively?
     
  11. TNA

    Capodecina

    Joined: Mar 13, 2008

    Posts: 10,525

    Location: London

    What I would like to see is AMD to use this and Nvidia to use their DLSS to make 4K look better, rather than just lower resolutions.
     
  12. Martini1991

    Caporegime

    Joined: Sep 18, 2009

    Posts: 27,297

    Location: Dormanstown.

    I'm almost positive that is how RIS can be used though.
    Sharpening my native render.
     
  13. shankly1985

    Capodecina

    Joined: Nov 25, 2011

    Posts: 18,876

    Location: The KOP

    That goes against what it was designed for though. It's to get you a better image without the performance cost of that native image.

    I might be wrong here, but I think you could actually use AMD's sharpening on a 4K image.
     
  14. Martini1991

    Caporegime

    Joined: Sep 18, 2009

    Posts: 27,297

    Location: Dormanstown.

    There's some of that already.
    In CCC you can make the GPU do the upscaling.
    Either way, upscaling is being done, but the context of the word is being muddied by some.
     
  15. Killer7xx

    Gangster

    Joined: Dec 27, 2008

    Posts: 363

    We probably won't see the 5800/5900 until the high end GPUs launch next year and those GPUs might even be next-gen instead of Navi.
     
  16. Besty

    Mobster

    Joined: Oct 18, 2002

    Posts: 2,961

    Until AMD come up with a response to the faux-Gold plated Titan....
     
  17. DragonQ

    Soldato

    Joined: Jun 13, 2009

    Posts: 6,478

    I hate this effect on TVs and am sure I will hate it on GPUs.
     
  18. shankly1985

    Capodecina

    Joined: Nov 25, 2011

    Posts: 18,876

    Location: The KOP

    GPU scaling and display scaling have been around for years.
    AMD is doing something different with this old technique to improve the image quality.

    Looking at this from outside the box, it looks like AMD have found another way to get something similar to DLSS by taking something that has been around for years and improving it.

    It's like Freesync all over again.
     
    Last edited: Jun 13, 2019
  19. D.P.

    Caporegime

    Joined: Oct 18, 2002

    Posts: 29,966


    No, they are simply sharpening an image. If you send a 1440p image to a 1440p monitor then no scaling happens; if you apply AMD's or Nvidia's sharpening then you increase edge contrast and the image will look a little sharper. If you only render at 1080p then either the screen or GPU can stretch that to 1440p, but the sharpening is not doing any upscaling. At best, AMD's sharpening can be applied after the internal scaler (some lame bicubic scaler), which will counteract some of the softness you get with naive transformation methods. But this still has absolutely nothing to do with super resolution and just uses the same old scaling methods that have been around since the first digital screens and TVs.

    I don't blame you for being confused. AMD have purposely tried to be confusing here, and have misleadingly compared their sharpening to DLSS on purpose because AMD have no answer to DLSS. AMD should be comparing it against Nvidia's sharpening filter and highlighting any differences. Anything AMD is doing with sharpening, Nvidia has either been doing for some time or could implement in drivers very quickly.

    But in any case it doesn't at all replace what DLSS does. If anything, more advanced sharpening filters are more useful after applying DLSS to upscale an image, by increasing sharpness. If people are complaining about the softness of DLSS then a sharpening filter is a simple remedy, already in Nvidia's drivers.
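    The "sharpen after the scaler softens things" remedy is essentially a classic unsharp mask. A rough, hypothetical NumPy sketch (not Nvidia's or AMD's actual implementation): subtract a blurred copy and add the difference back, boosting edge contrast without adding pixels.

```python
import numpy as np

def box_blur(image):
    """3x3 box blur with edge padding (stands in for scaler softness)."""
    pad = np.pad(image, 1, mode="edge")
    h, w = image.shape
    return sum(pad[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

def unsharp_mask(image, amount=0.8):
    """sharpened = image + amount * (image - blurred).
    Raises edge contrast relative to a blurred copy; no new pixels."""
    return image + amount * (image - box_blur(image))

soft = np.random.rand(8, 8)   # e.g. a frame softened by naive upscaling
crisp = unsharp_mask(soft)
assert crisp.shape == soft.shape  # still the same resolution
```

    Flat regions equal their own blur, so they pass through untouched; only edges, where the image differs from its blur, get boosted. That is why this counteracts softness but cannot restore detail the scaler never had.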
     
  20. shankly1985

    Capodecina

    Joined: Nov 25, 2011

    Posts: 18,876

    Location: The KOP

    Listen DP,
    You're confused here lol

    We are not talking about using the native res. In AMD's demo it was running at 1440p on a 4K display, upscaled on the GPU and then sharpened to make the image look better.

    It's not rocket science. I do like how you keep adding Nvidia into this? Damage control at its best. Nvidia doesn't do what AMD is doing here.

    There is clearly more at work here and we will know more come July 7th.

    Question: why does it only work on DX12 and Vulkan if all it does is basic sharpening? It's also locked to Navi.