
** The AMD Navi Thread **

Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
G-Sync and FreeSync need to die off, as does DisplayPort.

What the industry desperately needs is for everyone to move ASAP to HDMI 2.1 and for everyone to support HDMI Forum VRR,
and to dump DisplayPort, dump G-Sync and dump FreeSync.

DP doesn't have royalty fees. HDMI does, and 2.1 (incl. VRR) adds fees on top of the other HDMI fees.
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
AMD is in the best shape it has been since the ATi acquisition in 2006. Look at their debt now; it's only $1B:
[chart: AMD debt-to-equity ratio]

https://www.macrotrends.net/stocks/charts/AMD/amd/debt-equity-ratio

It makes perfect sense for them to start trying harder to regain the performance crown, which will lead to a better reception from customers with regard to their ambition of being a "premium" brand and charging more for their products.
They can't be a premium brand if their top product only equals the competition's mainstream parts, right?

It takes 4+ years to develop a new GPU, so Navi must have started development back in 2014, given that it was somewhat ready in 2018.

That video was pretty damn scary, as the ending said that whatever new GPUs come out next, they will do anything they can to make you want them... Super...

That video came out 4 months before Turing... And we all see today how Ngreedia operates even when there is competition. At the mid-range there was competition all along, yet that didn't stop NV from selling worse products at higher prices, as recently as the GTX 1650. Ofc, many forgot the GTX 780 Ti and Titan Black: both more expensive and slower than the 290X 8GB, yet they outsold it by a big margin.
I still remember the NV fans' argument about "energy savings". But when you run the maths on their electricity bills, they would need to keep running the 780 Ti for another 20 years (as of now, 5.5 years later), and the Titan Black for 35 years, just to break even on the extra money paid over the 290X (rough numbers sketched below).
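A back-of-the-envelope version of that break-even sum, in Python. Every number below is an illustrative assumption (hypothetical price premium, power draw, usage and tariff), not a figure from this thread:

```python
# Break-even sketch for the "energy savings" argument.
# All inputs are assumptions for illustration only.
price_premium_gbp = 150       # assumed extra cost of a 780 Ti over a 290X
power_delta_w     = 60        # assumed average power saving while gaming
hours_per_day     = 3         # assumed daily gaming time
gbp_per_kwh       = 0.15      # assumed electricity tariff

kwh_saved_per_year  = power_delta_w / 1000 * hours_per_day * 365
saving_per_year     = kwh_saved_per_year * gbp_per_kwh
years_to_break_even = price_premium_gbp / saving_per_year
print(f"Break-even after {years_to_break_even:.1f} years")  # ~15 years here
```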

Ofc, hypocrisy at its best.

FYI, before someone calls me an AMD shill again: back in Autumn 2013 I got a GTX 780, coming from a GTX 580 I'd had for a few years.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
Ok, just a thought considering @Gibbo's latest GPU deals.

This is obviously to clear stock of current cards in preparation for the incoming new range. But, and it's a big but, why does it seem that AMD are clearing everything, including the Radeon VII?

Surely the VII still has a place as their top product. Unless there is more to come from NAVI?

It did seem strange to me that the launch was of a singular 'series', the 5700/XT. Normally there would be more to a launch than that. Is there a possibility that AMD were holding back on any other tier NAVI cards which will appear before too much longer? Surely two cards aren't replacing AMD's entire graphics stack?



The Radeon VII is end-of-life; production stopped some time ago. As was rumoured at release time, they only produced about 40K units and that was it. Just a PR stunt.

AMD make a loss on every one sold, so it's time to stop funding the PR.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
Watch the AMD video above, DP, for god's sake lol. The guy clearly talks about upscaling the image from 1440p to 4K.


It is not up-scaling though, it is sharpening. You need to look past the marketing buzzwords and actually look at the facts.

Sharpening a 1440p image still gives you a 1440p image.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
No, I have said multiple times now that we just have to wait and see come July 7th. All I have been posting is information from the internet. How good AMD's feature is, I don't know! So don't try to move the goalposts with me.
In the video he clearly says the demo is running at 1440p and upscaled to 4K. I have also told you this! You need a native res for this to work, so in my case 1440p: I could run a game at 1080p and use this RIS to get a 1440p image for the cost of a 1080p one.

And for the record, I got the 1080p-to-1440p claim from the website write-up I posted. AMD's own words!



No, that is not at all what is happening. If you want a 1440p image then you will have to render at 1440p.


Sharpening is not upscaling. If there is any upscaling happening then it isn't because of AMD's sharpening.
 
Soldato
Joined
6 Aug 2009
Posts
7,070
It is not up-scaling though, it is sharpening. You need to look past the marketing buzzwords and actually look at the facts.

Sharpening a 1440p image still gives you a 1440p image.

I've always been suspicious of the language used. Pixels are physical objects, no amount of software tinkering will give you more of them. Reminds me of downloading more RAM ;) It's more like tuning the settings on your TV to make the image look better.
 
Caporegime
Joined
18 Sep 2009
Posts
30,097
Location
Dormanstown.
There is *upscaling* going on, as they're running a 2560x1440 image full screen on a 4K screen. That's upscaling.
But the word upscaling is bandied about as something it's not.

Opening up a 1920x1080 JPG on a 2560x1440 screen and stretching it to fill the 2560x1440 screen is also upscaling.
 
Soldato
Joined
6 Feb 2019
Posts
17,464
I've always been suspicious of the language used. Pixels are physical objects, no amount of software tinkering will give you more of them. Reminds me of downloading more RAM ;) It's more like tuning the settings on your TV to make the image look better.

But it will continue.

It's true that your pixels are just pixels and software can't make pixels out of thin air.

And actually the exact same thing happens with headphones and speakers.

Everyone and their cow claims they can emulate surround sound. Uh, no: speakers are physical objects and no amount of software can replace that.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
I've always been suspicious of the language used. Pixels are physical objects, no amount of software tinkering will give you more of them. Reminds me of downloading more RAM ;) It's more like tuning the settings on your TV to make the image look better.


Exactly, you can increase the contrast, saturation or white balance to make the image look "better" for some personal definition of better. But the pixel count remains the same.


Open an image in Photoshop/GIMP, apply a sharpening filter, and hey presto, you have just done what AMD's and Nvidia's sharpening filters do. The image resolution is exactly the same; the edges have more contrast and appear sharper. If you want to increase the resolution then you will have to upscale the image, but AMD's and Nvidia's sharpening doesn't do that. Only Nvidia's DLSS does that, and it does it in a way that is much smarter than simple mathematical transforms like bilinear interpolation (see the sketch below).
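To make that concrete, a minimal sketch using Pillow (pip install Pillow); the filename and filter settings are placeholders, not anything from AMD's or Nvidia's drivers:

```python
# Sharpening an image does not change its resolution.
from PIL import Image, ImageFilter

img = Image.open("frame.png")  # e.g. a 2560x1440 render (placeholder file)
sharp = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150))

# Edge contrast has increased, but the pixel count is untouched.
assert sharp.size == img.size
print(img.size, "->", sharp.size)  # same width x height before and after
```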

AMD is trying to market sharpening as an equivalent of DLSS because AMD have no answer to DLSS. They are trying to fool potential buyers into believing there is an equivalence.
 
Soldato
Joined
23 May 2006
Posts
6,722
So we can have more proprietary crap that fragments the industry? That's worked out so well for PCs; it's partly why a $2,000 PC monitor can't even come close to a $1,000 TV in image quality.
Why on earth would you want FreeSync and DP to die off? Makes no sense.
G-Sync... OK, I guess I can see that (though personally I am OK with that too as a premium product, provided FreeSync or an equivalent is always fully supported as the base/royalty-free option). But why should companies be forced to pay the royalties for HDMI? That is just adding expense for the sake of it.
Are there fees involved with the VR port (VirtualLink)? If not then, given how small they are, maybe that will take off and we can just have four of them on a card. Barring that, I would be OK with all DPs on a GPU myself... Though whilst the convenience of 1x HDMI is nice, I must admit, adapters are cheap and ultimately it is us who are paying the licensing fees. Hardly fair on those who don't need it.

Btw, I ditched monitors for PC gaming many, many years ago, but you are fooling yourself if you don't think a gaming monitor is better across the board for gaming, apart from size. For me, I can make the sacrifice: I love my 65/70-inch gaming TVs with 3D support for the rare time it is supported... But I am not a competitive gamer. If I was, the response times and refresh rates simply would not cut it.
 
Caporegime
Joined
18 Oct 2002
Posts
32,615
There is *upscaling* going on, as they're running a 2560x1440 image full screen on a 4K screen. That's upscaling.
But the word upscaling is bandied about as something it's not.

Opening up a 1920x1080 JPG on a 2560x1440 screen and stretching it to fill the 2560x1440 screen is also upscaling.



When you output, say, a 1080p image to a 1440p monitor, the monitor upscales it to 1440p in a pretty bad way using a basic transform.


Here are some examples to highlight what DLSS can really do. When you rely on a simple mathematical transform to upscale, which is what monitors do, you end up with an image like the bicubic or bilinear upscaling. If you use machine learning, statistical properties of image pixel data distributions can be used to intelligently replicate lost detail and give a final image that is far closer to the original native resolution.

Nvidia's DLSS is using something similar to the SRGAN models.
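For contrast with the sharpening sketch above, here is what a plain mathematical upscale looks like with Pillow; again the filenames are placeholders, and this only illustrates the basic transforms, not DLSS itself:

```python
# A simple resample: roughly what a monitor's scaler does.
from PIL import Image

src = Image.open("render_1440p.png")  # 2560x1440 input (placeholder file)
up_bilinear = src.resize((3840, 2160), Image.BILINEAR)
up_bicubic = src.resize((3840, 2160), Image.BICUBIC)

# Both outputs now have 3840x2160 pixels, but the new pixels are only
# interpolated from their neighbours; no lost detail is recovered.
# Recovering plausible detail is where a learned model (e.g. an SRGAN)
# differs from these fixed transforms.
print(src.size, "->", up_bicubic.size)
```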


 

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,176
Location
Greater London
AMD is trying to market sharpening as an equivalent of DLSS because AMD have no answer to DLSS. They are trying to fool potential buyers into believing there is an equivalence.

Fool buyers? Pretty much what Nvidia are doing with DLSS then? From what I last saw, whatever the AI is supposed to be doing is making little to no difference. Lol.
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
When you output, say, a 1080p image to a 1440p monitor, the monitor upscales it to 1440p in a pretty bad way using a basic transform.


Here are some examples to highlight what DLSS can really do. When you rely on a simple mathematical transform to upscale, which is what monitors do, you end up with an image like the bicubic or bilinear upscaling. If you use machine learning, statistical properties of image pixel data distributions can be used to intelligently replicate lost detail and give a final image that is far closer to the original native resolution.

Nvidia's DLSS is using something similar to the SRGAN models.



Well, maybe AMD have found a way to make this work better? Just like AMD found a better way with adaptive sync?

I know that might be hard for you to believe at first, but just hang on; July 7th isn't far away.

And again, for the record, I am not saying this is better than DLSS; I am just echoing what the Internet is saying.

Just saying that before you take this reply out of context.
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
So, from this statement, I assume this applies to anything involving DLSS, meaning a 4k DLSS image is not 4k because it wasn't rendered at that resolution. And the same for any other resolution with DLSS on. :D

Exactly lol. Like I said, it's all faked, so why does it matter what method is used if the final outcome is an image that looks better than what you're running natively?
 
Soldato
Joined
25 Nov 2011
Posts
20,639
Location
The KOP
What I would like to see is AMD use this, and Nvidia use their DLSS, to make 4K look better, rather than just lower resolutions.

That goes against what they designed it for, though. It's there to get you a better image without the performance cost of rendering natively.

I might be wrong here, but I think you could actually use AMD's sharpening on a 4K image.
 
Caporegime
Joined
18 Sep 2009
Posts
30,097
Location
Dormanstown.
When you output, say, a 1080p image to a 1440p monitor, the monitor upscales it to 1440p in a pretty bad way using a basic transform.


Here are some examples to highlight what DLSS can really do. When you rely on a simple mathematical transform to upscale, which is what monitors do, you end up with an image like the bicubic or bilinear upscaling. If you use machine learning, statistical properties of image pixel data distributions can be used to intelligently replicate lost detail and give a final image that is far closer to the original native resolution.

Nvidia's DLSS is using something similar to the SRGAN models.



In CCC you can make the GPU do the upscaling instead.
Either way, upscaling is being done, but the context of the word is being muddied by some.
 