
AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Status: Not open for further replies.
If you expect 30% for the same money, you have very low expectations. Generational jumps used to be 40~70% faster, and that's not counting the recent rubbish ones. Anything under 40% is useless.



You make complete sense; it's just that people newer to all of this are used to getting ripped off, like some sort of Stockholm syndrome.

The real question is what is possible. If 30% is all they can deliver and it cost more in R&D than the previous generation, then that's fine. If they could deliver 100% at half the cost but only give us 50% at an increased cost, then that isn't OK. It isn't necessarily linear, and without a lot more information we can't know.
 
The real question is what is possible. If 30% is all they can deliver and it cost more in R&D than the previous generation, then that's fine. If they could deliver 100% at half the cost but only give us 50% at an increased cost, then that isn't OK. It isn't necessarily linear, and without a lot more information we can't know.

30% is a bit rubbish, as in the end you could just drop a few settings and get near enough that. Something like 50% extra performance is the difference between 40FPS and 60FPS in a game, or 95FPS and 144FPS. It's very noticeable.
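Just to sanity-check those figures with a rough sketch (this assumes the game is fully GPU-bound, which is my assumption, not anything from the post):

# Quick check of the frame-rate claims above: a 50% performance uplift
# multiplies frame rate by 1.5, assuming the game is fully GPU-bound.
for base_fps in (40, 95):
    boosted = base_fps * 1.5
    print(f"{base_fps} FPS -> {boosted:.0f} FPS with a 50% uplift")
# Prints 40 -> 60 FPS and 95 -> ~142 FPS, near enough the 144Hz target mentioned.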

Nvidia is at mining-level margins.....with no mining. In the end, if they want to increase the pricing of their items way past inflation, then good for them....they can keep their products, and the same goes for AMD. As much as they are not charities, neither are consumers. If it's too expensive, then that's their problem, and they can find less expensive ways of increasing performance. If we keep making excuses, it will be like Intel, with minor progress here and there for years. Apple tried the same rubbish, and see what happened with all the cheaper competitors? Their sales started to go down, so they needed to start introducing cheaper models. Samsung tried the same, and in the end had to get off their arse too and make some decent cheaper models, as they started losing sales too.

GPU improvements are barely keeping up with game visual improvements, let alone increases in monitor resolution. This is why most people are still stuck at 1080p, as mainstream graphics cards are not increasing in performance at a fast enough rate. It used to be that the mainstream GPU of a new generation was 80% to 90% of the top GPU of the previous generation, many times (not all times). Does even an RTX 2060, which is more a lower-end high-end GPU, keep up with a GTX 1080 Ti? No, it doesn't. Did a GTX 1060 keep up with the GTX 980 Ti? No, it didn't. AMD barely keeps up, because their top end wasn't really fast either.

Now mainstream GPUs are lower-tier performance, and it's starting to really hold back PC gaming. Look at all the popular games, i.e. Fortnite and the like....they look cartoony. There is a reason for that.

Edit!!

Also think outside of richer markets such as the UK. Many games have regional pricing which can be half that of the UK. Now imagine how badly the price increases, and the slowdown in mainstream GPU performance, are affecting the rest of the PC gaming world. No wonder so many PCs on Steam and elsewhere have slow parts.
 
Last edited:
30% is a bit rubbish, as in the end you could just drop a few settings and get near enough that. Something like 50% extra performance is the difference between 40FPS and 60FPS in a game, or 95FPS and 144FPS. It's very noticeable.

Nvidia is at mining-level margins.....with no mining. In the end, if they want to increase the pricing of their items way past inflation, then good for them....they can keep their products, and the same goes for AMD. As much as they are not charities, neither are consumers. If it's too expensive, then that's their problem, and they can find less expensive ways of increasing performance. If we keep making excuses, it will be like Intel, with minor progress here and there for years. Apple tried the same rubbish, and see what happened with all the cheaper competitors? Their sales started to go down, so they needed to start introducing cheaper models. Samsung tried the same, and in the end had to get off their arse too and make some decent cheaper models, as they started losing sales too.

GPU improvements are barely keeping up with game visual improvements, let alone increases in monitor resolution. This is why most people are still stuck at 1080p, as mainstream graphics cards are not increasing in performance at a fast enough rate. It used to be that the mainstream GPU of a new generation was 80% to 90% of the top GPU of the previous generation, many times (not all times). Does even an RTX 2060, which is more a lower-end high-end GPU, keep up with a GTX 1080 Ti? No, it doesn't. Did a GTX 1060 keep up with the GTX 980 Ti? No, it didn't. AMD barely keeps up, because their top end wasn't really fast either.

Now mainstream GPUs are lower-tier performance, and it's starting to really hold back PC gaming. Look at all the popular games, i.e. Fortnite and the like....they look cartoony. There is a reason for that.

Edit!!

Also think outside of richer markets such as the UK. Many games have regional pricing which can be half that of the UK. Now imagine how badly the price increases, and the slowdown in mainstream GPU performance, are affecting the rest of the PC gaming world. No wonder so many PCs on Steam and elsewhere have slow parts.

No excuses, we just don't have all the facts. I wouldn't be surprised if they are doing a Sergey Bubka. The only way to be sure is to stop buying their products en masse; when they do their pricing, we'll know ;)
 
I personally wouldn't worry about AMD pricing on RDNA2. At a guess their lineup would be around 5700 XT pricing and probably lower than 2080 Ti pricing.
However, what is interesting is the potential increase in performance with Big Navi. With the amount of uArch changes I would be surprised if the performance isn't around 50% over the 2080 Ti.
OC potential around 2800MHz
3D memory stacking
4 SDMA engines instead of 2
HBM2e??
Higher ROP counts
80 and 80+ CUs
There is a lot to be excited about in those rumors. Just have to see how much of this is true ;)
 
The 2070 Super was launched 2 days after the 5700 XT, so no, they hadn't been out much longer.

You are quite right, I thought they were out longer. :o Unsure of the availability at the time as I was not buying, although from memory it can't have been a longstanding situation. Whenever I checked the OcUK site they were always dearer ;)
 
The real question is what is possible. If 30% is all they can deliver and it cost more in R&D than the previous generation, then that's fine. If they could deliver 100% at half the cost but only give us 50% at an increased cost, then that isn't OK. It isn't necessarily linear, and without a lot more information we can't know.

Dude, that's insane, have you not seen the margins? Current estimates say a 2080 Ti costs roughly £450 to produce; yes, you can add some R&D, overheads and profit etc., but no way does that equate to the current asking prices, even today, months before a new release. Yes, it's not linear, but neither do Nvidia, board partners and retailers have the god-given right to rip people off, which is pretty much what they are doing on the supply/demand argument.
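For what it's worth, here is the rough sum behind that claim (the £450 build cost is the estimate quoted above; the ~£1,100 UK asking price is my own assumed figure, and this ignores VAT, the retailer and board-partner cut, and R&D):

# Back-of-the-envelope gross margin on a 2080 Ti.
build_cost = 450     # GBP, the rough production estimate quoted above
asking_price = 1100  # GBP, an assumed typical UK asking price (my figure)
margin = (asking_price - build_cost) / asking_price
print(f"Implied gross margin: {margin:.0%}")  # prints roughly 59%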
 
I guess this is where we are different people, because I wouldn't accept that no matter how fast the 3080 Ti would be. The way I see it, it is only as fast as it needs to be to beat the competition, and not a megaflop more.

Could you elaborate on this? I'm not sure I understand what you are saying.
 
We will see, won't we. But it certainly coincides with my previous post ;).

It's obvious, AMD have been pushing pricing since the awful Fury X (I had a Fury once the pricing became more reasonable, good card). Matched 980 performance on launch for 980 Ti pricing, gg AMD. No longer budget indeed, just don't take the **** while you're at it :D
 
It's obvious, AMD have been pushing pricing since the awful Fury X (I had a Fury once the pricing became more reasonable, good card). Matched 980 performance on launch for 980 Ti pricing, gg AMD. No longer budget indeed, just don't take the **** while you're at it :D
I'm not locked into one brand :D
And it's 2020, not 2015. And we have not only AMD but Nvidia and next-gen consoles releasing this year, something that didn't happen in 2015. So a bit of projecting there. ;)
 
Dude, that's insane, have you not seen the margins? Current estimates say a 2080 Ti costs roughly £450 to produce; yes, you can add some R&D, overheads and profit etc., but no way does that equate to the current asking prices, even today, months before a new release. Yes, it's not linear, but neither do Nvidia, board partners and retailers have the god-given right to rip people off, which is pretty much what they are doing on the supply/demand argument.

Don't get me wrong, I'm no corporate cheerleader. My only point is that lots of figures are bandied around as fact when we don't know; it's all just guessing. Anyone here got the breakdown of the R&D costs for Ampere? I don't mean a back-of-a-fag-packet calculation, I mean official documents.
 
Could you elaborate on this? I'm not sure I understand what you are saying.

Sure, what I mean is that Nvidia (or AMD for that matter) won't create a GPU that is the fastest it can be unless it is needed to beat the competition. They will only make it as fast as they think is necessary to win in the bracket the GPU is slotted in. If there is no one to fight against, then they (the Nvidias, AMDs and Intels of the world) will only add what they think will be just enough to get consumers interested. We saw this with Turing and ray tracing, but it certainly backfired a bit, as the rasterization performance from the new generation of GPUs was lackluster. I'm fairly certain that Turing could have been better at launch; we could have gotten the Super cards as non-Super cards instead of the current non-Super cards :P. At least that is my personal view of it.
 
Sure, what I mean is that Nvidia (or AMD for that matter) won't create a GPU that is the fastest it can be unless it is needed to beat the competition. They will only make it as fast as they think is necessary to win in the bracket the GPU is slotted in. If there is no one to fight against, then they (the Nvidias, AMDs and Intels of the world) will only add what they think will be just enough to get consumers interested. We saw this with Turing and ray tracing, but it certainly backfired a bit, as the rasterization performance from the new generation of GPUs was lackluster. I'm fairly certain that Turing could have been better at launch; we could have gotten the Super cards as non-Super cards instead of the current non-Super cards :p. At least that is my personal view of it.

Definitely a possibility. Do you think it's a possibility that that was the best they could do on the older, cheap process? I.e. they stayed on it for big margins with no competition, but it was also approaching its maximum capability?
 
I'm not locked into one brand :D
And it's 2020, not 2015. And we have not only AMD but Nvidia and next-gen consoles releasing this year, something that didn't happen in 2015. So a bit of projecting there. ;)

Neither am I, fanboyism is just silly in this game ;)

As always, time will tell!! Not long to go now for these lovely new cards to arrive :cool:
 
Definitely a possibility. Do you think it's a possibility that that was the best they could do on the older, cheap process? I.e. they stayed on it for big margins with no competition, but it was also approaching its maximum capability?
Sure, Nvidia most likely have been near the edge of what the 12nm FinFET process would allow for, but there is more to it than just that, I feel. For example, the RTX 2080 was launched using a cut-down 104 die instead of the full "fat" version. Why cut the die down when the xx80 is basically the highest-performing card using a 104 die? Because they didn't need the performance at the time of launch, and should AMD somehow manage to counter it, they had the full-fat 104 die to fall back on. We saw this when the RX 5700 XT launched. To me, this makes sense from a business perspective. Companies are not charities, even when they want to be seen as such. It's about money and securing the future for more sales and more money. To me, this seems to be the logical conclusion, but of course I can be wrong :).
 
Sure, Nvidia most likely have been near the edge of what the 12nm FinFET process would allow for, but there is more to it than just that, I feel. For example, the RTX 2080 was launched using a cut-down 104 die instead of the full "fat" version. Why cut the die down when the xx80 is basically the highest-performing card using a 104 die? Because they didn't need the performance at the time of launch, and should AMD somehow manage to counter it, they had the full-fat 104 die to fall back on. We saw this when the RX 5700 XT launched. To me, this makes sense from a business perspective. Companies are not charities, even when they want to be seen as such. It's about money and securing the future for more sales and more money. To me, this seems to be the logical conclusion, but of course I can be wrong :).

I think you're more likely right than wrong.
 
Sure, what I mean is that Nvidia (or AMD for that matter) won't create a GPU that is the fastest it can be unless it is needed to beat the competition. They will only make it as fast as they think is necessary to win in the bracket the GPU is slotted in. If there is no one to fight against, then they (the Nvidias, AMDs and Intels of the world) will only add what they think will be just enough to get consumers interested. We saw this with Turing and ray tracing, but it certainly backfired a bit, as the rasterization performance from the new generation of GPUs was lackluster. I'm fairly certain that Turing could have been better at launch; we could have gotten the Super cards as non-Super cards instead of the current non-Super cards :p. At least that is my personal view of it.

Turing, and its price/performance stagnation, is the outlier rather than the norm.

Pascal was pretty good, and the 1080 Ti, when compared to the previous gen, looks like Nvidia was swinging for the fences.

Price/performance stagnation, or even regression, could become the new normal, but one gen is kind of soon to make that call, with multiple generations of good progress before Turing.
 