AMD Navi 23 ‘NVIDIA Killer’ GPU Rumored to Support Hardware Ray Tracing, Coming Next Year

Soldato
Joined
19 Nov 2015
Posts
4,867
Location
Glasgow Area
I'm in the business of selling very expensive things which technically have "no competition". However... that's not how the world works.

The whole "Nvidia don't have any competition at the high end" and "Nvidia can charge what they want" line is a lot of tosh. Of course they have competition. They have to compete for your money. They have competition from drones, TVs, motorbikes, metal detectors, consoles, football season tickets... etc., etc.

You name it! They have to offer value for your money. Gaming graphics cards are a completely unnecessary luxury purchase and as such have to compete as a value-for-money proposition. Case in point: I had money saved for a new GPU. I didn't like the price of the 2080 Ti, so I bought a Switch and a drone instead. That's literally a lost sale for Nvidia, same as if I had bought an AMD card.

Nvidia are writing their own downfall with this Intel-esque attitude.
 
Associate
Joined
3 May 2007
Posts
1,878
^ Yep, this is exactly it. I would like a GPU upgrade; 4K and VR need everything I can chuck at them.
But no way in hell am I paying £1k for a GPU. Another lost sale.
 
Soldato
Joined
20 Apr 2004
Posts
4,365
Location
Oxford
I'm in the business of selling very expensive things which technically have "no competition". However... that's not how the world works.

The whole "Nvidia don't have any competition at the high end" and "Nvidia can charge what they want" line is a lot of tosh. Of course they have competition. They have to compete for your money. They have competition from drones, TVs, motorbikes, metal detectors, consoles, football season tickets... etc., etc.

You name it! They have to offer value for your money. Gaming graphics cards are a completely unnecessary luxury purchase and as such have to compete as a value-for-money proposition. Case in point: I had money saved for a new GPU. I didn't like the price of the 2080 Ti, so I bought a Switch and a drone instead. That's literally a lost sale for Nvidia, same as if I had bought an AMD card.

Nvidia are writing their own downfall with this Intel-esque attitude.

lol
 
Associate
Joined
14 Jan 2014
Posts
220
It's in the quote you replied to ;)

The increased performance in ray tracing is worth the extra £70 for me. IF the next gen of cards from AMD rival NV for performance AND ray tracing, then my next card will quite possibly be AMD... but it will definitely have ray tracing. Whilst not many games support it, those which do look better than without, IMO.

The question, however, is whether or not NV will drop prices even more.

Ah right, makes sense now; I didn't really get that from the original post. It read more like you were having a debate in your head over bang for buck without really reaching a conclusion.

I'd probably agree with the sentiment anyway. I still see Nvidia as a marginally superior product if you compare like for like, tier for tier, between Nvidia and AMD. I don't know if that matches the public's view of the industry, but I suppose it is reflected in AMD's pricing, undercutting Nvidia in each tier's price range.

I have always been Nvidia, although I'm currently on my first AMD card (a 5700) and pleased to say it's fine. I've had no real issues other than some teething problems jumping into games heavy on Nvidia-specific graphics settings (e.g. HairWorks in The Witcher, reflections in GTA).

I can only put my slightly skewed favouritism of Nvidia down to 1) the fact that when I was first in the market for a GPU, the general consensus on these forums and others was that AMD's drivers were awful; and 2) having used Nvidia for years with absolutely zero problems, I can't fault their product.

That said, I will not, ever, take the bait on price-gouged products. £650 was the price point for Ti models for years, and £500 before that. The 2080 Ti's jump to £1100 is just a joke. If Nvidia revert to their previous pricing structure then I'm in 100% for the 30xx series. Remaining at this ludicrously inflated structure is BS.
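For what it's worth, those quoted price points work out to the following generation-on-generation jumps (a quick sketch using only the figures above):

[CODE]
# Generation-on-generation jumps in the Ti price points quoted above (GBP).
ti_prices = [500, 650, 1100]  # successive Ti-tier price points from the post

for old, new in zip(ti_prices, ti_prices[1:]):
    print(f"£{old} -> £{new}: +{100 * (new / old - 1):.0f}%")
# £500 -> £650:  +30%
# £650 -> £1100: +69%
[/CODE]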
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
I generally agree with your argument here, except that Nvidia appear to be pushing hard for better tech and performance, whereas Intel appeared to stagnate.

Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.
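To put rough numbers on that, here is a minimal perf-per-pound sketch using the figures above; the ~£1,100 2080 Ti price is the one quoted elsewhere in this thread, so treat all of it as back-of-envelope:

[CODE]
# Back-of-envelope performance per pound from the numbers in this thread.
# ASSUMPTIONS: 5700 XT = 1.00x relative perf at £299; 2080 Ti = 1.35x
# (the ~35% figure above) at the £1,100 quoted elsewhere in the thread.
cards = {
    "RX 5700 XT (reference)": (299, 1.00),
    "RTX 2080 Ti":            (1100, 1.35),
}

for name, (price_gbp, rel_perf) in cards.items():
    print(f"{name}: {rel_perf / price_gbp * 1000:.2f} perf per £1000")
# ~3.34 vs ~1.23, i.e. the 5700 XT delivers roughly 2.7x the
# performance per pound on these assumptions.
[/CODE]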

 
Caporegime
Joined
17 Mar 2012
Posts
47,668
Location
ARC-L1, Stanton System
Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.


As I said before, AMD's RDNA shaders are very good. The 5700 XT is an upper-mid-range GPU; it's really an RX 580-class part: 251mm² on 7nm. The 2080 Ti is an enthusiast GPU: 775mm² on 12nm, more than 3x the size on a 1.7x larger node.

Nvidia with Turing are not that far ahead, if ahead at all (I don't think they are). AMD could easily make a larger RDNA GPU that would beat the 2080 Ti, and it would still be smaller with the node difference taken into account.
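A rough way to sanity-check that size comparison, assuming ideal area scaling with the square of the quoted 1.7x linear node ratio (real density gains vary a lot by block type, so this is only a sketch):

[CODE]
# Normalising the two die sizes quoted above to the same node.
# ASSUMPTION: ideal scaling, i.e. area shrinks with the square of the
# linear feature-size ratio; real shrinks are smaller and block-dependent.
navi10_mm2 = 251          # RX 5700 XT die, 7nm
tu102_mm2 = 775           # RTX 2080 Ti die, 12nm

linear_ratio = 12 / 7                  # the "1.7x larger node" above
area_ratio = linear_ratio ** 2         # ~2.94 ideal shrink factor

tu102_at_7nm = tu102_mm2 / area_ratio  # ~264 mm^2
print(f"TU102 ideally shrunk to 7nm: ~{tu102_at_7nm:.0f} mm^2 "
      f"vs Navi 10 at {navi10_mm2} mm^2")
# With ideal scaling the 2080 Ti die lands at roughly Navi 10's area,
# which is the post's point about Turing not being far ahead per mm².
[/CODE]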
 
Soldato
Joined
22 Nov 2009
Posts
13,252
Location
Under the hot sun.
As I said before, AMD's RDNA shaders are very good. The 5700 XT is an upper-mid-range GPU; it's really an RX 580-class part: 251mm² on 7nm. The 2080 Ti is an enthusiast GPU: 775mm² on 12nm, more than 3x the size on a 1.7x larger node.

Nvidia with Turing are not that far ahead, if ahead at all (I don't think they are). AMD could easily make a larger RDNA GPU that would beat the 2080 Ti, and it would still be smaller with the node difference taken into account.

Let's take this into consideration: RDNA is what Zen 1 was. If you look at the design, it feels as if you could cut it in half, connect the halves through something similar to Infinity Fabric, and it would still work. RDNA2 includes RT etc. on the actual shader units, not as something external "glued" on. And that makes sense, considering it takes 3 years to design and develop a new architecture, one that would need to operate with MCM in mind after this year at 5nm.

Turing cannot work like that, due to the design and the inclusion of 2 extra blocks. I will be surprised if Nvidia keeps the Tensor and ray tracing cores that way on 7nm. More likely they will use a design philosophy similar to RDNA2's, including this extra functionality in the CUDA cores/units, so it would be easier to split across multiple chiplets and connect them together. After all, Nvidia has said numerous times that MCM is going to be the future of GPUs.
And that makes sense, given that no 700mm² behemoths can be made at 7nm, let alone 5nm.

If Nvidia keeps the 3 different parts as they are today, each with its own individuality, it will end up like Intel: making expensive monolithic chips and losing the race.
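To illustrate the monolithic-vs-chiplet economics, here is a toy Poisson yield model; the defect density is a made-up illustrative number, not foundry data:

[CODE]
import math

# Toy Poisson yield model for the monolithic-vs-chiplet argument above.
# ASSUMPTION: D0 and the model itself are illustrative only; real
# foundry defect densities are confidential and process-dependent.
def poisson_yield(area_mm2, d0_per_cm2=0.1):
    """Fraction of defect-free dies: Y = exp(-A * D0)."""
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

mono_mm2 = 700                    # one big Turing-style die
chiplet_mm2, n_chiplets = 175, 4  # same total area split four ways

# Silicon area you must fabricate per *good* part:
mono_cost = mono_mm2 / poisson_yield(mono_mm2)
chiplet_cost = n_chiplets * chiplet_mm2 / poisson_yield(chiplet_mm2)

print(f"per good monolithic part: ~{mono_cost:.0f} mm^2 of wafer")
print(f"per good 4-chiplet part:  ~{chiplet_cost:.0f} mm^2 of wafer")
# With these toy numbers the chiplet approach wastes ~40% less silicon,
# before counting the packaging/interconnect overhead it adds back.
[/CODE]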

And I feel sorry for those who bought the RTX 2080 Ti believing its RT and Tensor cores would keep working for ray tracing 'into the future'. Nvidia has proven time and again that it isn't afraid to kick people in the nuts, rendering its own tech irrelevant and obsolete.
 
Caporegime
Joined
17 Mar 2012
Posts
47,668
Location
ARC-L1, Stanton System
Let's take this into consideration: RDNA is what Zen 1 was. If you look at the design, it feels as if you could cut it in half, connect the halves through something similar to Infinity Fabric, and it would still work. RDNA2 includes RT etc. on the actual shader units, not as something external "glued" on. And that makes sense, considering it takes 3 years to design and develop a new architecture, one that would need to operate with MCM in mind after this year at 5nm.

Turing cannot work like that, due to the design and the inclusion of 2 extra blocks. I will be surprised if Nvidia keeps the Tensor and ray tracing cores that way on 7nm. More likely they will use a design philosophy similar to RDNA2's, including this extra functionality in the CUDA cores/units, so it would be easier to split across multiple chiplets and connect them together. After all, Nvidia has said numerous times that MCM is going to be the future of GPUs.
And that makes sense, given that no 700mm² behemoths can be made at 7nm, let alone 5nm.

If Nvidia keeps the 3 different parts as they are today, each with its own individuality, it will end up like Intel: making expensive monolithic chips and losing the race.

Read....

This is quite an old article now, March this year, but one that seems to have passed us by.

https://www.tomshardware.com/news/amd-3d-memory-stacking-dram,38838.html

One of the problems with 3D stacking is interconnect bandwidth when passing data outside of the source silicon.

For example, AMD's Zen and Zen+ CPU architecture is a half step towards separating dies: the chips are monolithic in that they contain the core and uncore in a single package, but they are split into two zones with a fabric link in the middle joining not just the two halves but also a loop of external monolithic dies: "Infinity Fabric".

[Diagram: two Zen/Zen+ die zones joined by an Infinity Fabric link]

Zen 2 is a full separation of the core and uncore.
The outer chiplets, which are 7nm, contain the cores and caches. The central chip, which is 14nm, contains the PCIe lanes, the memory controllers, the IO... and, critically, the interconnect controller IP.

[Diagram: Zen 2 package with 7nm core chiplets around a central 14nm IO die]

This is the missing technology for true 3D stacking. By 'true' I mean connecting stacked, separate components directly to where the data is needed, through the component, as if it were a 'vertical' piece of silicon, like HBM.
Not just stacking components and passing data around the outside, Intel.

So you can stack HBM on top of a GPU, which is stacked on top of core chiplets, which are stacked on top of an IO die.

Or HBM stacked on top of GPUs, and system RAM stacked on an IO die, tapping directly into the controllers.

This saves space, makes the chips cheaper to manufacture overall, and increases both power efficiency and performance.
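On the power-efficiency point, a rough sketch of the energy cost of moving data over different link types; the pJ/bit values are order-of-magnitude ballpark figures I am assuming for illustration, not vendor data:

[CODE]
# Rough energy cost of moving data at 1 TB/s over different links.
# ASSUMPTION: pJ/bit values are order-of-magnitude ballparks only.
PJ_PER_BIT = {
    "on-die wire":              0.1,
    "3D-stacked TSV":           0.5,
    "interposer (HBM-style)":   3.5,
    "off-package (GDDR-style)": 7.0,
}

bandwidth_tb_s = 1.0
bits_per_s = bandwidth_tb_s * 1e12 * 8

for link, pj in PJ_PER_BIT.items():
    watts = bits_per_s * pj * 1e-12
    print(f"{link:>26}: ~{watts:.0f} W at {bandwidth_tb_s:.0f} TB/s")
# The shorter and more 'vertical' the path, the less power the same
# bandwidth costs, which is the draw of true 3D stacking.
[/CODE]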

We are well on our way to 3D stacking, and something tells me the package in the new Xbox will be 3D stacked. We didn't see it, but it's right under our noses: AMD have been building the technology bit by bit and putting it in our hands. I think the new Xbox is going to be the testing ground for Ryzen, and more besides, in a GPU.

The IP patent.
http://www.freepatentsonline.com/20190196742.pdf

https://forums.overclockers.co.uk/threads/3d-stacked-gpus-comming-sooner-than-you-think.18874970/
 
Associate
Joined
16 Jan 2010
Posts
1,415
Location
Earth
Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.

I have to say that puts things in perspective. That's the sort of difference and improvement (5700 XT to 2080 Ti) that we used to enjoy when a new card came out at approximately the same price as the previous generation. Nvidia's completely unjustified price hike with the 2080 Ti is all too clear.
 
Associate
Joined
21 Apr 2007
Posts
2,487
Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.


I enjoyed that :)

My issue is that the price/perf brick wall is not where it should be, and the costs over and above that wall are HUGE and completely unreasonable.
 
Caporegime
Joined
18 Oct 2002
Posts
32,618
I'm in the business of selling very expensive things which technically have "no competition". However... that's not how the world works.

The whole "Nvidia don't have any competition at the high end" and "Nvidia can charge what they want" line is a lot of tosh. Of course they have competition. They have to compete for your money. They have competition from drones, TVs, motorbikes, metal detectors, consoles, football season tickets... etc., etc.

You name it! They have to offer value for your money. Gaming graphics cards are a completely unnecessary luxury purchase and as such have to compete as a value-for-money proposition. Case in point: I had money saved for a new GPU. I didn't like the price of the 2080 Ti, so I bought a Switch and a drone instead. That's literally a lost sale for Nvidia, same as if I had bought an AMD card.

Nvidia are writing their own downfall with this Intel-esque attitude.


Except Nvidia are enjoying record-breaking sales, profits and massive market share. Just because Nvidia lost the competition for your sale doesn't mean that is the case for the market in general. If the market did not support Nvidia's price points, Nvidia would reduce prices.

Nvidia is nothing like Intel, so the comparison is moot.
 
Soldato
Joined
22 Nov 2018
Posts
2,715
The whole "Nvidia don't have any competition at the high end" and "Nvidia can charge what they want" line is a lot of tosh.

Yeah, but Nvidia definitely do charge what they want. They have increased their profit margin to a record high. The increase in price parallels the increase in market share over the years.

When Nvidia's market share was smaller, their profit margins were smaller too. You can check the stats online.
They have to offer value for your money.

You'd think so, but they don't, unfortunately.
 
Soldato
Joined
1 May 2013
Posts
9,713
Location
M28
Since CES was a no-show for this mythical 'killer' card :D

Which trade shows are coming up, and at which ones has AMD historically announced new products?

I am itching for a full-featured HDMI 2.1 card, and the first decent one to come out from either camp I will snaffle up :)
 
Caporegime
OP
Joined
24 Sep 2008
Posts
38,322
Location
Essex innit!
Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.

Whilst I don't argue with that, there are other things at play that need to be considered. Nvidia have at least striven to bring the industry forward with ray tracing, and anyone who is clued up should know that this is the way forward. I begrudgingly paid for my 2080 Ti but have no regrets, because I really like the way RT works in games like Control and Metro. BF5 even looks great, even though you don't really get to appreciate it.
 
Soldato
Joined
17 Jul 2007
Posts
24,529
Location
Solihull-Florida
Define "performance"? RTX2080Ti is on average 35% faster at 4K than the 5700XT reference with 19.11 drivers, while costs over 3 times as much. (consider reference was £299 the last 2+ months at OCUK). And with 19.12.2/3 drivers the gap is even smaller.

So AMD actually doesn't have that big road to cover as you might expect. And before you reply, enjoy this.



So the Ti beat the AMD card hands down.

Just what are you trying to prove?

And where is this NV-beating card?
 