NVIDIA 4000 Series

Soldato
Joined
21 Jul 2005
Posts
20,044
Location
Officially least sunny location -Ronskistats
The vast majority will buy Nvidia because AMD price-matches Nvidia's cards on raster but offers fewer features. I like the strides AMD has made with RDNA2, but for me to consider buying an AMD card at a similar price to an Nvidia equivalent, they either need better raster (+20%) or raster and feature-set parity; failing that, they need to be cheaper.

Why would anyone pay the same price for an inferior product?

What feature set are we talking about pre ray tracing then? I mean, look how well PhysX and G-Sync panned out... If you read what people on this forum post, you will see many struggle to run ray tracing in games, so they say it "doesn't have enough horsepower". Hardly a feature worth bragging about. I agree the prices of AMD should never be on par with Nvidia this gen; however, you are now seeing many AMD cards dropping below MSRP, whereas the Nvidia cards have seen not so much movement in comparison.
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
Just maybe the reason Nvidia GPUs still cost more and AMD cards cost less is that people are still buying Nvidia far more than AMD cards? And just maybe it's because people value Nvidia over AMD for certain reasons... That's how these things work: if retailers/e-tailers are having a hard time selling stock, they have no choice but to reduce the price, and if said items are selling extremely well, why reduce the price?

As for comparing Nvidia's halo card, it's not exactly new for Nvidia to charge a premium for those cards; this has always been the way.

As for this comment:

you will see many struggle to run ray tracing in games, so they say it "doesn't have enough horsepower". Hardly a feature worth bragging about

Bit of a difference - one is far more usable than the other brand, i.e. a recent RT game:

[RT benchmark chart]

In a lot of games, it is literally a case of reducing settings or turning the feature off entirely compared to the competition.....

The cheapest 6750 XT is £529.00; the 3070 Ti goes from £649.00 to £750...

How are these cards the same price as NV?

:D

A 3070 Ti is faster than a 6750 XT... The 3070 is a better match for the 6750 XT, and it costs what, £440?


But yes, let's stick to facts :)
 
Soldato
Joined
28 Oct 2011
Posts
8,405
That, my friend, is how you spot a man-maths excuse to buy your favourite brand! :p

:D

I know, and they constantly post on these forums, "Woe is me, if only AMD did this and that I'd buy their cards instead" - yeah, of course you would, mate... I don't say to myself, "yeah, I'd buy Nike if they did x and y", because I know it's nonsense; I'll buy Adidas like I do every time and have done since my balls dropped.

Then of course you have to wait 4 or 5 years for NV to catch up with VRAM. 3070ti? Still on 2017's VRAM...

But they're the same price!

Two attempts by NV to get up to 12GB for the 3080, still 4GB short of the minimum it should have had on release...

£300-400 more than the 6800Xt.

But they're the same price!

You've got to laugh.

:cry:
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,694
Location
Uk
LOL 6950 XT is about £500 cheaper than 3090ti...

:D
People shopping at that end of the market care little for prices though.
:D

Then of course you have to wait 4 or 5 years for NV to catch up with VRAM. 3070ti? Still on 2017's VRAM...

But they're the same price!

Two attempts by NV to get up to 12GB for the 3080, still 4GB short of the minimum it should have had on release...

£300-400 more than the 6800Xt.

But they're the same price!

You've got to laugh.

:cry:
If the 3080 had come with 16GB, it would have been a slower card, with bandwidth of just over 600 GB/s compared to the 760 GB/s it actually had, which would have meant it struggled to feed all those CUDA cores. This would have affected many more games than just the one or two AMD-sponsored titles in which the 10GB buffer struggles.

16GB on a 3070 Ti would have been overkill for a card of that performance level and added a lot of extra cost, while 12GB on a 192-bit bus would have made the card slower, unless they used a 384-bit bus, which would also have added to the costs.
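
For anyone wanting to sanity-check those numbers, the maths is just bus width times per-pin data rate (a rough sketch; the 256-bit bus for a hypothetical 16GB card is an assumption):

    # Peak memory bandwidth: bus width (bits) / 8 * data rate (Gbps per pin)
    def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gb_s(320, 19))  # actual 3080 10GB, 19 Gbps GDDR6X: 760.0 GB/s
    print(bandwidth_gb_s(256, 19))  # hypothetical 16GB card, 256-bit bus: 608.0 GB/s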
 
Soldato
Joined
28 Oct 2011
Posts
8,405
If the 3080 had come with 16GB, it would have been a slower card, with bandwidth of just over 600 GB/s compared to the 760 GB/s it actually had, which would have meant it struggled to feed all those CUDA cores. This would have affected many more games than just the one or two AMD-sponsored titles in which the 10GB buffer struggles.

-------------------------------------------------


6800XT says hello.

I honestly think Nvidians lie awake at night making this stuff up.

------------------------------------------------

People shopping at that end of the market care little for prices though.

----------------------------------------------


:cry:

Here we go again: now the facts don't fit the narrative, another attempt to move the goalposts. You never mentioned that people at the top end of the market care little for prices in your earlier post.

---------------------------------------------

The vast majority will buy Nvidia because AMD price-matches Nvidia's cards on raster but offers fewer features. I like the strides AMD has made with RDNA2

Why would anyone pay the same price for an inferior product?

-------------------------------------------


It's nothing to do with price, feature set, raster or anything else - it's to do with brand. You even let the cat out of the bag with this...

--------------------------------------------

but for me to consider buying an AMD card at a similar price to an Nvidia equivalent, they either need better raster (+20%) or raster and feature-set parity; failing that, they need to be cheaper.

-----------------------------------------


:cry:

Don't you read your own posts? So basically AMD have to be better than NV by 20% or cheaper for you to buy one?

This is why people say that Nvidians only want AMD to do better so they can buy NV cheaper! Because it's true!

Cheers mate! You've made my arguments for me perfectly.

Not about cost, raster, feature set or anything else - it's about BRAND! always has been!

;)
 
Soldato
Joined
15 Oct 2019
Posts
11,694
Location
Uk
[quoting the previous post in full]
AMD use the large cache on their cards to get around the bandwidth limitations, although they do still tend to struggle more at high resolutions despite often having more VRAM. Also remember that the 3080 was available for MSRP at times, even during the crazy period last year when other cards were selling for double.
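
To put rough numbers on the cache point (a toy model only - the hit rates and bandwidth figures below are illustrative assumptions, not AMD's numbers):

    # Toy model: a big on-die cache raises "effective" bandwidth, but the
    # hit rate falls as resolution rises, so the benefit shrinks at 4K.
    def effective_bandwidth(hit_rate, cache_bw, vram_bw):
        return hit_rate * cache_bw + (1 - hit_rate) * vram_bw

    VRAM_BW = 512    # GB/s, e.g. a 256-bit GDDR6 card (illustrative)
    CACHE_BW = 1900  # GB/s, on-die cache (illustrative)

    for res, hit in [("1080p", 0.70), ("1440p", 0.60), ("4K", 0.45)]:
        print(res, round(effective_bandwidth(hit, CACHE_BW, VRAM_BW)), "GB/s")
    # 1080p ~1484, 1440p ~1345, 4K ~1137 GB/s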

Also, you're complaining about the VRAM and prices of Nvidia cards, but you seem pretty happy with the 8GB 3060 Ti you bought.

Equally and overwhelmingly good experiences with both over the years. I had an Nvidia card fail almost straight away many years ago, and EVGA replaced it with a much better brand-new card. Never had any driver issues. I've had cards that ran far too hot, but we're talking the 2000s here, and I quickly sold both and replaced them - one AMD and one NV.

I've had AMD since about 2012 because they were simply better VFM up until this gen, then bought a 3060 Ti in December and I'm very happy with that.

I've found the easiest way to be happy with your card is to stop incessantly fiddling with it and just play games...

;)
 
Soldato
Joined
6 Feb 2019
Posts
17,594
Doesn't look good for RTX 4000

A report out today says everyone but Intel is calling TSMC asking to delay their next-gen wafer deliveries - Nvidia is the worst: Nvidia told TSMC it wants to cut its order, not just delay it.

The sudden sharp reduction in PC hardware sales and the mining crash have sent some shockwaves out, and suddenly everyone believes they have ordered too much supply from TSMC. Nvidia, as mentioned, is the worst, because they already have too much RTX 3000 stock and are trying to renege on pre-purchased 5nm wafers, and apparently TSMC has said "nnnnnnnnnnooooooo"

The good news is that with lower demand, RTX 4000 may be priced better; the bad news is that RTX 4000 is probably now delayed because Nvidia has too much supply of Ampere.

 
Soldato
Joined
6 Aug 2009
Posts
7,071
Doesn't look good for RTX 4000

A report out today says everyone but Intel is calling TSMC asking to delay their next-gen wafer deliveries - Nvidia is the worst: Nvidia told TSMC it wants to cut its order, not just delay it.

The sudden sharp reduction in PC hardware sales and the mining crash have sent some shockwaves out, and suddenly everyone believes they have ordered too much supply from TSMC. Nvidia, as mentioned, is the worst, because they already have too much RTX 3000 stock and are trying to renege on pre-purchased 5nm wafers, and apparently TSMC has said "nnnnnnnnnnooooooo"

The good news is that with lower demand, RTX 4000 may be priced better; the bad news is that RTX 4000 is probably now delayed because Nvidia has too much supply of Ampere.

By all accounts Nvidia also pay TSMC a lot of cash up front, whereas AMD don't. Despite the naysayers, I smell some bargains on the way.
 
Associate
Joined
21 Apr 2007
Posts
2,487
The good news is that with lower demand, RTX 4000 may be priced better; the bad news is that RTX 4000 is probably now delayed because Nvidia has too much supply of Ampere.
That's not what they did with Turing, unfortunately; they made you pay through the roof to get a meaningful performance upgrade. Different market conditions in the latter part of this year, of course. I think it will be interesting to see what performance a 4080 and 4070 net you, and whether they try to tie that back to a 3090 as a way to fake the value you'd be getting, as opposed to an MSRP 3080.

I think we'll see this play out as a modest upgrade for the 4080 vs the 3080, but with the 4090 making significant gains for a very high price, marketed as a gaming GPU like the 2080 Ti was. That would fit with a reduction in allocation and silicon supply.
 
Caporegime
Joined
4 Jun 2009
Posts
31,045
The G-Sync Ultimate module provides a feature that non-Ultimate monitors, including FreeSync ones, don't have - variable refresh overdrive.

Good find; the guy seems to know his stuff (although things will be a bit different now, given the post is two years old, but it highlights the main thing I was referring to with the variable overdrive).

The scaler is the part that makes this happen. The G-Sync module uses a custom scaler. All other monitors use off-the-shelf scalers. Early FreeSync monitors used scalers not originally intended for variable refresh rates. You were lucky to be able to use them with a decent range, let alone static overdrive. Throwing variable overdrive into the mix wasn't going to happen.

We're now seeing wider ranges with current-gen scalers, but still not to the same degree as Nvidia's scaler. I think we're 1-2 scaler generations away from seeing variable overdrive implemented in monitors without a G-Sync module.

Last I recall, there were 3 vendors making scalers. Up until recently (the last few years), there wasn't much demand for adaptive sync, since it was new tech (demand from monitor manufacturers to scaler providers, not consumer demand). Since Nvidia's scaler is proprietary, and those 3 vendors have the other monitor manufacturers captive, there's no reason for them to spend the extra R&D to get a scaler on par with Nvidia's.

Basically, every monitor with a G-Sync module uses Nvidia's scaler. That gives them scale. There are likely zero companies who, on their own, sell as many gaming monitors in one year as Nvidia sells G-Sync modules. A single gaming monitor company would need at least that kind of scale to justify the R&D needed to make a scaler on par with Nvidia's.

Finally, it's hard to convince gamers of the advantages of that scaler for the following reasons:

  • It would raise the price per unit to account for the R&D.
  • Good luck convincing a gamer that a 30-240Hz range is noticeably better than 48-240Hz (it's not, thanks to AMD's LFC and Nvidia's unbranded equivalent).
  • Good luck explaining variable overdrive to the average gamer (it's still a foreign concept to most).
  • And finally, good luck getting a reviewer to even test variable overdrive. Currently, I don't know of a single one who does, and the proper testing methodology to do so would range from time-consuming to grueling. And since it's difficult to test, it's even harder to explain to the average gamer (see the prior point).
Bottom line is that there's just not enough justification for the R&D that would need to go into it right now.
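
For anyone wondering what variable overdrive actually does, here's the gist in code form (a simplified sketch with invented tuning values; real scaler firmware obviously doesn't look like this):

    # With VRR the refresh rate changes every frame, and the ideal pixel
    # overdrive strength changes with it. A monitor with a fixed scaler
    # applies ONE overdrive setting; a scaler with variable overdrive
    # re-tunes it per refresh. The table values here are invented.
    OVERDRIVE_TABLE = [(48, 0.3), (90, 0.5), (144, 0.7), (240, 0.9)]

    def overdrive_for(refresh_hz):
        # Pick the strongest entry whose refresh floor we have reached.
        strength = OVERDRIVE_TABLE[0][1]
        for floor_hz, s in OVERDRIVE_TABLE:
            if refresh_hz >= floor_hz:
                strength = s
        return strength

    # A static-overdrive monitor effectively runs overdrive_for(144) at every
    # refresh rate, giving overshoot at low Hz and smearing at high Hz.
    for hz in (48, 75, 120, 165, 240):
        print(hz, "Hz ->", overdrive_for(hz))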
 
Last edited:
Associate
Joined
31 Dec 2010
Posts
2,440
Location
Sussex
Planned trickle of stock.

Now we know that AMD have been margin-obsessed lately and have overpriced their GPUs. Some of this was due to the shortage of 7nm capacity, since GPUs are the lower-margin product where AMD aren't bound by the sweetheart deal Microsoft and Sony got.

However, some of this - especially before RDNA2 - was down to choice. They'd rather sell less than reduce their margins, even - and this is the strange part - if that does nothing about getting a return on their fixed costs (R&D, masks, drivers etc.). The obsession with margins seems to ignore the fixed costs.

Anyway, my take is that AMD should buy up as many wafers as they can get, as they have a broad collection of chips, and in the GPU market - if they were willing to play it smart - they could easily increase their tiny 17-19% market share to 30% or more.

I guess if this is a game of predicting what the opposition will do (rather than being the result of collusion), then, as by far the player with the smaller market share, AMD's approach should be quite different from Nvidia's. However, that report looks almost like what we would see if there were a colluding cartel. It would be a delicious irony if Nvidia's attempt to keep prices up with an artificial shortage backfired and AMD took all those wafers and went for GPU market share.

Regarding AMD's prices this gen, I sort of agree that AMD has to be cheaper than Nvidia. Not so much because I care for RT or upscaling, but lots of people do, and more importantly, how viable is it for AMD to survive on under 20% of the market? The 3060 Ti and the 6700 XT perform about the same, but the MSRPs were like £369 vs £419. To me the extra 4GB is nice, but if AMD want market share their card should have been £350 or less. Similar with the 3070 vs the 6800 - although there the 16GB vs 8GB is more noticeable, so the prices should have been the same.

As for G-Sync modules, weren't there some top-of-the-range monitors which required a fan because of the module?

Seems to me that Nvidia overclocked these out of the gate, or perhaps more precisely they are using one chip for different specs, and to drive certain resolutions and refresh rates they have to overclock it. Not that I'm in the market for a £1000+ Ultimate monitor (unless maybe it was OLED), but a monitor with a fan is a big no-no for me.
 