
NVIDIA 4000 Series

Seems not this time, and pretty much all of the media is reporting the same information, so if you keep up to date it shouldn't be a surprise. As I said, it makes sound business sense from Nvidia's side to release the fastest and most expensive card first to maximise potential sales, and it's likely this will now be a thing going forward.

Wowsers.
 
Ahh, trying to deflect blame from Nvidia by way of whataboutism. Except it's irrelevant to Nvidia and the pricing of their new cards. It comes across as a desperate attempt to shield Nvidia from criticism.

Yep, the apologists on here are astounding. Here, take my money! After complaining for two years about card availability and pricing.

If you actually followed the discussion rather than spending your time moving the goalposts... But I'm not interested in this dance where I shoot the ball and you quickly move the goalposts.

Debunker lmao. Best of luck staying on topic. Do you specialise in transplants?
 
Yep, the apologists on here are astounding. Here, take my money! After complaining for two years about card availability and pricing.

Yup, a bit like the people who said they wanted a 3080 but couldn't get one, so what did they do... they rewarded nvidia and this disgraceful behaviour by buying an even more expensive gpu from them. That sure will send a message to nvidia :cry: ;) :D
 
Look at the very generous deals that Nvidia has given us on the EOL 3000 series and ask yourself which of the two you have listed is most likely going to happen.

Who knows, maybe having the cake they offered us thrown back in their face might make them reconsider.
Early days yet. Nvidia still rule the roost for now.

There's clearly a high cost associated with producing graphics cards these days, so we aren't - imho - going to see any stellar prices from Nvidia, AMD or Intel. A recession isn't going to change that either - it doesn't suddenly make cards any cheaper to produce.

However, what I do hope we see is more reasonable performance cards in the ~£300-£500 bracket, and far less of a concentration of these cards in the £700-£1k mark. On that front, I remain hopeful, even if it takes us until 2Q 2023 to get there. :)
 
Early days yet. Nvidia still rule the roost for now.

There's clearly a high cost associated with producing graphics cards these days, so we aren't - imho - going to see any stellar prices from Nvidia, AMD or Intel. A recession isn't going to change that either - it doesn't suddenly make cards any cheaper to produce.

However, what I do hope we see is more reasonable performance cards in the ~£300-£500 bracket, and far less of a concentration of these cards in the £700-£1k mark. On that front, I remain hopeful, even if it takes us until 2Q 2023 to get there. :)
A recession doesn't drop costs, but it does forcefully drop profit margins ;)

Well time will tell.
 
Early days yet. Nvidia still rule the roost for now.

There's clearly a high cost associated with producing graphics cards these days, so we aren't - imho - going to see any stellar prices from Nvidia, AMD or Intel. A recession isn't going to change that either - it doesn't suddenly make cards any cheaper to produce.

However, what I do hope we see is more reasonable performance cards in the ~£300-£500 bracket, and far less of a concentration of these cards in the £700-£1k mark. On that front, I remain hopeful, even if it takes us until 2Q 2023 to get there. :)

Well said.

It's pretty simple: if nvidia do try anything dodgy with pricing, you simply vote with your wallet and go rdna 3 (assuming they offer a similar package for cheaper/realistic prices), or you just skip a gen, or, if you badly need a new gpu, buy a second-hand current one for cheap (which also means nvidia don't get a cut). Not rocket science :)

I personally can't see nvidia charging that much more than what rdna 3 will be (as in a >£100-150 difference); the only way they could justify it is if their gpus are significantly better.
 
The market is different this time; PC gamers are now the dominant factor. Nvidia will have a good idea of how large the market is for the non-price-sensitive segment: they tested that earlier this year with the 3090 Ti and very quickly found the saturation point.

Anyone not on Ampere or RDNA2 right now is a price-sensitive consumer, and when you think about it the only segments really motivated to upgrade are 4K, VR and content creators wanting a better experience.

That top-end market is looking rather small if you ask me; maybe their plan is to sell very few but priced to the absolute limit. If Nvidia do that, though, they open the door for AMD to massively grow market share if they choose to... and will Nvidia respond with an AD102 4080 Ti?

I think we are entering an interesting period; the smart thing to do is at least wait for AMD to show their hand to see which way things are likely to go. Unless the 4090 comes in at a better price than the 3090, which seems unlikely; more likely is a slightly increased MSRP imho.

All of this assumes you are looking at the high-end tier; if you're not then none of this applies: you will either have a current-gen deal to consider (not sure if we have those yet tbh), pick up a used 3080/3090, or wait it out till next year.
 
I personally can't see nvidia charging that much more than what rdna 3 will be (as in a >£100-150 difference); the only way they could justify it is if their gpus are significantly better.

The 6900xt was $999 msrp and the 3090 was $1500 msrp, so that's a $500 difference, and the 6900xt was not far behind a 3090 in real terms. So expect the same gap again; there never was a £100-150 difference on release, and later we all know what happened.

Also, these cards are costing more to make now, with the new TSMC process costing a lot more than Samsung's, even with the shortage increases added to the Samsung node at the time. I'm expecting silly prices from them all this time round, even more than the 30 series and 6000 series.

If AMD has the faster card this time, then expect AMD to do what Nvidia does with pricing; they showed their true colours with cpu pricing for the 5000 series. Yes, you will say the new 7000 series are slightly cheaper (the msrp, no idea yet on street price), but they are only cheaper because the motherboards for them and DDR5 RAM are a lot more expensive. The reality is we also have to wait and see how much faster these cpus actually are in the real world, and what the real street price is. Look at the 5950X as an example: at the time I purchased one it was £820, way above their fake msrp back then.
 
lmao, so Nvidia is launching a "12gb vram is not enough" edition of the 4080 and also a real 4080
If it's true, it's more likely being released (in small numbers) so they can increase the prices at each tier... so instead of '650' for the 4080 it will be '900', that type of thing. Last I saw, the 4070 was 12GB...

I'm personally aiming around the 4080 range, but it must have 16GB+ of memory; not really needed for gaming, but it does make a difference with things like 3D rendering etc.
 
I personally can't see nvidia charging that much more than what rdna 3 will be (as in a >£100-150 difference); the only way they could justify it is if their gpus are significantly better.
Yeah, my concerns around RDNA3 grow by the day, as we seem to have little realistic info on what it can do. One year ago they were to be the "chiplet" cards of our dreams, but now that seems to be limited to a single - high-end, high-price - card with multiple GPU chiplets. Who knows if even that is true?

Hopefully AMD are sandbagging and have something truly amazing this time around - and across all price tiers.
 
I'm personally aiming around the 4080 range, but it must have 16GB+ of memory; not really needed for gaming, but it does make a difference with things like 3D rendering etc.
This was my thinking. Only issue is, I can still see that being £800+

Tbh, I'd be happy with 3080-level performance from, say, a 4060 Ti, if it was priced at £350-£400. The PS5 and Xbox Series X do OK with half that GPU power, so I'd happily accept something much cheaper rather than pay for expensive performance I'd never really see or use.
 
The market is different this time; PC gamers are now the dominant factor. Nvidia will have a good idea of how large the market is for the non-price-sensitive segment: they tested that earlier this year with the 3090 Ti and very quickly found the saturation point.

Anyone not on Ampere or RDNA2 right now is a price-sensitive consumer, and when you think about it the only segments really motivated to upgrade are 4K, VR and content creators wanting a better experience.

That top-end market is looking rather small if you ask me; maybe their plan is to sell very few but priced to the absolute limit. If Nvidia do that, though, they open the door for AMD to massively grow market share if they choose to... and will Nvidia respond with an AD102 4080 Ti?

I think we are entering an interesting period; the smart thing to do is at least wait for AMD to show their hand to see which way things are likely to go. Unless the 4090 comes in at a better price than the 3090, which seems unlikely; more likely is a slightly increased MSRP imho.

All of this assumes you are looking at the high-end tier; if you're not then none of this applies: you will either have a current-gen deal to consider (not sure if we have those yet tbh), pick up a used 3080/3090, or wait it out till next year.

Very true, that. Nvidia will have a hard time getting people to upgrade from ampere if they don't get the performance to price spot on. Just look at the recent steam survey; that is a lot of "mindshare" for ampere/nvidia, as people like to put it...

[Image: Steam hardware survey GPU share results]

Also, amd don't have to supply 80% of their hardware to consoles this round either, so stock shouldn't be an issue this time.

The 6900xt was $999 msrp and the 3090 was $1500 msrp, so that's a $500 difference, and the 6900xt was not far behind a 3090 in real terms. So expect the same gap again; there never was a £100-150 difference on release, and later we all know what happened.

Also, these cards are costing more to make now, with the new TSMC process costing a lot more than Samsung's, even with the shortage increases added to the Samsung node at the time. I'm expecting silly prices from them all this time round, even more than the 30 series and 6000 series.

If AMD has the faster card this time, then expect AMD to do what Nvidia does with pricing; they showed their true colours with cpu pricing for the 5000 series. Yes, you will say the new 7000 series are slightly cheaper (the msrp, no idea yet on street price), but they are only cheaper because the motherboards for them and DDR5 RAM are a lot more expensive. The reality is we also have to wait and see how much faster these cpus actually are in the real world, and what the real street price is. Look at the 5950X as an example: at the time I purchased one it was £820, way above their fake msrp back then.

Flagship pricing is an exception as mentioned:

amd and especially nvidia always charge far more for these cards.

More so for nvidia, as they know they can market the cards to people who use them for professional workloads; amd don't have that same perk. Everything else below it, nvidia priced very well imo, which most reviewers seemed to agree with, except for maybe the extreme low end with the 3050.

3080 - £650
6800xt - £600

3070 - £450
6800 - £530
6700xt - £420

Yeah, my concerns around RDNA3 grow by the day, as we seem to have little realistic info on what it can do. One year ago they were to be the "chiplet" cards of our dreams, but now that seems to be limited to a single - high-end, high-price - card with multiple GPU chiplets. Who knows if even that is true?

Hopefully AMD are sandbagging and have something truly amazing this time around - and across all price tiers.

Yup, it can go 2 ways at this point. Based on history, I'm expecting much the same as RDNA 2, i.e. great cards overall but lacking in 1-2 critical areas, ray tracing being the main thing.
 
Kope says the RTX 4090 has 75 billion transistors, compared to the 28 billion of the RTX 3090. It's not clear how many of the extra transistors are from memory, given that the 4090 has a vastly larger cache size than the 3090.

Transistor counts:

RTX 4090: 75B (167% more than the 3090). 96MB cache

RTX 3090: 28B (56% more than the 2080 Ti). 6MB cache

RTX 2080 Ti: 18B (50% more than the 1080 Ti). 5.5MB cache

GTX 1080 Ti: 12B (72% more than the 780 Ti). 2.5MB cache
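
For anyone wanting to sanity-check those percentage jumps, here's a rough back-of-the-envelope calculation using the counts quoted above (leaked/rumoured figures, so treat it as a sketch; the 780 Ti count isn't listed above and is my own assumption):

```python
# Rough check of the generation-on-generation transistor increases,
# using the (rumoured) counts quoted above, in billions.
counts = {
    "GTX 780 Ti": 7.1,   # assumed baseline, not listed above
    "GTX 1080 Ti": 12,
    "RTX 2080 Ti": 18,
    "RTX 3090": 28,
    "RTX 4090": 75,      # leaked figure, not confirmed
}

cards = list(counts)
for prev, curr in zip(cards, cards[1:]):
    increase = (counts[curr] / counts[prev] - 1) * 100
    print(f"{curr}: {counts[curr]}B, ~{increase:.0f}% more than the {prev}")
```

That works out at roughly 69% / 50% / 56% / 168%, which matches the quoted figures give or take rounding and the assumed 780 Ti count, so the percentages are at least internally consistent; whether the 75 billion figure itself is real is another matter.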

 
Can't wait to sell my 3090 for £400.
Meh... I'm going to sit back with my 3090, have a few pints, and wait for it to blow over. Just because a new, more performant card is released, it doesn't magically make your old flagship stop working, and there's really no game out yet where the 3090 doesn't shine.

I'll probably pick up a 4 or 7 series later next year once games that start pushing the envelope are released (although there's really nothing in the pipeline yet that I see doing that).
 
Just make sure you get the 16GB version mate, because I can tell you now, there will be at least two lads here who will take pleasure in saying…

*drumroll*


12GB is not enough!
Haha, definitely mate! As long as it performs I still think 12GB is fine... what are we thinking the 4080 will be? At least 75% faster than the 3090, if the 4090 is double the 3090?
 
Kope says the RTX 4090 has 75 billion transistors, compared to the 28 billion of the RTX 3090. It's not clear how many of the extra transistors are from memory, given that the 4090 has a vastly larger cache size than the 3090.

Transistor counts:

RTX 4090: 75B (167% more than the 3090). 96MB cache

RTX 3090: 28B (56% more than the 2080 Ti). 6MB cache

RTX 2080 Ti: 18B (50% more than the 1080 Ti). 5.5MB cache

GTX 1080 Ti: 12B (72% more than the 780 Ti). 2.5MB cache


I don't believe that's true, unless they have used 20+ billion transistors on the cache or some other tech inside the gpu. He's again pulling numbers out of his rear end.

The top-end nvidia part currently is Hopper H100, which has 80 billion transistors on a huge die at 4nm; that was around 48% more transistors than the A100 on 7nm, on a just-as-huge die, and the AD102 die is nowhere near that size.

You just have to look at the Ampere A100 before it, which was on 7nm and had 54 billion transistors.

Proving this guy really is a troll. Either that, or Nvidia is doing some strange counting of transistors, or 20+ billion transistors are for the cache they've added, which makes no sense.
 