NVIDIA 4000 Series

If anything, it wasn't the 4090 that was mis-named but the 3090, which should have been a Titan. As you say, they marketed it as a Titan replacement and even Jensen still referred to the 3080 as the top tier "gaming" card at launch. Who's to say there isn't a 4000 series Titan card coming with the "pro features" you mention?

Regarding the uplift, the 4090 is nearly twice the performance of the 3090, which seems a pretty decent uplift to me, especially for only $100 more than the 3090 was at launch?

(I'm ignoring the 3090Ti here, which was just a money-grab)

Ada has no NVLink capability on the chip; even their A-series (Quadro) cards don't have it this generation, which is why they still sell Ampere A-series cards for people who need NVLink and up to 96GB of VRAM with two 48GB Ampere A-series cards.

Ada was designed as a gaming chip.

Ada is not twice as fast as a 3090, not even close. Don't fall for the fake frame-generation rubbish. Ada is 55-60% faster at best, and that's in some games that have been Nvidia-sponsored.

You are falling for the large increase in raw numbers because they come from a higher starting point, the 3090/Ti; in percentage terms it is almost the same increase we got from the 2080 Ti to the 3090.

Here is a video that explains it and shows you the RT advancements too (get ready for the reality):

 
Last edited:
Ada has no NVLink capability on the chip; even their A-series (Quadro) cards don't have it this generation, which is why they still sell Ampere A-series cards for people who need NVLink and up to 96GB of VRAM with two Ampere A-series cards.

Ada was designed as a gaming chip.

Ada is not twice as fast as a 3090, not even close. Don't fall for the fake frame-generation rubbish. Ada is 55-60% faster at best, and that's in some games that have been Nvidia-sponsored.

You are falling for the large increase in raw numbers because they come from a higher starting point, the 3090/Ti; in percentage terms it is almost the same increase we got from the 2080 Ti to the 3090.

Here is a video that explains it and shows you the RT advancements too (get ready for the reality):



How many more times will that crap video be posted?
 
How many more times will that crap video be posted?

It's the facts, so why would it be crap, as you put it? The figures are right there and worked out correctly. Reality sucks...

Most would rather believe Nvidia and their fake claims of 2x, 4x and more... The uplift on the 4090 is the same as what we saw from the 2080 Ti to the 3090/Ti. Those are the facts.

If you watch the RT figures, Nvidia clearly lied about the improved RT on the 40 series too; there is no uplift there, as it's exactly the same as Ampere, because Ada is nothing more than Ampere shrunk onto a smaller node with more SMs (60% more on AD102, the 4090 chip), more cache and AV1 encoders/decoders.

You can even see it in Nvidia's whitepaper on the Ada architecture: compare it to Ampere's whitepaper and look at the design of the SMs; they are the same. Also, Ada has removed the silicon that does NVLink, so it's a downgrade there.
 
Last edited:
I don't watch Adored or MLID videos. Both are hardcore AMD-biased channels. MLID is tolerable, but Adored is the worst offender.

TBF, at least Adored somewhat backs up his "theories"; MLID is beyond bad, but as we all know, some on here love content with no substance or evidence to back up its claims :p :cry: :D

As it is, if you want the best RT advancement, we still only have one vendor to choose from, and that's Nvidia.

It'll be interesting to see RDNA 3's actual performance improvements in RT too (not just their PR slides). I have a feeling it won't be quite as good as what AMD have shown us, and it could even fall below Ampere in some titles...
 
Last edited:
Regarding the uplift, the 4090 is nearly twice the performance of the 3090, which seems a pretty decent uplift to me, especially for only $100 more than the 3090 was at launch?

It's not twice the performance at all. It's about 50-80%, and it also depends on what monitor you have. There have been plenty of people on the internet complaining that in some games, with some monitors, it's only one or two frames faster than a 3090. This misinformation, hype, or whatever you want to call it, is partly to blame for the hysterical "I'll pay anything for that card" attitude that we tend to see. People want to believe it's two or three times faster and they get all worked up about it. They really need to calm down and take a happy pill or two. A typical generational increase is about 30%. The 4090 is above average, but it is not revolutionary.

Also, people seem to forget that not so long ago the generational power increase came at ZERO additional cost. Each generation was about the same price (plus a small amount for inflation, of course). It was only shortages that broke that down and drove prices higher. The shortages are over now, so what's the big excuse? These days we are expected to pay for any increase in power, it seems: 30% more power, 30% more cost. Or, in the case of the 4080, 30% more power, 80% more cost.

I used to buy an "upgrade" every generation, but with this new pricing scheme I am not sure when I will ever buy one again. Maybe I will replace the entire PC every five years or so. I don't know, but one thing is for sure: the days of GPU upgrading are over for me. Rather than being a fun thing to get excited about, it's now dead, because I really can't justify a £1000+ upgrade every two years.
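Just to put rough numbers on that price-vs-power point, here's a quick back-of-the-envelope sketch in Python, using the round figures above (my own estimates, not benchmark data):

```python
# Rough value-for-money check using the round numbers from the post above
# (poster's estimates, not benchmark data).

def value_change(perf_gain: float, price_gain: float) -> float:
    """Change in performance-per-cost: 0.0 means value is flat, negative means worse value."""
    return (1 + perf_gain) / (1 + price_gain) - 1

# "30% more power, 30% more cost" -> value stays flat
print(f"Typical gen-on-gen: {value_change(0.30, 0.30):+.0%}")  # +0%

# "in the case of the 4080, 30% more power, 80% more cost" -> value goes backwards
print(f"4080 this gen:      {value_change(0.30, 0.80):+.0%}")  # roughly -28%
```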
 
Last edited:
It's not twice the performance at all. It's about 50-80%, and it also depends on what monitor you have. There have been plenty of people on the internet complaining that in some games, with some monitors, it's only one or two frames faster than a 3090. This misinformation, hype, or whatever you want to call it, is partly to blame for the hysterical "I'll pay anything for that card" attitude that we tend to see. People want to believe it's two or three times faster and they get all worked up about it. They really need to calm down and take a happy pill or two. A typical generational increase is about 30%. The 4090 is above average, but it is not revolutionary.

Also, people seem to forget that not so long ago the generational power increase came at ZERO additional cost. Each generation was about the same price (plus a small amount for inflation, of course). It was only shortages that broke that down and drove prices higher. The shortages are over now, so what's the big excuse? These days we are expected to pay for any increase in power, it seems: 30% more power, 30% more cost. Or, in the case of the 4080, 30% more power, 80% more cost.

I used to buy an "upgrade" every generation, but with this new pricing scheme I am not sure when I will ever buy one again. Maybe I will replace the entire PC every five years or so. I don't know, but one thing is for sure: the days of GPU upgrading are over for me. Rather than being a fun thing to get excited about, it's now dead, because I really can't justify a £1000+ upgrade every two years.
It can definitely be in the ballpark of 2x the performance of the regular 3090 (which is a fairer comparison than using the 3090 Ti, which is 10-15% faster than a regular 3090) depending on the situation, and you can see evidence of that in various YouTube comparison videos. It's likely a 4090 Ti or similar will be released at some point.
 
Last edited:
Despite all the outrage over the prices, the 4090 truly is a beast. I just finished a gaming session in Cyberpunk and, despite using 1.78x DLDSR and downscaling to 3440x1440 (a render resolution actually about 10% higher than 4K), I was still seeing 70+ FPS at all times with maxed-out RT including GI. Add the motion clarity, brighter color volume and the incredible dynamic range of HDR on top with an OLED, and it looks incredible.

It really is powerful in a way the 3090/3080 Ti were not when they came out.

You're just wasting performance. Using DSR and downscaling to that small a resolution is a waste of money.

HDR is available on the 30- and 20-series GPUs.
Motion clarity is just down to the FPS number. You can tweak settings to reach a decent motion resolution, and DLSS 3.0 motion interpolation won't help with that.

An OLED is a display type independent of the 4090.

At your resolution, it's a waste of money. You would have been better off getting a bigger, higher-pixel-count display rather than trying to kid yourself that DLDSR is doing anything magical. You'll have to pixel-peep to find the difference. DSR is never going to make a low-resolution display a higher-resolution one. It's not making your 1440p ultrawide monitor a 4K or 8K one.

A 4090 is best used by triple-4K gamers, 8K gamers, or high-refresh-rate 4K panel owners who NEED that FPS number as high as possible at max settings.

For the price of a 4090, shopping wisely, you can get very close to a triple 4K OLED panel setup.
 
Last edited:
It can definitely be in the ballpark of 2x the performance of the regular 3090 (which is a fairer comparison than using the 3090 Ti, which is 10-15% faster than a regular 3090) depending on the situation, and you can see evidence of that in various YouTube comparison videos. It's likely a 4090 Ti or similar will be released at some point.

To be fair, I was actually comparing with the original 3080 when I said it was nearly twice the speed; my bad. It does appear to be closer to 50% on average compared to the 3090, certainly at higher resolutions and with RT enabled, which is what you buy a card like this for.

Even if it is only around 50% faster on average than the 3090, my point was that this is still a fairly decent uplift, especially relative to the price uplift.
The UK MSRP of the 4090 FE is about 21% more than the 3090 FE's was (and much of that is down to the exchange rates now; in the US the gap was even smaller).
A 50% uplift for 21% more money seems good value, relatively speaking, and that's before you factor in that it was almost impossible to get a 3090 of any variety back then and they were regularly fetching around £2000.
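For what it's worth, here's the same kind of quick sketch for those numbers (the ~50% average uplift and ~21% UK FE MSRP increase quoted above; rough figures rather than benchmark data):

```python
# Performance-per-pound comparison for the 4090 vs the 3090,
# using the rough figures quoted above (not benchmark data).

perf_ratio = 1.50    # ~50% faster on average than the 3090
price_ratio = 1.21   # ~21% higher UK FE MSRP than the 3090 FE

value_ratio = perf_ratio / price_ratio
print(f"Performance per pound vs the 3090: {value_ratio:.2f}x")  # ~1.24x, i.e. ~24% better value
```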
 
Jensen would have you believe that - in reality (without DLSS 3.0) it's a pretty typical gen-on-gen advance (similar to 20->30 series).

Pic-02.png


Exactly.

The 4090 is a very exciting proposition for those struggling to hit 60fps with a 3090 Ti/3090, those users being triple-4K monitor and 8K gamers. Everyone else is just jumping from 70 to 100fps etc., when they probably could have done the same by scaling resolution down, using a different DLSS setting, playing with settings, etc.


The 4090 is a very, very powerful card, but so many people are jumping at it and not using it to its full potential, because they're using, essentially, a display which simply can't take full advantage of it due to its low pixel count, unless they do something silly like crank settings or use DSR, even though 99% of the time that's going to be imperceptible to the human eye.

DLSS 3.0 is exciting, but the motion interpolation is essentially broken for the moment, until we see a 3.1 revision to deal with the issues, which are very clear.
 
Last edited:
The 4090 is a very exciting proposition for those struggling to hit 60fps with a 3090 Ti/3090, those users being triple-4K monitor and 8K gamers.

Pretty much nail on head.

The only real justification for a 4090 is if you absolutely NEED the raw horsepower regardless of cost, for high resolutions with all the candy turned on.

If you don't NEED this then it becomes a question of value, at which point it's very hard to make an argument for a 4090 when used 3080s are everywhere for around £500.
 
The 4090 is a very exciting proposition for those struggling to hit 60fps with a 3090 Ti/3090.
A generational leap is still a leap :D

And Nvidia's marketing aside, a 60% uplift over a generation that was already incredibly powerful is pretty darn impressive. Only the lowest of Nvidia's 30-series cards were slower than the Xbox Series X or PS5; the fastest cards have more than double the raster performance and destroy the consoles in RT, and the 4090 raises that to around 350% of current-gen console performance.
 
A generational leap is still a leap :D

And Nvidia's marketing aside, a 60% uplift over a generation that was already incredibly powerful is pretty darn impressive. Only the lowest of Nvidia's 30-series cards were slower than the Xbox Series X or PS5; the fastest cards have more than double the raster performance and destroy the consoles in RT, and the 4090 raises that to around 350% of current-gen console performance.
I agree with you.
 
Pretty much nail on head.

The only real justification for a 4090 is if you absolutely NEED the raw horsepower regardless of cost, for high resolutions with all the candy turned on.

If you don't NEED this then it becomes a question of value, at which point it's very hard to make an argument for a 4090 when used 3080s are everywhere for around £500.

I actually need 90fps 1% lows (VR), and my 3080Ti can handle the job most of the time.

The problem for me is that getting that last little bit requires a LOT of horsepower. The 4090 should handle it. The 4080 isn't enough, but the XTX *may* get the job done.
 
Last edited:
Got my 4080 FE this morning. I haven't had time to do much testing, but it's noticeably more power efficient than my 3080: 60-80W less on the same settings in a quick Hunt Showdown test. Shame it's so overpriced, as it's a great product.
 
This is what I think Nvidia GPUs should cost right now if they want to maintain decent sales during a recession.
Recession is for the poor, not the ones who can afford a new PC, and the poor wouldn't be buying a 4080 or 4090 anyway.

And Nvidia's marketing aside, a 60% uplift over a generation that was already incredibly powerful is pretty darn impressive.

Yes, and what were the gains on CPUs in gaming from about 3-4 generations ago? Probably about 5fps :rolleyes:

But Nvidia almost doubling performance is terrible.
 
Last edited:
Any chance an admin can rename this thread to the “Nvidia 4000 series whinge fest”?

Yes the pricing sucks, but I think we established that many weeks ago! :p
Christmas is coming, and I am sure Jensen Scrooge is preparing a great deal for all Nvidia fans: "Resizable Bar Humbug." He will be visited by Linus Torvalds first on Christmas Eve.
 