NVIDIA 4000 Series

DLSS 3 will include enhancements to the existing DLSS, and these enhancements will work on 'all' RTX cards. It is the specific frame-insertion feature (which many are sceptical about anyway) that will be exclusive to the 4000 series.

The 4080 and 4090 include a much nicer encoder, which is now dual core and natively supports AV1, so it seems Nvidia recognised their huge file-size problem and addressed it with AV1 support. I don't think the media are talking about this anywhere; I only know it's updated from the Nvidia slides. They always just seem obsessed with rasterization performance.

The 4080 16GB version has a nice profit margin on the VRAM: an extra $300 for an extra 4GB of VRAM, although they're also giving it the 4090 cooler spec and a small number of extra cores.
nVidia's gaslighting us - the '3' part of DLSS 3.x *is* the frame insertion - upscaling and Reflex were/are already a part of DLSS 2.x - all they've committed to is continuing to revise the upscaler. Personally I don't believe that such a significant change to DLSS would have been engineered *without* backwards compatibility - nVidia's simply decided that doubling the frame rates of the 20xx and 30xx series isn't in their financial interest.

Yes - the AV1 encoder is nice to have - 40xx is a decent die-shrink and refresh of Ampere and seems to deliver the kind of performance you'd expect from the changes/improvements - it's never going to be 2-4x the speed of a 3090, though, except in extreme edge cases.

The 12GB is a straight rip-off - it's an xx70-series card that nVidia's selling at near double the MSRP it should be. The only card with any real value in the lineup is the 4090, unfortunately.
 
The reviewers are just stuck in the past... "we've got to look at raster". I think Nvidia should be partnering with professional quality-assurance providers like TUV or DNV and just start ghosting these YouTube content creators.
DLSS is how you're going to get 2-slot graphics cards back in business.
I have to ask: do you work for Nvidia or one of its associates?
 
The 12GB is a straight rip-off - it's an xx70-series card that nVidia's selling at near double the MSRP it should be. The only card with any real value in the lineup is the 4090, unfortunately.

xx70 is just a name. There's no defining feature of an xx70 other than being more expensive/faster than the xx60 and cheaper/slower than the xx80. I agree that having two quite distinct cards called a 4080 is needlessly confusing, but that doesn't make it an "xx70" or a "straight rip off". As for price/performance, it looks like both 4080s are much better value than the 30xx cards of similar price (based on what Nvidia's claiming), and they'll almost certainly offer better bang for buck than the 4090 in real-world usage.
 
I have to ask: do you work for Nvidia or one of its associates?
Raster is on a path of diminishing returns, unless you're okay with 5-slot graphics cards... it's pretty straightforward to see what DLSS brings in terms of efficiency.
Reviewers seem to be stuck in the past if they can't see an obvious technological revolution; they are going to eat those words five years down the line and will have lost all credibility.
I'm only looking at the 4090 when I make these statements, but I still believe this is the right time to consider DLSS a core offering instead of a shiny new add-on.
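
To put rough numbers on the efficiency point: DLSS renders internally at a reduced resolution and upscales, so pixel-shading work falls with roughly the square of the per-axis scale factor. A quick sketch in Python, using the commonly documented DLSS 2.x per-axis factors (actual frame-time savings depend on how shader-bound a given game is):

    # Approximate pixel-shading savings from DLSS internal render scaling.
    # Per-axis scale factors are the widely documented DLSS 2.x presets;
    # real savings vary because not all frame time is spent shading pixels.
    MODES = {
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 0.333,
    }

    for mode, scale in MODES.items():
        w, h = round(3840 * scale), round(2160 * scale)  # 4K output target
        saved = 1 - scale ** 2                           # fraction of pixels not shaded
        print(f"{mode:17} renders {w}x{h} internally, ~{saved:.0%} fewer pixels shaded")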
 
The 12GB is a straight rip-off - it's an xx70-series card that nVidia's selling at near double the MSRP it should be. The only card with any real value in the lineup is the 4090, unfortunately.

Correct. Look at the core naming convention: the 12GB was definitely intended to be a 4070.

I used to buy a new card every generation, an XX70 or XX80, but I'm not doing that this time. The prices are ridiculous. NVIDIA needs a good slap!
 
xx70 is just a name. There's no defining feature of an xx70 other than being more expensive/faster than the xx60 and cheaper/slower than the xx80. I agree that having two quite distinct cards called a 4080 is needlessly confusing, but that doesn't make it an "xx70" or a "straight rip off". As for price/performance, it looks like both 4080s are much better value than the 30xx cards of similar price (based on what Nvidia's claiming), and they'll almost certainly offer better bang for buck than the 4090 in real-world usage.

Well, no, there is the core. According to the core naming convention, the 12GB was supposed to be the 4070. Also, I don't agree they are much better value. As yet we don't know what the performance is; we only have NVIDIA's word on that, and we know how reliable they are(!). I would bet that for normal raster games the performance increase is just the usual 30%, and NVIDIA used to give that increase, generation on generation, for no additional cost.
 
From what I understand, based on Manuel's comment on Reddit:

DLSS 3 will include enhancements to the existing DLSS, and these enhancements will work on 'all' RTX cards. It is the specific frame-insertion feature (which many are sceptical about anyway) that will be exclusive to the 4000 series.

The 4080 and 4090 include a much nicer encoder, which is now dual core and natively supports AV1, so it seems Nvidia recognised their huge file-size problem and addressed it with AV1 support. I don't think the media are talking about this anywhere; I only know it's updated from the Nvidia slides. They always just seem obsessed with rasterization performance.

The 4080 16GB version has a nice profit margin on the VRAM: an extra $300 for an extra 4GB of VRAM, although they're also giving it the 4090 cooler spec and a small number of extra cores.
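
To put rough numbers on that margin (a sketch in Python; the per-GB module cost is purely an illustrative assumption, not a known BOM figure, and the 16GB's bigger cooler and extra cores aren't priced in):

    # Implied cost of the 4080 16GB upsell over the 12GB, using announced MSRPs.
    price_12gb, price_16gb = 899, 1199        # USD, Nvidia's announced pricing
    extra_gb = 16 - 12

    upcharge = price_16gb - price_12gb        # $300
    per_gb = upcharge / extra_gb              # $75 per additional GB

    assumed_module_cost = 15                  # $/GB of GDDR6X -- an assumption
    margin = upcharge - assumed_module_cost * extra_gb

    print(f"${upcharge} upcharge = ${per_gb:.0f}/GB; "
          f"~${margin} over assumed VRAM cost (before cooler/core differences)")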

Personally I consider the 12GB version too low on VRAM for longevity, but those who don't care about decent-quality textures, shadows etc. and only want high FPS in shooters will appreciate the weaker 4080 variant.

I had decided to skip the 4000 series, but the new encoder has piqued my interest, so I will wait to see how the release goes. If I do dip in, it will be another FE, the 16GB variant of course. If the 4070 has a 16GB version and the better encoder chip, I will consider that, as I am happy with the current level of rasterization performance, but I expect it will be limited to 12GB.

I see Nvidia have gone in with the 'treat Americans better' crowd on pricing.
It's not just 4GB of RAM; it's a cut-down chip.
 
Raster is on a path of diminishing returns, unless you're okay with 5-slot graphics cards... it's pretty straightforward to see what DLSS brings in terms of efficiency.
Reviewers seem to be stuck in the past if they can't see an obvious technological revolution; they are going to eat those words five years down the line and will have lost all credibility.
I'm only looking at the 4090 when I make these statements, but I still believe this is the right time to consider DLSS a core offering instead of a shiny new add-on.
You're answering a question I didn't ask. I asked if you work for Nvidia or one of its associates.
 
Well, no, there is the core. According to the core naming convention, the 12GB was supposed to be the 4070.

What core naming convention? If you're referring to the AD10x numbers, there's been no consistency with those between generations either.

Also, I don't agree they are much better value. As yet we don't know what the performance is; we only have NVIDIA's word on that, and we know how reliable they are(!). I would bet that for normal raster games the performance increase is just the usual 30%, and NVIDIA used to give that increase, generation on generation, for no additional cost.

According to Nvidia's graphs, the 4080 12GB provides almost as much performance as a 3090 Ti in current-gen games without fancy tricks and masses of RT (and romps ahead with them). Since the card comes in at a third or more less than the 3090 Ti, it presents much better value, and against 30x0 cards of similar price it represents a big increase in performance. I don't see there's much to argue about here.

Now if it turns out that Nvidia's chart is so much hot air then, sure, that picture will change.
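
For what it's worth, the arithmetic at launch MSRPs (a sketch in Python; street prices for the 3090 Ti had already fallen well below its launch MSRP by this point, which narrows the gap):

    # Price gap between the 4080 12GB and the 3090 Ti at their launch MSRPs.
    msrp_4080_12gb = 899      # USD, announced
    msrp_3090_ti = 1999       # USD, March 2022 launch price

    discount = 1 - msrp_4080_12gb / msrp_3090_ti
    print(f"4080 12GB is {discount:.0%} cheaper than the 3090 Ti's launch MSRP")
    # ~55% cheaper, so "a third or more less" holds at MSRP; if performance
    # is roughly matched (per Nvidia's chart), perf-per-dollar roughly doubles.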
 
Corsair just announced a $20 cable, with no adapters required, to make the new GPUs compatible with existing Corsair PSUs. It's a single 16-pin to 2x8-pin cable that plugs between the GPU and PSU and can support up to 600W.

I don't get that. PCIe 8-pin is meant to be 150W max per connector, so why only two per 16-pin, which means 300W through each of them? Even Asus with their £300 Thor PSUs went down the same route; initially they said it would support up to 600W, but now they are saying it's 450W max, as I presume they have discovered issues with using only two 8-pins to supply all that power.

Personally, I want one of Zotac's 4x8-pin adaptors.
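
The maths behind the scepticism, sketched in Python (the 150W figure is the PCIe spec limit for a device-side 8-pin; PSU vendors rate their own PSU-side sockets higher than that, which is what a 600W claim has to lean on):

    # Spec budget for two 8-pin connectors vs the cable's claimed rating.
    SPEC_8PIN_W = 150                 # PCIe spec limit per 8-pin connector
    connectors = 2
    claimed_w = 600                   # Corsair's rating for the 16-pin cable

    spec_budget = connectors * SPEC_8PIN_W          # 300W "by the book"
    per_connector = claimed_w / connectors          # 300W through each 8-pin

    print(f"Spec budget: {spec_budget}W; claimed: {claimed_w}W "
          f"({per_connector:.0f}W per 8-pin, double the spec figure)")
    # Whether that is safe depends on the PSU-side connector/terminal rating,
    # which is vendor-specific -- hence Asus walking the Thor back to 450W.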
 
Since the card comes in at a third or more less than the 3090 Ti, it presents much better value, and against 30x0 cards of similar price it represents a big increase in performance. I don't see there's much to argue about here.
I do. The 3090 Ti was never good value to begin with, so you're comparing something that is bad value with something that's not quite as bad and claiming that makes it good value.
 
If DLSS 3 draws one frame and then anticipates the next three frames, then you are going to be shooting at nothing. The problem will be even more noticeable playing against people with different connection speeds. It seems to me DLSS 3 will be OK for single-player games but not fast-paced FPS shooters.
Lucky for you, it doesn't do anything of the sort. It generates single frames in between two properly rendered frames, smoothing motion but not affecting any interaction. If you were shooting based on 50 frames a second before, you're still shooting based on 50 frames a second now; you just also get the visual effect of almost 100 frames per second.
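
A toy timing model of that distinction, in Python (a sketch assuming pure interpolation between two completed frames; DLSS 3's real pipeline also uses motion vectors and the optical-flow accelerator, and holding back a rendered frame does add a small amount of latency):

    # Toy model: interpolation doubles displayed frames, but input is only
    # sampled on the "real" rendered frames.
    rendered_fps = 50
    displayed_fps = rendered_fps * 2     # one generated frame per real one

    # Showing an interpolated frame between N and N+1 means frame N must be
    # held until N+1 finishes, costing very roughly half a rendered frame.
    hold_back_ms = (1000 / rendered_fps) / 2

    print(f"Displayed: ~{displayed_fps} fps; input still sampled at "
          f"{rendered_fps} Hz; ~{hold_back_ms:.0f} ms added presentation delay")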
 