NVIDIA 4000 Series

People will no doubt say "fake frames/performance" etc., but like I always say, I couldn't care less how it is done, as long as the end result works and looks good. We have to accept that going forward we will be paying more for the software aspects than just the hardware; AI/ML is going to be a big factor.

Games have always been about faking stuff for performance. Remember when Doom was faking 3D? What matters is whether it works well, and we won't know that until a decent number of people get their hands on this hardware, or we see it ourselves.

Doesn't DLSS stretch a supersampled image to the preferred resolution, so small text becomes blurred? Higher frame rates at the cost of image quality.

DLSS doesn't exactly stretch the image; it deduces missing detail from the features it can find. Assuming they haven't botched it completely (and every review of DLSS 2.0 I've seen suggests they haven't), it should produce nice sharp text when upscaled.
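
For anyone wondering what "deduces missing detail" can look like in practice, below is a loose toy sketch (in Python/NumPy; this is not Nvidia's actual algorithm, and all names and numbers are invented for illustration) of the core idea behind temporal upscalers like DLSS 2: if each low-resolution frame is rendered with a slightly different sub-pixel jitter, accumulating samples over a few frames can recover pixel-fine detail that a plain stretch of a single frame cannot.

Code:
import numpy as np

HI = 64                 # target (output) resolution
SCALE = 4               # upscale factor
LO = HI // SCALE        # low (render) resolution

# Ground truth with pixel-fine detail (think thin text strokes).
signal = (np.arange(HI) % 2).astype(float)

def render_lowres(jitter):
    """Point-sample the full-res signal at jittered low-res positions."""
    coords = (np.arange(LO) * SCALE + jitter) % HI
    return coords, signal[coords]

# Naive "stretch" of a single low-res frame (nearest-neighbour repeat).
_, frame0 = render_lowres(jitter=0)
stretched = np.repeat(frame0, SCALE)

# Temporal accumulation: a different jitter each frame covers every
# output pixel after SCALE frames.
accum = np.zeros(HI)
for j in range(SCALE):
    coords, frame = render_lowres(jitter=j)
    accum[coords] = frame

print("plain stretch error :", np.abs(stretched - signal).mean())  # 0.5
print("temporal accum error:", np.abs(accum - signal).mean())      # 0.0

The real thing adds motion vectors and a neural network to decide which history samples are still valid, but the reason small text can stay sharp is the same: the information genuinely exists across frames.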
 
How do you know the performance of a card that hasn't even been announced yet? A handful of leaked results don't exactly give a clear picture. I suspect it will be true that older games won't see much benefit from the lower-end 40x0 cards compared to more expensive 30x0 cards; Nvidia seem to have focused on delivering newer features (like raytracing support) rather than ramping up the power for more traditional techniques (man, how did shader-driven tech become "traditional"? I remember when shaders were brand new and super exciting). This seems very sensible, as very few are going to benefit from more raw power. I mean, if the stickied GPU hierarchy is to be believed, even the base-level 3050 is already powerful enough to run 1080p at 60Hz, which is going to be enough to satisfy most people on the monitors they're actually using (65% of people on the latest Steam Survey).

According to that same survey, less than 0.5% of people have a 3090, and the 3090 Ti is rare enough it gets lumped into "other". These are niche cards, offering ultimate power for big bucks. Now, if you have the money to spend on that: good on you, but complaining that an extremely niche, ultimate-performance product is expensive is weird. Just don't buy it.
Nvidia's own marketing slide only showed the £1000 4080 12GB just about matching the top previous-gen products, so it isn't hard to extrapolate that whatever comes in at £650 will only match a 3080, as that was already not far off the top.

The question now, though, is whether or not they will go with 8GB of VRAM on the 4070, or whatever comes in at £650, just to cement Ada's place as the worst GPU launch in history.
 
[i-dont-believe-you-whatever.gif]


You said the same last time with the 3090. You said you were going to stick with your 2080 Ti. Day by day it will eat at you not having the best :p
You know me too well mate lol.. are you getting one?
 
Profits may have done so, but the 1080 was $599 at launch to the 980's $549 (and AMD's profits followed a similar trajectory). Although, looking at it, I guess the way was paved by the 2080 Super, 2080 Ti, and Titan RTX. Back then, I think most people understood these as "halo" cards rather than anything ordinary gamers were looking at, and the 4090 released here is the same.
Comparing AMD profits on the surface is problematic, especially from that time period, as you have Ryzen thrown into the mix; you'd have to dig a little deeper to know how much of that profit is GPU vs CPU.

Like I said, it started (IMO) with the 10 series, more specifically later in the launch cycle (2017) when we got the Titan; that's when Nvidia realised people were willing to pay anything. After that, the $1000+ bracket that was previously exclusive to the Titan was joined by the 2080 Ti, and then they just kept adding more and more cards to that $1000+ bracket.
 
I thought the lower-tier cards / FE come with the 3 x 8-pin adapter, which will supply around 525W (450W via the cables, 75W via the PCIe slot), although I could be wrong; I swear I heard the FE comes with the 3 x 8-pin rather than the 4 x 8-pin :confused:

The AIB cards are coming with 4 x 8-pin PCIe, while the FE may only be 3 x 8-pin PCIe. My point was that even the base AIB cards seem to be 4 x 8-pin PCIe, so there should be plenty of power headroom.
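
For reference, the arithmetic behind those connector counts, using the PCIe spec limits of 150W per 8-pin connector and 75W from the x16 slot (a quick sketch; actual boards may be configured to draw less):

Code:
# PCIe spec limits: 150 W per 8-pin connector, 75 W from the x16 slot.
EIGHT_PIN_W = 150
SLOT_W = 75

for n_connectors in (3, 4):
    total = n_connectors * EIGHT_PIN_W + SLOT_W
    print(f"{n_connectors} x 8-pin + slot = {total} W")

# 3 x 8-pin + slot = 525 W   (the likely FE configuration)
# 4 x 8-pin + slot = 675 W   (AIB cards: plenty of headroom)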
 
Comparing AMD profits on the surface is problematic, especially from that time period, as you have Ryzen thrown into the mix; you'd have to dig a little deeper to know how much of that profit is GPU vs CPU.

Like I said, it started (IMO) with the 10 series, more specifically later in the launch cycle (2017) when we got the Titan; that's when Nvidia realised people were willing to pay anything. After that, the $1000+ bracket that was previously exclusive to the Titan was joined by the 2080 Ti, and then they just kept adding more and more cards to that $1000+ bracket.

Completely this. We had posts around that era saying the horse had bolted once people were buying the latter card mentioned. It hasn't changed; all we have seen is the mid tier now getting dragged up in cost as well.
 
Profits may have done so, but the 1080 was $599 at launch to the 980's $549 (and AMD's profits followed a similar trajectory). Although, looking at it, I guess the way was paved by the 2080 Super, 2080 Ti, and Titan RTX. Back then, I think most people understood these as "halo" cards rather than anything ordinary gamers were looking at, and the 4090 released here is the same.
Fair enough having a silly price for a "halo" card for those who want the best and don't care about the cost, but now we've got to the situation where a mid-range card is going for almost 1300 quid.
 
Fair enough having a silly price for a "halo" card for those who want the best and don't care about the cost, but now we've got to the situation where a mid-range card is going for almost 1300 quid.

I guess I would say a £1300 card is clearly not mid-range. For me, the question is what a card at the price I'm willing to pay will do.
 
I bought the 3080 as an upgrade from the 5700XT; VR in MFS is now 40fps instead of being under 30, but with the new update people are seeing higher frame rates, 90+ in NYC with DLSS.

So comparing DLSS 2 and 3, I think there is only a 20% increase in real-world performance between the two. But of course they don't show that ;).
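
To show how a figure like that could be arrived at, here's a hypothetical back-of-envelope calculation (the fps numbers are invented for illustration, not measurements): DLSS 3 frame generation inserts roughly one interpolated frame per rendered frame, so a headline fps figure needs to be halved before comparing it against a DLSS 2 result.

Code:
# Hypothetical numbers, for illustration only.
dlss3_displayed_fps = 90                       # headline figure, frame generation on
dlss3_rendered_fps = dlss3_displayed_fps / 2   # ~1 generated frame per rendered frame
dlss2_fps = 37.5                               # assumed DLSS 2 baseline

uplift = dlss3_rendered_fps / dlss2_fps - 1
print(f"real rendered-frame uplift: {uplift:.0%}")   # 20%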


Not sure what I'm doing wrong; with DLSS enabled the framerate is the same.
 