
NVIDIA 4000 Series

It's a bit shady having 12GB and 16GB versions of the 4080; on paper most people will just assume it's only the memory that's different, but it's a completely different card.

Used to see that all the time on gaming forums: people would say "I've got a 1060, why is performance so bad?" and it'd turn out they had the gimped 3GB version. They saw the 1060 name and bought it not knowing the difference; they thought it was just the memory, but it wasn't.
 
Using estimated compute performance: as the 4090 runs between 2.6GHz and 2.8GHz under load, together with its core count it will produce an estimated 100 teraflops of FP32 compute, compared to 12 teraflops for the Xbox Series X and 10 teraflops for the PlayStation 5.
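That estimate follows the standard peak-FP32 formula (shader cores × 2 FLOPs per clock via fused multiply-add × clock speed). A quick sketch, assuming NVIDIA's published 16,384 CUDA-core count for the 4090; the clocks are the load range quoted above, not official boost specs:

```python
def fp32_tflops(shader_cores: int, clock_hz: float) -> float:
    """Peak FP32 throughput: each core retires one FMA (2 FLOPs) per clock."""
    return shader_cores * 2 * clock_hz / 1e12

# RTX 4090: 16,384 CUDA cores, at the observed 2.6-2.8 GHz load clocks
for clock_ghz in (2.6, 2.8):
    print(f"{clock_ghz} GHz -> {fp32_tflops(16384, clock_ghz * 1e9):.1f} TFLOPs")
```

Note this is a theoretical peak, not sustained game throughput, which is part of why cross-architecture teraflop comparisons mislead.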
You know you can't compare teraflops across architectures.
 
[attached image]


Interesting, what's going on here? Have they only cranked up the TFLOP count this time around again? We know from previous generations that 2x floating-point performance doesn't result in 2x overall performance / framerate. There are other important factors too, like pixel rate, texture rate and memory bandwidth.
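Those other throughput numbers come from the same kind of simple formula (units × clock). A sketch for illustration only; the ROP/TMU counts and clock below are placeholder figures, not confirmed Ada specs:

```python
def pixel_rate_gpix(rops: int, clock_ghz: float) -> float:
    """Peak pixel fill rate: one pixel per ROP per clock (GPixel/s)."""
    return rops * clock_ghz

def texture_rate_gtex(tmus: int, clock_ghz: float) -> float:
    """Peak texture fill rate: one texel per TMU per clock (GTexel/s)."""
    return tmus * clock_ghz

# Placeholder counts (176 ROPs, 512 TMUs) at an assumed 2.5 GHz boost clock
print(pixel_rate_gpix(176, 2.5), "GPixel/s")
print(texture_rate_gtex(512, 2.5), "GTexel/s")
```

Point being: a card can double its TFLOPs while these rates and memory bandwidth grow far less, which caps the real framerate gain.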

They appear to be gambling that people will equate ray tracing in games with 'next gen', but the problem is it doesn't really do much to improve graphical detail, and it's a purely optional feature in at least 90% of games.

Nvidia obviously thinks that they aren't going to get smashed by RDNA3, due to their advantage with Ampere...

I was right that the launch would focus on the RTX 4080 and 4090, and move prices up a product tier... RTX 4080 naming scheme = higher prices.

The RTX 4070 becoming a more mid-range tier is not a surprise (even though prices on this card at MSRP were considered very good); as more and more Ampere cards came out, this is what we saw with Ampere too.

I can see why people are disappointed though: the RTX 4080 12GB is performing a bit worse than, or about the same as, the best Ampere card. This generation could be titled 'Ampere gen 2' in my opinion. Remember, on the official roadmap it was called 'Ampere Next', but people forgot that and listened to silly rumours instead.
 
[attached image]

The above is suggested to be the case. However, wait for reviewers like Steve at GN: he has already stated he will be measuring transient power spikes for the 4000 series, so I will pay close attention to his review when he posts it.
Regardless, transient power spikes will be a real thing when next-gen GPUs are released, and I hope you all understand that you need a PSU able to handle spikes higher than the rated power consumption of the GPU.
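A rough way to size a PSU around those spikes; the 2x transient multiplier here is an assumption modelled on reviewer measurements of recent high-end cards, not a published spec, so treat the numbers as a sketch:

```python
def min_psu_watts(gpu_tdp_w: float, rest_of_system_w: float,
                  transient_multiplier: float = 2.0) -> float:
    """Size the PSU to cover the GPU's worst-case transient spike (modelled
    as a multiple of rated TDP) plus the rest of the system's draw."""
    return gpu_tdp_w * transient_multiplier + rest_of_system_w

# Example: 450 W GPU plus ~150 W for CPU, board and drives
print(min_psu_watts(450, 150))       # 1050.0
print(min_psu_watts(450, 150, 1.5))  # 825.0 with a milder spike assumption
```

In practice a quality PSU rides out millisecond-scale excursions above its rating, so this is conservative; the point is simply not to size the PSU at the GPU's TDP plus nothing.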
 
Reminder of Nvidia's roadmap:

[attached image: Nvidia roadmap]


I'm always a bit staggered by how little attention the tech community pays to the official roadmaps / hints from technology companies themselves... It says '3 chips... One Architecture'.
 
I'm out, almost a grand for a 4070.
It is worse than that. The 4080 12GB has a 192-bit memory bus, which has usually been reserved for the xx60 card over the last few generations.

The 4080 16GB has a 256-bit memory bus, which is usually reserved for the xx70 cards.

And so the real xx80 card is the 4090 that we got here in all its glory.
 
Hm. I think I will sit this one out and stick with my 3070 for a while.

Bit ***** what they have done with the 4080 12/16GB models. It's also very confusing to do things like that, because the name only implies a difference in memory.

It'll be interesting to see how many they sell at these prices. Beyond the enthusiast market, I think they will struggle, especially in this economic situation.

Judging by the slides, any cards that might be more reasonably priced (4060/4060ti), likely won't be that much of a jump from a 3070 anyway.
 
It looks that way if you ignore ray tracing and DLSS performance. Surprised it hasn't been posted here yet, but Nvidia released some game benchmarks showing the 4080 losing to RTX 3000 GPUs lmao

Here are the benchmarks; look at the left side to see the new "4080" losing to the 3090 Ti.

Not surprised. With a 192-bit memory bus, the memory bandwidth on the 4080 12GB will in all likelihood be lower than on any 3080 to 3090 Ti GPU, and that is most likely what's causing it to lose in those games.
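The bandwidth gap is easy to check with the standard formula (bus width in bytes × per-pin data rate). A sketch using the published memory specs for these cards (21 Gbps GDDR6X on the 4080 12GB and 3090 Ti, 19 Gbps on the 3080 10GB), which are worth double-checking against reviews:

```python
def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    """Memory bandwidth: bus width in bytes times effective per-pin rate."""
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 21.0))  # 4080 12GB -> 504.0 GB/s
print(bandwidth_gb_s(384, 21.0))  # 3090 Ti   -> 1008.0 GB/s
print(bandwidth_gb_s(320, 19.0))  # 3080 10GB -> 760.0 GB/s
```

So despite faster memory chips, the narrow bus leaves the 4080 12GB well behind even the plain 3080 on raw bandwidth (the larger Ada L2 cache offsets some of that, but not in every game).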
 
Us - can we just have more 4K /8K performance, at a proportional price increase?

Nvidia - No, because development costs and time, and anyway, RT is awesome. So very shiny...

Did we mention DLSS 3, upscale your woes away
 
The laughable thing is that they are nowhere near the 2-4x performance increase they claim, even with the very high-end, high-power RTX 4090.

It would just be untrue, if not for Warhammer Darktide's average framerate result.

But hey, 12GB on the RTX 4080!

Cue a two-year thread on the profound philosophical question of whether 12GB is enough ;)
 
I got laughed at here earlier when I said not many people game at 4K, but many here said 4K 120fps is the norm...
If you're contemplating spending this sort of money on a GPU, you'd be doing yourself dirty by not first sorting out your monitor: a full-array local dimming HDR1000 panel or an OLED gives you an incredible step up from edge-lit LED, and those only come in 1440p ultrawide or 4K.
 