
NVIDIA 4000 Series

Are the 5000 cards still looking to be released at the tail end of this year/start of next year?

Sold my PC several months ago and regretting it now haha. Had some fun on Macs, now it's time to see about jumping back to PC. So tempted to start buying bits and just chuck in a 4070 Ti Super.
 
Have a 4080 Super FE now, and the encode/decode chip definitely had a buff: previously I couldn't play 1440p/60 or 2160p/60 on YouTube without stutters using hardware decoding, now I can.
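For anyone wanting to sanity-check that it really is the hardware decoder (NVDEC) doing the work rather than software decode, a rough sketch along these lines should do. It assumes ffmpeg is built with CUDA/NVDEC support and "clip.webm" is just a stand-in name for a locally saved VP9/AV1 sample:

# Minimal sketch: compare NVDEC hardware decode against pure software decode
# of the same clip. Assumes ffmpeg with CUDA/NVDEC support is on PATH and
# that "clip.webm" is a hypothetical local VP9/AV1 sample.
import subprocess, time

def time_decode(extra_args):
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-v", "error", *extra_args, "-f", "null", "-"], check=True)
    return time.perf_counter() - start

hw = time_decode(["-hwaccel", "cuda", "-i", "clip.webm"])  # NVDEC path
sw = time_decode(["-i", "clip.webm"])                      # CPU/software path
print(f"hardware decode: {hw:.1f}s, software decode: {sw:.1f}s")

If the hardware run is much faster and the CPU stays quiet, the decoder is doing its job.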

Also tested the FF7 remake and wow, what an improvement with the extra VRAM. Although I can't help but feel in a few years 16GB won't be enough.
 
AMD's discrete GPU market share according to JPR is going nowhere

Relative to Nvidia, AMD's desktop GPU sales seem to be at near record lows

I feel AMD need to fill the lower-end void that Nvidia vacated. I know Intel are targeting that area, but AMD have the advantage of having been in the market for much longer, so I think they would outsell Arc at similar price points.
 
AMD's discrete GPU market share according to JPR is going nowhere

Relative to Nvidia, AMD's desktop GPU sales seem to be at near record lows

It all went downhill when AMD tried to position themselves as the premium brand charging premium prices; the problem is there was already a premium player in the market that makes better products with stronger features.
 
Have a 4080 Super FE now, and the encode/decode chip definitely had a buff: previously I couldn't play 1440p/60 or 2160p/60 on YouTube without stutters using hardware decoding, now I can.

Also tested the FF7 remake and wow, what an improvement with the extra VRAM. Although I can't help but feel in a few years 16GB won't be enough.
Normally 16GB should be plenty, unless we get some really messy games. The card will run out of grunt before it runs out of VRAM.
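If anyone wants to see how close a game actually gets to that 16GB, a quick hedged sketch is to poll nvidia-smi in the background while playing (assumes an Nvidia driver install with nvidia-smi on PATH and a single GPU):

# Rough sketch: poll VRAM usage once a second via nvidia-smi and report the peak.
import subprocess, time

peak, total = 0, 0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # first line = first GPU; values are in MiB because of "nounits"
        used, total = (int(v) for v in out.splitlines()[0].split(","))
        peak = max(peak, used)
        print(f"VRAM: {used}/{total} MiB (peak {peak} MiB)", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\nPeak VRAM usage: {peak} of {total} MiB")

Bear in mind this reports allocation, not strictly what the game needs, so treat the peak as an upper bound.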
 
How often does the 4080 Super FE come into stock?
I missed a restock as the hotstock app wasn't configured right, then I hit another about three weeks later.

The restock I noticed eight hours late was already sold out by then; the second one I caught stayed in stock for nearly a week.

However, Nvidia just announced there will be a shortage due to faulty Micron chips.
 
Also tested the FF7 remake and wow, what an improvement with the extra VRAM. Although I can't help but feel in a few years 16GB won't be enough.
Now you know the benefit more VRAM gives: everything is smoother.

IMO 16GB won't be enough. Going by what's happening with my previous GPUs, DLSS is clearly now a requirement for my old 3070 and 3080; SOW is claiming the 3080 is only capable of 60fps at High settings with DLSS Quality at 1440p, and plenty more info is out there at this point.

8/10GB was never enough then, 12GB isn't now, and 16GB isn't a done deal IMO. That's why I went for the 7900 XTX's 24GB, and I'll probably hold on to this one. 16GB going forwards for 22% more money? **** that.

Edit: it's not a slight on anyone's purchase, everybody should and will enjoy whatever they have. :thumbsup
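For what it's worth, the "22% more" figure is just the usual premium calculation; with placeholder prices (purely illustrative, not real listings) it works out like this:

# Illustrative only: hypothetical prices, not actual retail figures.
price_16gb_card = 1180.0  # hypothetical 16GB card price
price_24gb_card = 970.0   # hypothetical 24GB card price
premium = (price_16gb_card - price_24gb_card) / price_24gb_card * 100
print(f"premium: {premium:.0f}%")  # roughly 22% with these placeholder numbers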
 
Now you know the benefit more VRAM gives: everything is smoother.

IMO 16GB won't be enough. Going by what's happening with my previous GPUs, DLSS is clearly now a requirement for my old 3070 and 3080; SOW is claiming the 3080 is only capable of 60fps at High settings with DLSS Quality at 1440p, and plenty more info is out there at this point.

8/10GB was never enough then, 12GB isn't now, and 16GB isn't a done deal IMO. That's why I went for the 7900 XTX's 24GB, and I'll probably hold on to this one. 16GB going forwards for 22% more money? **** that.

Edit: it's not a slight on anyone's purchase, everybody should and will enjoy whatever they have. :thumbsup
If I'm not mistaken, 8/10GB began having issues when the current consoles came out, as they gave developers more room to play around with (and also skip the optimisation part). 16GB will start having problems once the next-gen consoles launch. Moreover, Nvidia seems to require less VRAM (at least going by its usage in games) than AMD.
 
It all went downhill when AMD tried to position themselves as the premium brand charging premium prices; the problem is there was already a premium player in the market that makes better products with stronger features.

Pretty much this.

Had basically every AMD GPU generation from the 3850 to the Vega 56, but since then the gap has only widened more and more. Originally it was mostly DLSS and RT grunt in the handful of games at the start, but here we are now where we have:

- Ray Reconstruction, which not only provides better IQ but also slightly better performance
- RTX HDR, an absolute must-have for any HDR fan; so many games ship with **** native HDR implementations, usually raised blacks, which is a no-go on OLED
- RT grunt/hardware design; basically every game now uses some form of RT and it shows, because Ampere is pulling even further ahead of its competing RDNA 2 tech, usually being on par with the 7900 XT in the ones that use hardware RT

Yes, you might pay a "premium" for Nvidia kit (much less of a premium nowadays compared to the ATI/early-AMD days, though, and depending on how and where you shop), but at the end of the day Nvidia tech just simply works better. Recently I moved from a G-Sync Ultimate monitor to a G-Sync Compatible one (basically the same kind of monitor, AW34DW to AW32), and the G-Sync module is far superior to adaptive sync/FreeSync/G-Sync Compatible: far superior handling of VRR below 60fps (I also noticed this with my previous FreeSync Premium screen before moving to the G-Sync Ultimate OLED one), it's much smoother, and it has considerably less flickering. As shown by RTINGS, it's the only OLED screen on the market not to face severe flickering issues below 60fps thanks to the G-Sync module; said monitor gets a 9+ rating while the FreeSync version gets a 5 rating for VRR flicker.
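What good VRR handling below 60fps boils down to is low framerate compensation: repeating frames so the effective refresh stays inside the panel's VRR window. A rough conceptual sketch of the idea (an illustration only, not the G-Sync module's actual algorithm; the 48-240Hz window is an assumed example range):

# Conceptual sketch of low-framerate compensation (LFC): when the game's frame
# rate drops below the panel's minimum VRR refresh, each frame is shown more
# than once so the panel keeps refreshing inside its VRR window.
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 240.0) -> float:
    """Return the refresh rate the panel would run at for a given frame rate."""
    if fps >= vrr_min:
        return min(fps, vrr_max)          # frame rate fits the window directly
    multiplier = 2
    while fps * multiplier < vrr_min:      # repeat each frame until inside the window
        multiplier += 1
    return min(fps * multiplier, vrr_max)

for fps in (30, 40, 45, 55, 120):
    print(f"{fps} fps -> panel refresh {lfc_refresh(fps):.0f} Hz")

How cleanly that frame repetition is handled (and how little the panel voltage swings as the refresh jumps around) is essentially what separates a good sub-60fps VRR experience from a flickery one.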

The one thing where AMD has arguably matched Nvidia now is frame gen (Nvidia FG, as evidenced by Alex/DF, still has the lead from a technical perspective for things like latency and IQ when comparing side by side), as long as devs can implement it well, which is the problem... Ghost of Tsushima is the best AMD FG experience so far, followed by using the mod version.

If I'm not mistaken, 8/10 GB began having issues when the current consoles came out as they have more room for the developers to play around (and also miss the optimization part). 16GB will start having problems once next gen consoles will launch. Moreover, nVDIA seems to require less vRAM (at least going by its usage in games) than AMD.

It's much less of an issue for 10GB still. I have yet to face all these supposed issues (if we're not talking about well-known broken games on launch day which have since been fixed/patched) with 10GB at both 3440x1440 and 4K, outside of self-sabotaging myself in a certain game; the more legit cases are CP 2077 when modded with textures and AW 2, and even then having more VRAM like the 3090 here would not have helped, since grunt is also a problem in those two games when maxed. My main problem is lack of grunt, as proven now by all the benchmarks out there: 24GB hasn't saved a 3090 outside of a couple of launch-day titles last year (which, as noted, were broken and later resolved), and it could be argued even the current-gen top-end GPUs lack grunt without having to resort to upscaling and/or frame gen when playing at high res and/or high Hz, hello Hellblade 2... The good news is that this year things seem much better on the VRAM front: UE5 titles, as evidenced, have been using low amounts of VRAM, even less than 8GB dedicated, and Ubisoft has moved to a much better engine that properly handles VRAM management, as shown in Avatar and explained by Alex and the Snowdrop engine developers.

Since getting the AW32 QD-OLED 4K, I game more at this res than on my AW34 3440x1440 now, where I used DLDSR. Games run and look better at 4K with DLSS Performance than they did at native 3440x1440, with DLSS Quality, or with DLDSR 1.78x plus DLSS Performance (DLDSR has an overhead cost). Of course I'm not getting anywhere near 240Hz/fps no matter what settings are used, hence why a 50xx is sorely needed now.
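The reason 4K with DLSS Performance can beat native 3440x1440 comes down to the internal render resolution. A quick back-of-the-envelope comparison, assuming the standard DLSS scale factors (Quality ~0.667x per axis, Performance 0.5x per axis) and ignoring the DLDSR overhead mentioned above:

# Internal render load, in megapixels, for the setups mentioned above.
def megapixels(w, h, axis_scale=1.0):
    return (w * axis_scale) * (h * axis_scale) / 1e6

print(f"4K + DLSS Performance:    {megapixels(3840, 2160, 0.5):.2f} MP")
print(f"3440x1440 native:         {megapixels(3440, 1440):.2f} MP")
print(f"3440x1440 + DLSS Quality: {megapixels(3440, 1440, 2/3):.2f} MP")
# DLDSR 1.78x renders ~1.78x the pixels first, then DLSS Performance halves each axis
print(f"DLDSR 1.78x + DLSS Perf:  {megapixels(3440, 1440, 1.78**0.5 * 0.5):.2f} MP")

So 4K DLSS Performance renders roughly 2.1 MP internally versus nearly 5 MP at native 3440x1440, which is why it runs better while the 4K output still looks sharper.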

EDIT:

As also noted before, thanks to things like FG and upscaling, GPUs are lasting longer too by being able to max settings. mrk has shown this beautifully with CP 2077, where max raster at native 4K is a considerably worse experience on both the visual and performance fronts than using DLSS frame gen and upscaling with path tracing; if we didn't have such "cheating" software features, this simply would not be possible on any current GPU, nor for the foreseeable future either. Funnily enough, the AMD/console fans are lapping up this kind of tech now..... ;) :p :D
 

Spot on. Hopefully AMD get the message and start to go back to pricing cards properly while trying their best to catch up with Nvidia.
 