NVIDIA 4000 Series

I'm glad we agree, the 4090 is the only obvious choice :)

:p
But mainstream dGPU buyers like me and everyone I know are constrained by budget. I know of nobody who would spend £300 on a dGPU who will spend £600. So ultimately there is no point upgrading if the improvement is minimal. No wonder dGPU sales have collapsed.
 
Shame the 6700XT refuses to hit £300, it'd be a damn good buy at that for the mid-range.
 
The generational uplift is there. The RTX4080 is around 75% faster than the RTX3070TI at QHD:

The RTX4070TI is around 76% faster than an RTX3060! The RTX4090 is around 65% faster than the RTX3090 at 4K:

The RX7900XT is around 50% faster than an RX6900XT. The big problem is that instead of giving users a big generational uplift for the tiers under the RTX4090, they simply used it to jack up prices.

The RTX3090 was around 56% faster than the RTX2080TI at 4K:

The RTX3080 was around 68% faster than an RTX2080 at 4K looking at the above TPU figures.
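For anyone wanting to sanity-check figures like the above, a "% faster" number is just the ratio of the two cards' average FPS (or relative-performance scores) minus one. A minimal sketch in Python with made-up FPS values, purely for illustration and not taken from TPU:

```python
# Minimal sketch: how a "% faster" figure falls out of average FPS results.
# The FPS numbers below are placeholders for illustration, not TPU data.

def percent_faster(new_fps: float, old_fps: float) -> float:
    """How much faster new_fps is than old_fps, in percent."""
    return (new_fps / old_fps - 1.0) * 100.0

old_card_fps = 60.0   # hypothetical last-gen average at 4K
new_card_fps = 99.0   # hypothetical current-gen average at 4K

print(f"{percent_faster(new_card_fps, old_card_fps):.0f}% faster")  # prints "65% faster"
```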

The names are meaningless. Nvidia can scribble anything they want on the box.

If they can't provide ~30% generational improvement at a given price point, it's a fail.

Allowing for inflation, and considering that today's $599 is not worth as much as it was when Ampere launched, let's see if Nvidia can manage a 25% improvement over the 3070Ti.
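To make the arithmetic behind that explicit, here's a quick sketch; the cumulative-inflation figure is a rough assumption of my own for illustration, not an official statistic:

```python
# Back-of-the-envelope sketch of the inflation point above.
# The 15% cumulative-inflation figure is an assumption for illustration only.

ampere_msrp = 599.0        # 3070 Ti launch MSRP in USD
assumed_inflation = 0.15   # assumed cumulative inflation since the Ampere launch

# The nominal price a card would need today to match the 3070 Ti's real launch
# price, and what today's $599 is worth in Ampere-era money.
equivalent_price_today = ampere_msrp * (1 + assumed_inflation)
todays_599_in_old_money = 599.0 / (1 + assumed_inflation)

print(f"${ampere_msrp:.0f} at Ampere's launch ~= ${equivalent_price_today:.0f} today")
print(f"$599 today ~= ${todays_599_in_old_money:.0f} in Ampere-era money")
```

In other words, the real-terms bar a new $599 card has to clear is a bit lower than the headline ~30%, which is roughly the point being made above.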

Exactly, as I did say:

Which is now the mid-tier and lower issue. You're getting nowhere near enough gains for the gen shrink, at least not at these prices. Turing looks like losing its title as the worst release.

So with last-gen stock still commanding silly prices, your improvements should also focus on raster performance, not RT and fake frame generation.
 
Anything under 40% is a useless generational improvement IMHO.
I'm just talking about advancement from a technology standpoint, not "Is it worth upgrading?"

Now that VR is the most difficult load I put on a GPU, I want something like >50% before I'm tempted to upgrade. (Although I don’t expect it at last-gen's same price point if the jump is that big)

I get no benefit over 90 FPS, but avoiding dropped frames and keeping my minimum frame rate comfortably over 90 FPS is valuable to me. I almost totally ignore average FPS these days and look at the 1% and 0.1% lows.
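For anyone unfamiliar with those metrics, 1% and 0.1% lows are usually derived from recorded frame times rather than the FPS counter. A minimal sketch of the idea (my own illustration; real benchmark tools like CapFrameX or OCAT differ in the exact method):

```python
# Minimal sketch: deriving 1% / 0.1% low FPS from a list of frame times.
# My own illustration of the idea; actual capture tools differ in details.

def low_fps(frame_times_ms: list[float], fraction: float) -> float:
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 for 1% lows)."""
    slowest_first = sorted(frame_times_ms, reverse=True)
    count = max(1, int(len(slowest_first) * fraction))
    worst = slowest_first[:count]
    return 1000.0 / (sum(worst) / len(worst))

# Hypothetical capture: mostly ~10 ms frames (~100 FPS) with a few stutter spikes.
frame_times = [10.0] * 995 + [25.0, 30.0, 28.0, 40.0, 35.0]

average_fps = 1000.0 * len(frame_times) / sum(frame_times)
print(f"Average FPS:  {average_fps:.1f}")                  # ~98.9, looks fine
print(f"1% low FPS:   {low_fps(frame_times, 0.01):.1f}")   # ~48, shows the stutter
print(f"0.1% low FPS: {low_fps(frame_times, 0.001):.1f}")  # 25, the single worst frame
```

Which is why a healthy-looking average can still hide exactly the dropped frames being described above.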
 

The generational uplift is there, if you normalise tier to tier. You are talking about 65% to 75%, but Nvidia just decided to jack up pricing at the same time, so it actually looks poorer than it is.

So at this point, even if you do increase your budget, for many of us mainstream dGPU buyers it looks a poor generation. AMD also is realising they can get away with pricing their new generation equivalents only slightly better than Nvidia too! :(
 
8GB should be banned on £200+ cards in 2023. Hope reviewers can the 4060s.

The title of the video says it all...

The Last of Us Part I, RIP 8GB GPUs! Nvidia's Planned Obsolescence In Effect


Finally a review site uses that term, one most of us here have been using since day one with Nvidia's games.

 
The generational uplift is there, if you normalise tier to tier. You are talking about 65% to 75%, but Nvidia just decided to jack up pricing at the same time, so it actually looks poorer than it is.

So at this point, even if you do increase your budget, for many of us mainstream dGPU buyers it looks a poor generation. AMD also is realising they can get away with pricing their new generation equivalents only slightly better than Nvidia too! :(
They are trying to make it look like they can "get away with it." But we don't know if they actually are.

I suspect stuff isn't selling as well as they had hoped.
 
I think hiding in bushes was intended by the developer. Using the in-game settings is fine, but when you do scummy things like that to gain a massive advantage over others, anyone doing it should've been permabanned imo.

Boo this man, "boooooooooooooooo". I've changed settings for clarity like in Battlefield games but I don't go beyond what everyone else can use as I don't want a massive advantage in my favour.

Everybody had access to the console commands; you just needed to google it and be okay with playing at barf settings.

It wasn't long before servers and leagues started specifying what the minimum settings were and whether changing settings via console command was allowed, because it made a massive difference to where people could hide and how much cover was available.

In the initial leagues I played in, everybody was doing it, so it was at least equal on that front, but when I went into public games between matches then yes, I did have a huge advantage. I remember someone trying to hide in a bush opposite a bunker thinking they were really well concealed (which they would have been at normal settings), but with my settings it was like they were stood out in the open. They were so confused when I kept shooting them.

Also you can booo me all you want. I was a teenager back then and a bit of a troll so I just found it really funny at the time.
 
8GB should be banned on £200+ cards in 2023. Hope reviewers can the 4060s.


Seeing the $600 3070 Ti get under 20 FPS at 1080p is hilarious.

And these numbers alone don't do it justice; watch the gameplay in the HUB video and the 3070 is stuttering like mad, it's completely unplayable.
 
8GB should be banned on £200+ cards in 2023. Hope reviewers can the 4060s.

It's interesting that the settings menu at 3:26 suggests 11.4GB usage for the 4090 (along with 4.8GB for the OS & other apps?!), whereas his VRAM page (done on a 6950XT) suggests 14.6GB usage.

We all knew this would come back to bite us in the **** at some point, though fingers crossed it's down to a bad port and not the start of a trend. At least not yet.
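As an aside on measuring this yourself: on NVIDIA cards you can read GPU-wide VRAM usage through NVML. A minimal sketch using the pynvml Python bindings (my own example, assuming an NVIDIA GPU with pynvml installed; note it reports usage across the whole GPU, including the OS/compositor and other apps rather than just the game, and AMD cards like the 6950XT need a different tool):

```python
# Minimal sketch: reading total/used VRAM on an NVIDIA GPU via NVML.
# Assumes the NVIDIA driver and the pynvml package (pip install pynvml).
# Reports GPU-wide usage, so the OS and other apps are included in "used".

import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    name = pynvml.nvmlDeviceGetName(handle)
    if isinstance(name, bytes):                     # older bindings return bytes
        name = name.decode()
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    gib = 1024 ** 3
    print(f"{name}: {mem.used / gib:.1f} GiB used of {mem.total / gib:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```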
 
Everybody had access to the console commands; you just needed to google it and be okay with playing at barf settings.

It wasn't long before servers and leagues started specifying what the minimum settings were and whether changing settings via console command was allowed, because it made a massive difference to where people could hide and how much cover was available.

In the initial leagues I played in, everybody was doing it, so it was at least equal on that front, but when I went into public games between matches then yes, I did have a huge advantage. I remember someone trying to hide in a bush opposite a bunker thinking they were really well concealed (which they would have been at normal settings), but with my settings it was like they were stood out in the open. They were so confused when I kept shooting them.

Also you can booo me all you want. I was a teenager back then and a bit of a troll so I just found it really funny at the time.
I understand to a certain level. In the very first CoD games we'd use certain console commands to remove the 91 FPS lock and set it higher: 125, 250, 333, 500 & 1000. 125 was so you could make certain jumps, 333 gave the highest jumps, but 500 & 1000 would remove every 2nd or 3rd footstep sound and you'd be silent climbing ladders, so most just used 125-333. We openly talked about it on the servers though, so most people knew and used them too. I'd just feel dirty having a visual advantage like that, but as you said, you were a teenage mutant ninja troll at the time :D. At least there was a more even playing field in the league matches.

Bet the dude in the bush was thinking you were hacking lol.
 
The title of the video says it all...

The Last of Us Part I, RIP 8GB GPUs! Nvidia's Planned Obsolescence In Effect


Finally a review site uses that term, one most of us here have been using since day one with Nvidia's games.



Right at the end, Steve says that the game does run really well on 8GB, just not on the ultra quality preset. You probably won't see the difference from the next setting down. Plus, anyone who PC games wouldn't use a preset anyway. If you do, get a console, if all the graphical settings really aren't worth playing with to you.

I get what you are saying, but to me this guy is paid by clicks for content, and the best way to get clicks is to be more controversial than all the other sites he's in competition with, to get people to go to your content around the game. I just don't think this game, in its current state, is worth claiming that gfx cards are dead over. But it could be NV & AMD are in cahoots with game devs, so that whilst stupid people pay MSRP for this game now, they'll also believe that they need a new gfx card. I don't know who paid MSRP or cheaper on legit key sites, but the Steam reviews just make me wait. I don't hate this game and I want to play it, but I don't want to play it in its current state, when I know that when it's cheap, it WILL play like it's meant to be played.

The only problem I have with this is that the game in its current state just isn't worth a review, particularly with reference to making gfx card purchases. You only have to look at the Steam reviews OF THE ACTUAL GAME to see what's happening. Fair enough if you believe what he is saying and you need to upgrade your gfx card at great expense because you must play this beta now. But to me, he and all the other so-called expert review sites get paid by content - they just want clicks. Like reality TV - it isn't reality, it is highly scripted and edited to display controversy, because that's what people with no critical thinking will be drawn to.

The mindset that people need to go out and buy new gfx cards to play a beta or alpha quality release is baffling. Then they moan about the prices of cards they don't need. It isn't AMD or NV ripping you off, it's yourselves, believing that AAA games are well optimised at release, like all the other AAA releases over the last year or two. You deserve to be ripped off, once when buying the game on release and again when upgrading a gfx card to play it.

It's a clever trick that someone can make it on the internet - anyone can. But they are all in competition to keep those clicks, subscribers etc. for revenue. They don't get it right all the time either. Like when JayzTwoCents tried putting the voltages of the previous gen of AMD CPUs through the Ryzen 3000s, which were designed for a lower voltage, and when it didn't work, threw his hands up and said they were rubbish. One end of the spectrum, I know, but please, these people are after your clicks - and they generate more clicks when folk post links in forums.

Using the performance of games on release as a yardstick for gfx card purchases is futile. Check Steam reviews before buying the game; don't buy the game first and then believe you need an upgrade. If you use a preset because you can't fathom all the graphical settings available or just can't be bothered, you don't deserve to call yourself a PC gamer. The settings are there to be used - presets are for console users coming to PC, to give you a selection of starting points - fettle from there.

The spoiler link that will apparently upset everyone has loads of old games that all run fine on a 3080. He doesn't allude to any of the games like FC6 or Hogwarts. The Twitter page by Kepler where you drew that spoiler from has 4,000 followers, and you've landed on that and thought: it's on the internet, innit! Must be true. But it says right there that they work with game devs to find out what they need - maybe game devs say it will only need X, but then find that, due to the pressure to release, it doesn't work as envisaged at release.

Of course this argument will go on and you will be right in the future. But right now, if you are using release-day performance as your guide and playing with a preset because you don't understand the gfx menus, then you will need to put a gfx upgrade ahead of your knowledge of PC gaming. When the next-gen games on, say, Unreal 5 are released - given a bit of time for optimisation - if VRAM issues are THEN common across next-gen engines, you really can start calling what the minimum VRAM is.

Using day-1 release performance as a yardstick - to anyone sane who can think for themselves and doesn't wholly believe influencers have their best interests at heart - is insanity. If you do, then fair enough; you deserve to be continually tricked into upgrading. You only have to look at 4090 performance, with the card not being utilised fully, to see that, performance-wise, the game just isn't there yet regarding optimisation, as it isn't using what's available and stutters even on cards with 24GB.

Apologies for another wall of text.
 