NVIDIA 4000 Series

Right at the end, Steve says "the game does run really well on 8GB, just not on the ultra quality preset". You probably won't see the difference with the next setting down. Plus, anyone who games on PC wouldn't use a preset anyway. If you are, and all the graphical settings really aren't worth playing with to you, get a console.

I get what you're saying, but to me this guy is paid by clicks for content, and the best way to get clicks is to be more controversial than all the other sites he's competing with for traffic around the game. I just don't think this game, in its current state, is worth claiming that gfx cards are dead over. Then again, it could be that NV and AMD are in cahoots with the game devs, so that while people pay MSRP for this game now, they'll also come to believe they need a new gfx card. I don't know who paid MSRP or got it cheaper on legit key sites, but the Steam reviews just make me wait. I don't hate this game and I want to play it, but I don't want to play it in its current state, when I know that by the time it's cheap it WILL play like it's meant to be played.

The only problem I have with this is that the game, in its current state, just isn't worth a review, particularly one used as a reference for gfx card purchases. You only have to look at the Steam reviews OF THE ACTUAL GAME to see what's happening. Fair enough if you believe what he's saying and you need to upgrade your gfx card at great expense because you must play this beta now. But to me, all these so-called expert review sites get paid by content - they just want clicks. Like reality TV: it isn't reality, it's highly scripted and edited to manufacture controversy, because that's what people with no critical thinking will be drawn to.

The mindset that people need to go out and buy new gfx cards to play a beta- or alpha-quality release is baffling. Then they moan about the prices of cards they don't need. It isn't AMD or NV ripping you off, it's yourselves, believing that AAA games are well optimised at release, despite all the AAA releases of the last year or two. You deserve to be ripped off twice: once buying the game at release, and again upgrading a gfx card to play it.

It's the clever trick of the internet that anyone can make it on there. But they're all in competition to keep those clicks and subscribers for revenue, and they don't get it right all the time either. Like when JayzTwoCents put the voltages from the previous generation of AMD CPUs through the Ryzen 3000s, which were designed for a lower voltage, and when it didn't work, threw his hands up and said they were rubbish. One end of the spectrum, I know, but please - these people are after your clicks, and folk posting links in forums just generates more of them.

Using the performance of games at release as a yardstick for gfx card purchases is futile. Check the Steam reviews before buying the game; don't buy the game first and then believe you need an upgrade. If you use a preset because you can't fathom all the graphical settings available, or just can't be bothered, you don't deserve to call yourself a PC gamer. The settings are there to be used; presets are for console users coming to PC, a selection of starting points to fettle from.

The spoiler link that will apparently upset everyone has loads of old games that all run fine on a 3080. He doesn't allude to any of the games like FC6 or Hogwarts. And the Twitter page by Kepler that you drew that spoiler from has 4,000 followers, and you've landed on it and thought: it's on the internet, innit! Must be true. But it says right there that they work with game devs to find out what they'll need - maybe the devs say it will only need X, then find that, under pressure to ship, it doesn't work at release the way they envisaged.

Of course this argument will go on, and you will be right in the future. But right now, if you're going by release performance and playing with a preset because you don't understand the gfx menus, then you will need to put gfx upgrades ahead of your knowledge of PC gaming. When the next gen of games on, say, Unreal 5 are released, and they've been given a bit of time for optimisation - if VRAM issues are THEN common across next-gen engines, you really can start calling what the minimum VRAM is.

Using day-one release performance as a yardstick is, to anyone sane who can think for themselves and doesn't wholly believe influencers have their best interests at heart, insanity. If you do, then fair enough - you deserve to be continually tricked into upgrading. You only have to look at 4090 performance, with cards not being fully utilised, to see that the game just isn't there yet optimisation-wise: it isn't using what's available, and it stutters even on cards with 24GB.

Apologies for another wall of text.
Whilst I do not know this with any certainty or confidence, I'm concerned that any modern game from a large developer ships with ultra settings that add no noticeable improvement to visual fidelity but that are, perhaps intentionally, computationally heavy, so as to lower performance and make consumers question their hardware without any notion of doing a cost-benefit analysis on using such settings. I've played about with a few games and I seem to always find that I can knock settings down from ultra to high without being able to see a visual difference. I'll happily take another 10 or 20 FPS at 4K and suffer that thing in the distance being ever so slightly less refined when it catches my attention 0.1% of my play time.

Not to derail the conversation, but I also think the same thing is happening with RTX, where it's used to push new hardware because of the performance hit. Setting aside that I've yet to see RTX used in a way that makes any game noticeably better during normal play, I think there has been such a fixation on ray tracing as the Holy Grail of lighting and illumination for so many years (decades, really) that no one has stopped to question whether traditional techniques have improved to the point that the difference between the two methods in visual fidelity is marginal.
 
I'm happy to wait. I'm not a sweaty nerd living in the basement who needs a GPU :p
 
Sure, it's rubbish that prices keep going up, but if the 4070 hits the top of its rumoured performance (i.e. between the 3090 and 3090 Ti), that isn't going to be absolutely terrible for a $599 card.

It would be even better if $599 still equated to more like £500, as it did before Brexit plummeted our currency into oblivion...
 
The mindset is indeed baffling. I mean, you only needed a 10GB card, which was perfectly fine the other day, no issues; and now you own a 24GB card at three times the price.

Not as baffling as you slating Nvidia and Nvidia users for paying silly money over and over, then spending £1,400 on a 3090 FE, and then carrying on with your fingers in your ears like it never happened :cry:

 
I don't know why new games don't just lock the texture setting based on your VRAM if it's an issue. Didn't DOOM do that back in 2016?

Anyway, 10GB is still fine for me until the 5080. RE4 is the first game where I've seen an issue, and I resolved it myself in 10 minutes in the demo by playing with the settings (RT off).
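
For what it's worth, that kind of cap is simple to sketch: query the adapter's dedicated VRAM and clamp the highest selectable texture tier. Below is a minimal Windows/DXGI example; the TextureQuality enum and the GB thresholds are illustrative assumptions, not what DOOM (or any shipped game) actually uses.

```cpp
// Minimal sketch: cap the texture-quality setting by dedicated VRAM.
// Windows/DXGI only; thresholds are illustrative, not from any real game.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

enum class TextureQuality { Low, Medium, High, Ultra };

// Highest texture tier the card can hold (hypothetical thresholds).
TextureQuality MaxTextureQuality(SIZE_T dedicatedVram) {
    const SIZE_T GiB = 1024ull * 1024ull * 1024ull;
    if (dedicatedVram >= 12 * GiB) return TextureQuality::Ultra;
    if (dedicatedVram >= 10 * GiB) return TextureQuality::High;
    if (dedicatedVram >= 6 * GiB)  return TextureQuality::Medium;
    return TextureQuality::Low;
}

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter* adapter = nullptr;  // adapter 0 = primary GPU
    if (FAILED(factory->EnumAdapters(0, &adapter))) {
        factory->Release();
        return 1;
    }

    DXGI_ADAPTER_DESC desc = {};
    adapter->GetDesc(&desc);
    std::printf("Dedicated VRAM: %llu MiB -> max texture tier %d\n",
                static_cast<unsigned long long>(desc.DedicatedVideoMemory) / (1024 * 1024),
                static_cast<int>(MaxTextureQuality(desc.DedicatedVideoMemory)));

    adapter->Release();
    factory->Release();
    return 0;
}
```

A real engine would more likely poll IDXGIAdapter3::QueryVideoMemoryInfo for the live memory budget rather than the static adapter description, but the idea is the same.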
 
I'm not sure about RE4, but in The Last of Us, if you turn the settings down the graphics become worse than a console's, and I don't know about you, but some people get a little angsty when their $1,000 GPU runs and looks worse than a console.
 
8GB should be banned on £200+ cards in 2023. Hope reviewers pan the 4060s.

Nvidia clearly don't want to be charged more by their GDDR6/X memory chip suppliers unless they can charge even more astronomical prices for their GPUs. You can upgrade the memory on a lot of devices, so why not GPUs? There are plenty of GPUs with different VRAM configs tied to their bus width, like the 12GB/6GB RTX 2060 or the 3GB/6GB GTX 1060. I say bring back standardised, upgradeable VRAM sticks, even though it's a technical hurdle at the moment.
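
Those paired configs are just arithmetic: GDDR5/GDDR6 chips have 32-bit interfaces, so a 192-bit bus takes six chips, and total capacity is six times the per-chip density. A minimal sketch of that relationship (assuming one chip per 32-bit channel, i.e. no clamshell mode):

```cpp
// VRAM capacity follows from bus width and per-chip density:
// a 192-bit bus / 32-bit chip interfaces = 6 memory chips,
// and total capacity = chips * per-chip capacity.
#include <cstdio>

int main() {
    const int busWidthBits = 192;          // RTX 2060 / GTX 1060 class
    const int chips = busWidthBits / 32;   // = 6 memory chips
    const int densitiesGiB[] = {1, 2};     // 8 Gb vs 16 Gb chip densities
    for (int chipGiB : densitiesGiB) {
        std::printf("%d-bit bus with %d GiB chips -> %d GB card\n",
                    busWidthBits, chipGiB, chips * chipGiB);
    }
    return 0;
}
```

That prints the 6GB and 12GB RTX 2060 configs; the 3GB GTX 1060 is the same six chips at 4Gb (0.5GB) density, which is why capacities only come in the steps the bus width allows.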
 
The fact that it looks like Nvidia is skimping on RAM again in 2023 on its new cards is a sick joke.

Memory has fallen in value so much that Micron just posted a $2 billion loss, because memory prices have tanked and volumes have fallen. In fact, so far this year wholesale memory prices have fallen 10% a month!

So Jensen can't stand there and say he can't give gamers more VRAM. Take a hike, douche.
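
To put that monthly figure in cumulative terms - assuming the quoted 10%/month rate holds steady, which is an assumption, not a reported number:

```cpp
// Cumulative effect of a steady 10%/month wholesale price decline.
#include <cstdio>
#include <cmath>

int main() {
    const double monthlyDrop = 0.10;   // 10% per month, as quoted above
    const int monthsList[] = {1, 3, 6};
    for (int months : monthsList) {
        double remaining = std::pow(1.0 - monthlyDrop, months);
        std::printf("After %d month(s): %.0f%% of the starting price\n",
                    months, remaining * 100.0);
    }
    return 0;
}
```

At that rate, after six months wholesale prices would be roughly half (about 53%) of their January level.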
 
You will be first in line to buy that douche's graphics card next year when the 5090 is released, even if it still has 24GB again :p
 
The title of the video says it all...

The Last of Us Part I, RIP 8GB GPUs! Nvidia's Planned Obsolescence In Effect


Finally a review site uses that term - the one most of us here have been using since day one about the games Nvidia plays.

What I love about this video is that it clearly demonstrates the Nvidia "fine wine".

A 3080 at 1440p or 4K high basically matches a 6800 XT at 1440p or 4K MEDIUM.

Sorry, but I find that extremely hilarious considering the title of the video is about Nvidia's planned obsolescence.
 