
12GB VRAM enough for 4K? Discuss..

But But :p

Nvidia was trying to sell it as a 4080, not a 4070 Ti :p

It only got renamed because of the outrage from the internet. Also look at the 4060 Ti (128-bit bus and fewer CUDA cores than the previous-gen card it replaces): its specs are below the 3060 Ti's (256-bit bus and more CUDA cores). What we would have called xx50 class or less back in the day...


Good point, it wasn't even a 70(ti) matching xx90, it was a xx60(ti) :p ;)

But agreed, AMD and Nvidia are both proper dodgy with their reshuffling of the tiers. I'm not really bothered what a GPU is labelled as, as long as the price-to-performance is there, which for all these current GPUs it simply isn't.
 
Exactly. Let's be honest, the 4080 12GB was really what should have been the 4060 Ti; they just wanted to be extra greedy and call it a 4080, then renamed it to the 4070 Ti with a so-called $100 price cut that, in the real world, isn't even there in the street price.

I'm the same: I don't really care about the naming, but I do when they use it to mislead us and unaware customers.
 
Nvidia was trying to sell it as a 4080, not a 4070 Ti :p For $100 more, and we know the fake MSRP game too..


Also stagnation, or as it's sometimes known, regression. :p


But but nvidia good!
 
BUT BUT 8K... :rolleyes:

I agree it's all getting silly again, with both teams now trying to push 8K while 4K is still a problem even for their new cards in new games.

But they want to push 4x the pixels of 4K, or sell "half 8K" monitors, to push the next graphics cards, which can't drive that many pixels without tricks. I actually really like the half-8K ultrawide screens, but maybe I'll look at them when the 5090 or 6090 comes out, as I like the format, the pixel count and the very large sizes they come in.
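Quick pixel-count math for those claims (a rough sketch, assuming 8K UHD means 7680x4320 and "half 8K" ultrawide means a 7680x2160 panel):

```python
# Pixel counts relative to 4K UHD. Resolutions below are assumptions:
# 8K UHD = 7680x4320, "half 8K" ultrawide = 7680x2160.
resolutions = {
    "4K UHD": (3840, 2160),
    "half 8K ultrawide": (7680, 2160),
    "8K UHD": (7680, 4320),
}

base = 3840 * 2160  # 4K UHD pixel count
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x the pixels of 4K")

# Output:
# 4K UHD: 8.3 MP, 1.0x the pixels of 4K
# half 8K ultrawide: 16.6 MP, 2.0x the pixels of 4K
# 8K UHD: 33.2 MP, 4.0x the pixels of 4K
```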



The problem we all face right now is that the world doesn't have the money for such tech, and people are more worried about paying their bills than upgrading what they currently have. This forum isn't even a drop in the ocean of real computer users out there; most buy hardware and only think about replacing it when it dies or no longer does what they need. Here we are pretty extreme about tech and hardware, we're enthusiasts, but the real world is a very different market, and right now it's pushing people to breaking point, with not one thought about tech upgrades. Let's hope the world improves this year and in the coming years, as this will damage the market even for us enthusiasts. Sorry, but we will not be paying £3k for a GPU as Nvidia would love us to, and some GPU models are very close to that right now.
People need to have a vision, something to drive the roadmap.
The 4070 Ti seems to have an acceptable level of performance at 4K.. this is supposed to be an xx70-class mainstream GPU (in name only, but hold on to that thought for a while).
I think by next gen 4K will have gone fully mainstream, and then you'd need something to drive future efforts rather than wandering aimlessly..
And, you know, someone more responsible has got to show the path for AMD.
 
It's a list of excuses for nexus to use? Might recognise the template. Why are you asking me for a list of games?

I am asking you because this thread is about 12GB not being enough. Your post makes it sound like a huge list. What are these games?

So far all I see is Portal RTX, and that doesn't even make sense, as pointed out, because you need DLSS to make it playable, at which point you no longer need 16GB.
 
8K is, and is going to remain, pretty pointless for most people. The majority of 1440p and 4K 16:9 screens used by PC gamers have a high enough pixel density that 8K doesn't matter to them.

Where 8K can gain a market is in large-format displays, that is, people using TV-size screens as monitors and sitting close to them. But even this is niche, because most people who use a TV as a screen sit far enough away that pixel density doesn't matter. If you are one of those with a 48-inch 16:9 TV on your desk 45cm from your eyes, then yes, 8K will help.

The other area where 8K can help is ultrawides. Most, if not all, ultrawides on the market suffer from low pixel density, and higher resolutions would really help.


Potentially another area of growth for 8K is actually tiny screens, such as VR headsets; those would benefit the most from higher pixel density.
 
Not sure the density is any lower than on an equivalent 16:9 monitor. Take a 34" 1440p UW monitor running at 3440x1440: it has the same height as a 27" 16:9 monitor. If that monitor runs at 1440p (2560x1440), then the pixel density is the same.
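A quick PPI check backs that up (a minimal sketch; the diagonal sizes and resolutions are the ones mentioned above, and real panels vary slightly):

```python
import math

# Pixels per inch along the diagonal for the two monitors being compared.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'34" 3440x1440 ultrawide: {ppi(3440, 1440, 34):.1f} PPI')
print(f'27" 2560x1440 16:9:      {ppi(2560, 1440, 27):.1f} PPI')

# Output:
# 34" 3440x1440 ultrawide: 109.7 PPI
# 27" 2560x1440 16:9:      108.8 PPI
```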
 

I am assuming it is 16K per eye @ 240Hz? Supposedly the holy grail.. but there will be other diversions to account for while reaching that goal. And I doubt you could get there without DLSS, unless people are fine with 5kW GPUs..
You can get 16K/240... if you're ready to go back 20 years in image quality in every other respect.
Has anyone tried to fire up Team Fortress 2 at 8K or above to see what kind of performance they get?
 
I'm not sure about needing 240Hz, but yes, you need 16K per eye to match or beat the image quality of a 4K TV, and that's because of how close your eyes are to the screens in the headset.

The highest I've tried is a 4K-per-eye headset, and that had the image quality of sitting 1 metre away from something between a 720p and a 1080p TV.
 
You can get 16K/240... if you're ready to go back 20 years in image quality in every other respect.
Has anyone tried to fire up Team Fortress 2 at 8K or above to see what kind of performance they get?


You don't need to go back 20 years for a 16:9 aspect ratio. You can do 16K 16:9 at 30-40fps in Grand Theft Auto V with an RTX 4090.

 

The statement was made in the DX12 era, before RT, which I would assume is the baseline for scene complexity; now I think you'd have to do 16K x 16K with RT. I mean, you could achieve those numbers if all that's being rendered on screen is a single box, but that's not a reasonable assumption.
 
I am asking you because this thread is about 12GB not being enough. Your post makes it sound like a huge list. What are these games?

So far all I see is Portal RTX, and that doesn't even make sense, as pointed out, because you need DLSS to make it playable, at which point you no longer need 16GB.
Maybe read the post a few more times
 

It's less than a handful of games so far, and half of them are not even worth playing. Let's be honest and see what games come out that really need more than 12GB or even 10GB (and that are actually worth playing and making a fuss over). Developers are always going to target what the market owns anyway, as we know, and any odd game with extreme requirements so far has turned out to be a dud, using those extreme requirements to get free marketing; in the real world, turning one or two settings down has made them playable at acceptable frame rates and quality.

Nvidia and AMD would love to push the requirements up and up to carry on selling us their new wares, with the planned-obsolescence games they also play. So far AMD has shown its true colours with the games they are sponsoring, and how those target Nvidia card users without enough VRAM for the ULTRA settings at high resolutions.
 
Maybe read the post a few more times

Problem is it doesn't work as well when we have one very questionable game, and even if there were, say, up to 5-10 games that genuinely require more VRAM over grunt, it still doesn't outweigh all the RT titles we have, i.e. where Nvidia performs better in RT than AMD:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by nvidia, not fair
- hardly any games have RT so who cares
- who on earth enables RT

Essentially every RT game except for FC 6

:D

It's less than a handful of games so far, and half of them are not even worth playing. Let's be honest and see what games come out that really need more than 12GB or even 10GB (and that are actually worth playing and making a fuss over). Developers are always going to target what the market owns anyway, as we know, and any odd game with extreme requirements so far has turned out to be a dud, using those extreme requirements to get free marketing; in the real world, turning one or two settings down has made them playable at acceptable frame rates and quality.

Nvidia and AMD would love to push the requirements up and up to carry on selling us their new wares, with the planned-obsolescence games they also play. So far AMD has shown its true colours with the games they are sponsoring, and how those target Nvidia card users without enough VRAM for the ULTRA settings at high resolutions.

TBF, I think it is also largely because of how AMD handle VRAM management / texture loading; it's clear that they aren't optimised anywhere near as well as Nvidia when it comes to VRAM usage, so they do seem to need more VRAM, perhaps?

 
Problem is it doesn't work as well when we have one very questionable game, and even if there were, say, up to 5-10 games that genuinely require more VRAM over grunt, it still doesn't outweigh all the RT titles we have, i.e. where Nvidia performs better in RT than AMD

It works fine, you can already see it hitting points on the list: only one game, sponsored by AMD, nobody plays that game, it's crap :cry:

Just lower your texture quality settings and move on.
 