
12GB vram enough for 4K? Discuss..

There's a massive thread about whether 10GB is enough for the 3080, and to be fair it's mostly about Far Cry 6. In my experience the benchmark uses about 11GB max at 2160p max settings, which will cause the 3080's FPS to divebomb. A 4070 Ti should be fine, but there could be other stuff in the background using up VRAM.
Yeah, I get that FC6 is the worst game to draw conclusions from, but this still doesn't make sense. If 12GB isn't enough in the built-in benchmark then something went horribly wrong.

I know other games that use a ton of VRAM; for example, Warzone was using 19GB on my 3090, but the game ran super smooth on 8GB cards as well.

Performance shouldn't plummet like that when VRAM runs out; what usually happens is constant stuttering.
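A back-of-the-envelope sketch of the point above (my own illustration, with made-up but plausible buffer counts, not numbers from any benchmark): the fixed cost of 4K render targets is a few hundred MiB, a small slice of a 12GB budget, so whether 12GB is "enough" comes down almost entirely to texture/asset streaming. And overlays report *allocated* VRAM, which a game like Warzone may grab far beyond what it strictly needs.

```python
def buffer_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Memory for one full-resolution buffer, in MiB."""
    return width * height * bytes_per_pixel / 2**20

# Hypothetical 4K pipeline: triple-buffered swapchain (4 B/px),
# a depth buffer (4 B/px), and five G-buffer targets (8 B/px each).
w, h = 3840, 2160
targets = (3 * buffer_mib(w, h, 4)      # swapchain images
           + buffer_mib(w, h, 4)        # depth
           + 5 * buffer_mib(w, h, 8))   # G-buffer

print(f"4K render targets: ~{targets:.0f} MiB")              # roughly 443 MiB
print(f"Share of a 12GB card: {targets / (12 * 1024):.1%}")  # a few percent
```

So the 11GB the FC6 benchmark reports is dominated by the HD texture pack, not by rendering at 4K itself, which is why allocation numbers alone can't tell you whether a card will stutter.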
 
Yet we have to put up with conclusions from sponsored tripe like Cyberbug77 or Portal. :cry:
 
Portal is obvious, and I don't think anyone has said differently; even DF/Alex have called Nvidia out on it (oh wait, I forgot, they're shills!).

CP 2077: well, as shown, AMD is performing exactly how it should. Heck, AMD are obviously proud of their performance in it now, given it was one of the main titles in their event PR slides along with DL2. :cry:

Sadly, there are hundreds of other games with RT where AMD takes a hit no matter what. There's got to be a good excuse for each of them though; wait, let me get my list again:

- game is old, who still plays that
- game is ****, who on earth plays that
- game is sponsored by nvidia, not fair
- hardly any games have RT so who cares
- who on earth enables RT

:D
 
I agree about Portal, but Cyberpunk is perfectly fine. If you check the original reviews (like Hardware Unboxed's), it works fine on AMD GPUs; the 5700 XT was easily beating the 2060 Super in raster performance, for example.

Usually Nvidia-sponsored games run fine on AMD. AMD-sponsored games work like complete crap on Nvidia and Intel hardware. Remember Valhalla at low resolutions? The 3090 was losing to something like a 6700 XT. Remember Dirt or Forza?
 
And Far Cry 6 works great on my 4090. ;)

The claim that Nvidia-sponsored games usually run fine on AMD is highly questionable to say the least, but if you're new to PC gaming you might truly believe it. I guess you're too young to remember the GameWorks era, now that we're in the RTX era. It's all the same, just a different name on top.

Intel is slightly faster than AMD in Valhalla, but there's not much in it. Also, Valhalla is CPU-limited on Nvidia due to driver overhead, nothing more, nothing less. AMD has a hardware scheduler, and Smart Access Memory can work very well in those scenarios. Nvidia relies on software for scheduling, which has some disadvantages (and some advantages for DX11), and that's what you see here.
 
That doesn't seem to be the case. Anyhow, it looks like Nvidia treats its block scheduler as a trade secret, while you can find some info on the thread schedulers.

 

It's amazing that the 4090 still takes the top spot in Valhalla given how poorly the game generally runs on Nvidia hardware. Off the top of my head, I believe it runs 20% or 30% better on AMD hardware.

It will be interesting to test the new game later this year, given it's going to look completely different.
 
Really? Can you find me a recent Nvidia-sponsored AAA game that runs like crap on AMD? Because I can find a ton of them running like crap on Intel and Nvidia. When both consoles have AMD hardware, saying something is Nvidia-sponsored is just... yeah, whatever.
 
I've yet to test it on my 4090 and 7900 XTX, but it certainly runs better on the 4090 than it did on last generation's cards. I suspect the 7900 XTX will be faster if I push it, though.
 
When I had my 3080 10GB, playing FC6 at 1440p with the HD texture pack meant I had to turn off hardware acceleration for browsers/Steam/Discord so I had enough memory to run the game without any hitching, so it wouldn't have been enough at 4K.
 
Ah, that's where the global tech press findings are hiding. Some random on ******* with 616 followers. :cry:

It doesn't matter where the customers are; the facts are right there. Run a poll here or on any enthusiast forum and you'll find the same results. These are about as random as it gets, so I'm not sure why that matters or why the results don't matter to you. Also, you have a 24GB 4090, so I'm surprised you find those results odd too.

The truth hurts?
 