Running out of VRAM on RTX 3070?

Coming from years of AMD GPUs and being probably the biggest IQ snob there is (see all my posts on how **** LCD is and how superior OLED is ;) :p :D), I was expecting to be fully dissatisfied going to Nvidia, but I have not noticed much, if any, difference. There is a slight difference in gamma settings, I feel, but other than that, unless side by side and literally pixel peeping, I think what people notice is just placebo.

- Both my AMD and Nvidia cards were set to the correct RGB setting, etc.
- AMD has image sharpening enabled by default, which does help clarity, especially in games that use TAA; however, you can also enable sharpening with Nvidia and/or redux (see the sketch just below)
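As an aside, here's a minimal sketch of the unsharp-mask idea that driver sharpening filters build on (AMD's CAS and Nvidia's filter are fancier, contrast-adaptive variants; the image values below are made up for illustration):

```python
import numpy as np

# Unsharp mask: blur the image, then add back the detail the blur removed.
def unsharp_mask(img, amount=0.5):
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    # 3x3 box blur built from nine shifted views of the padded image
    blur = sum(padded[y:y + h, x:x + w]
               for y in range(3) for x in range(3)) / 9.0
    # Push pixels away from their local average -> edges gain contrast
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

img = np.array([[0.2, 0.2, 0.8, 0.8]] * 4)  # a soft vertical edge
print(unsharp_mask(img).round(2))           # edge contrast is boosted
```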
 
AMD has image sharpening enabled by default
Fact check: false. By default it is disabled.

If you select a profile customization after installing the drivers and launching the software, it is enabled by default. If you pick the default standard profile, it is disabled.
 
Fact check: false. By default it is disabled.

If you select a profile customization after installing the drivers and launching the software, it is enabled by default. If you pick the default standard profile, it is disabled.
Wrong, it's disabled by default (when I freshly install a driver at least).
Ah, fair. Has that changed, then? I'm pretty sure when it first came out, it was the default?

It was so long since I did a "fresh/clean" install of AMD drivers, as I always just installed on top with each new driver.
 
Ah, fair. Has that changed, then? I'm pretty sure when it first came out, it was the default?

It was so long since I did a "fresh/clean" install of AMD drivers, as I always just installed on top with each new driver.
It has never been on by default AFAIK.
 
Coming from years of AMD GPUs and being probably the biggest IQ snob there is (see all my posts on how **** LCD is and how superior OLED is ;) :p :D), I was expecting to be fully dissatisfied going to Nvidia, but I have not noticed much, if any, difference. There is a slight difference in gamma settings, I feel, but other than that, unless side by side and literally pixel peeping, I think what people notice is just placebo.

- Both my AMD and Nvidia cards were set to the correct RGB setting, etc.
- AMD has image sharpening enabled by default, which does help clarity, especially in games that use TAA; however, you can also enable sharpening with Nvidia and/or redux
I think the 'AMD has superior image quality' thing is just one of those things people have in their head based on out of date information. It was true for a long, long time that AMD looked much better out of the box, because Nvidia cards would run in limited RGB mode and there was no option in the control panel to change it. You used to have to download a third party application to force it into full range mode and not have everything look washed out. But Nvidia added the option to switch them around the Maxwell era, and every Nvidia card I've used in recent years has picked the proper full range option by default on a clean install. I've switched back and forth between AMD and Nvidia cards more times than I can count over the past few years and there's simply no difference in image quality (unless you get into godawful sharpening filters and such).
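For anyone curious why limited range looked washed out: the signal gets squeezed into the 16-235 "video range", so black is lifted and white is dimmed when the monitor expects full 0-255. A quick sketch of the standard mapping:

```python
# Standard full-range (0-255) to limited/"video" range (16-235) mapping.
def full_to_limited(value):
    return round(16 + value * (235 - 16) / 255)

for v in (0, 128, 255):
    print(f"full {v:3d} -> limited {full_to_limited(v):3d}")
# full   0 -> limited  16   (black lifted to dark grey -> washed out)
# full 128 -> limited 126
# full 255 -> limited 235   (white dimmed)
```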
 
I just sold my Radeon VII after buying an RTX 3070. In benchmarks I've got around 10% extra FPS and an extra £200+ in my pocket, and I play at 1440p ultra.

One of the games that me and my brother play through a lot is Shadow of the Tomb Raider, but I've noticed sometimes when playing, and especially when running the benchmark just to see if there was an FPS increase, that the textures of some of the buildings change, shadows randomly appear, and items pop up in the background, which I find extremely jarring every time it happens. Afterburner tells me it's constantly running at 7.5GB of VRAM.

Am I experiencing this because of the lack of VRAM?

Are there any recommended settings that I can change to try and ease the VRAM usage if this is the problem?

I did also try out RTX on ultra and high, but noticed no difference other than the framerate counter being halved, so I assume that any benefits of ray tracing aren't visible in this game.
Have you watched any of the many benchmark vids on YouTube for the 3070 in Shadow of the Tomb Raider to see if they too suffer these issues?
 
One of the games that me and my brother play through a lot is Shadow of the Tomb Raider, but I've noticed sometimes when playing, and especially when running the benchmark just to see if there was an FPS increase, that the textures of some of the buildings change, shadows randomly appear, and items pop up in the background, which I find extremely jarring every time it happens. Afterburner tells me it's constantly running at 7.5GB of VRAM.

Am I experiencing this because of the lack of VRAM?
Right, I just ran the SotTR benchmark for myself, and I assume you're talking about the pop-in of shadows and the level-of-detail changes on the houses as the camera swings down towards the village at the end? That's entirely normal: just (the failings of) the game engine's LoD system and draw distance at work. Presumably it wasn't really designed for sweeping aerial shots over such densely packed areas, since you'd be at ground level and Lara's PoV during any normal gameplay. Nothing to do with VRAM.

I didn't notice any pop-in anywhere else during the benchmark, and it was smooth as silk at 1440p with every setting maxed (beyond the highest preset), with RT on and DLSS off. VRAM usage peaked at ~7.4GB, which by the Afterburner indicator includes what Windows is using (and Chromium, since I couldn't be bothered to close it). Literally not one stutter during the entire run, which is what you'd be seeing if you were running out of VRAM.
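If you want to double-check what Afterburner reports, you can ask the driver directly. A minimal sketch using nvidia-smi (which ships with the Nvidia driver), assuming a single GPU:

```python
import subprocess

# Query used/total VRAM from the driver. Like Afterburner, this is the
# total across all processes, Windows and browsers included.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()
used_mib, total_mib = (int(x) for x in out.split(", "))
print(f"VRAM: {used_mib} MiB used of {total_mib} MiB "
      f"({used_mib / total_mib:.0%})")
```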

If you're seeing any pop-in or stuttering beyond that one brief point in the benchmark where the engine is being taxed beyond its limits, I'd perhaps look at your storage. Do you have the game installed on a hard drive? That can certainly lead to pop-in in some modern titles, as the engine can't stream the data in fast enough. I have it on an NVMe SSD, for the record.
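For what it's worth, the pop-in itself is just distance thresholds at work. A toy sketch of distance-based LoD selection, with made-up threshold values:

```python
# Toy LoD picker: as the camera sweep changes object distances, objects
# cross these thresholds and swap models -- seen on screen as pop-in.
LOD_THRESHOLDS = [(50.0, "high"), (150.0, "medium"), (400.0, "low")]
DRAW_DISTANCE = 800.0  # beyond this, objects aren't drawn at all

def select_lod(distance):
    if distance > DRAW_DISTANCE:
        return None                # culled -> "items pop up" as you approach
    for limit, level in LOD_THRESHOLDS:
        if distance <= limit:
            return level
    return "billboard"             # flat stand-in for distant objects

for d in (30, 120, 300, 600, 900):
    print(f"{d:3d} m -> {select_lod(d)}")
```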

 
I think the 'AMD has superior image quality' thing is just one of those things people have in their head based on out of date information. It was true for a long, long time that AMD looked much better out of the box, because Nvidia cards would run in limited RGB mode and there was no option in the control panel to change it. You used to have to download a third party application to force it into full range mode and not have everything look washed out. But Nvidia added the option to switch them around the Maxwell era, and every Nvidia card I've used in recent years has picked the proper full range option by default on a clean install. I've switched back and forth between AMD and Nvidia cards more times than I can count over the past few years and there's simply no difference in image quality (unless you get into godawful sharpening filters and such).
It's never been anything to do with the RGB settings, as I knew to set them.

I work on AMD and use Nvidia when I'm not working/gaming. I prefer AMD's default colour profile. It's not a case of 'AMD has superior image quality'; their default settings are more vibrant, IMO.
 
It's never been anything to do with the RGB settings, as I knew to set them.

I work on AMD and use Nvidia when I'm not working/gaming. I prefer AMD's default colour profile. It's not a case of 'AMD has superior image quality'; their default settings are more vibrant, IMO.
No, because both AMD and Nvidia's drivers use Windows' colour settings by default, so there's zero difference unless your monitor is doing something weird.
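You can verify that on Windows by asking GDI which colour profile is active for the screen; a Windows-only sketch via ctypes, using the GetICMProfileW call:

```python
import ctypes
from ctypes import wintypes

# Windows-only: ask GDI for the ICC profile attached to the screen.
# At default settings, this -- not the GPU driver -- defines the colours.
user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

hdc = user32.GetDC(None)            # device context for the whole screen
size = wintypes.DWORD(260)          # buffer length in characters
buf = ctypes.create_unicode_buffer(size.value)
if gdi32.GetICMProfileW(hdc, ctypes.byref(size), buf):
    print("Active colour profile:", buf.value)
user32.ReleaseDC(None, hdc)
```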

Or maybe you're picking (or picked) one of AMD's "profiles" when first installing the drivers, which tweaks a bunch of stuff automatically, applying different settings to things like videos (and always looks worse for it in my experience). In fact, it seemed to do that even when I didn't want it to on my 5700 XT, which caused major problems with video playback (specifically gamma levels) in Chromium-based browsers.

I'd also question whether "more vibrant" actually equals "better", since colour accuracy isn't a subjective thing, and I certainly wouldn't trust any preset generic profile to be doing anything good for it. "More vibrant" usually just means "more inaccurate" in 99.9% of cases. But then some people like that, and will also have their TVs set to an (IMO disgusting) 'Vivid' picture preset or equivalent, which throws colour accuracy out of the window in favour of artificial boosting, usually to make a TV look punchier on a shop floor under bright lights.

In any case, anything that's not default and using the same settings is obviously going to look different, but that's entirely personal preference. There's no image quality difference at equivalent settings, and that's a fact.
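To put a number on "more vibrant usually means more inaccurate", here's a toy sketch: boost saturation on an arbitrary reference colour and measure how far it drifts from the original:

```python
import colorsys

# A crude "vibrance" boost: scale saturation in HSV space, then measure
# how far the result has drifted from the reference colour.
def boost_saturation(rgb, factor):
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(s * factor, 1.0), v)

ref = (0.70, 0.45, 0.30)            # arbitrary reference colour, 0-1 range
vivid = boost_saturation(ref, 1.3)  # a 30% saturation boost
drift = max(abs(a - b) for a, b in zip(ref, vivid))
print(tuple(round(c, 3) for c in vivid), f"max channel drift: {drift:.3f}")
```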
 
No, because both AMD and Nvidia's drivers use Windows' colour settings by default, so there's zero difference unless your monitor is doing something weird.

Or maybe you're picking (or picked) one of AMD's "profiles" when first installing the drivers, which tweaks a bunch of stuff automatically, applying different settings to things like videos (and always looks worse for it in my experience). In fact, it seemed to do that even when I didn't want it to on my 5700 XT, which caused major problems with video playback (specifically gamma levels) in Chromium-based browsers.

In any case, anything that's not default and using the same settings is obviously going to look different, but that's entirely personal preference. There's no image quality difference at equivalent settings, and that's a fact.
Everything default.

Neither you nor anybody else is going to change my opinion that one looks more vibrant than the other; it's a fact.

Disagreeing on one's ability to notice :D 'vibrance' :D can only happen on the internet, lol.
 
Everything default.

Neither you nor anybody else is going to change my opinion that one looks more vibrant than the other; it's a fact.

Disagreeing on one's ability to notice :D 'vibrance' :D can only happen on the internet, lol.
No, I'm just letting you know about the objective fact that you definitely don't have everything at default if you're noticing a difference, unless your monitor is making changes independently. I'm sorry if you think it's an argument, but it's not; it's simply stating facts. There is no difference at default settings, because neither driver handles anything to do with colour at those default settings. Windows does (at least if you're using Windows 10). Beyond that, you're free to tweak as you wish to your liking, but please don't try and mislead others into thinking that means there's an image quality difference between AMD and Nvidia cards. There is not, and that's a fact.

Peace. :)
 
It's never been anything to do with the RGB settings, as I knew to set them.

I work on AMD and use Nvidia when I'm not working/gaming. I prefer AMD's default colour profile. It's not a case of 'AMD has superior image quality'; their default settings are more vibrant, IMO.

One thing which impacted the colours and is extremely confusing with AMD is their colour temperature control setting. I came across this myself just last week; you need to have the settings below:

https://community.amd.com/t5/driver...atic-or-6500k/m-p/92622/highlight/true#M17638

Longer explanation:
HDMI and DP carry video metadata so the GPU can tell the monitor which color space it's using. The options are limited to a few standard color spaces: sRGB, BT601, BT709, BT2020, P3, etc.
Fact 1: It just so happens that all these standard color spaces in PC space use the D65 white point (i.e. 6500K color temperature).

In 19.12.2, "Custom Color" disabled means that the color temperature matches the color temperature of the color space indicated in the video metadata. This is really how it's supposed to work: HDMI/DP use standard color spaces, the GPU tells the monitor which one it's using, and the monitor converts to its native color space.
When you enable "Custom Color", the default setting is Color Temperature Control enabled at 6500K. Due to Fact 1, this is the same as if you disabled "Custom Color" altogether. This is why it's not a bug; using 6500K is the right default.

With Custom Color enabled and Color Temperature Control disabled, the GPU does the conversion to the panel's native color space. That's really not obvious from the name; it used to be called something different.
The problem is that the monitor doesn't know this, and if it follows the HDMI/DP spec and honors the video metadata (this is what I meant by "well-behaved" in my earlier reply), it will convert from the color space in the video metadata to its native color space, without being aware that the GPU already did that. Basically, this setting will only work if the monitor ignores the HDMI/DP spec regarding color space info. Or maybe with DVI, where there's no metadata.
This is why in most cases "Color Temperature Control disabled" isn't right.
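As a quick check on Fact 1, McCamy's standard approximation converts CIE 1931 chromaticity to correlated colour temperature; plugging in D65's published chromaticity lands right on ~6500K:

```python
# McCamy's approximation: correlated colour temperature (CCT)
# from CIE 1931 (x, y) chromaticity coordinates.
def mccamy_cct(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

# D65, the white point shared by sRGB/BT709/BT2020
print(f"D65 -> {mccamy_cct(0.3127, 0.3290):.0f} K")  # ~6505 K
```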

 
I'm not sure where this image quality rumour comes from, but it's not my experience. I've been working in graphics for 25 years, and I've done a lot of colour-correct work over that time. The studios that deal with colour-perfect reproduction account for everything in the room: the light temperature; the wall, ceiling, and floor colours and reflectivity; the desk colour and finish; the chair colour; sometimes even the operator's uniform colour. They'll use full Adobe RGB 10-bit panels that are calibrated regularly. The Macs sometimes come with Nvidia GPUs, sometimes AMD, and it's been like that for years. Nobody cares what the GPU is because it's not a factor in the workflow at all, other than the performance. The last Mac I bought was a G5 in 2008, and it had a GTX 8800.

The GPU has never been a consideration at all for image quality or accuracy, and there has never been a time when Nvidia-powered workstations haven't been able to show the full range of colour on a 10-bit panel. If there is any difference, it's got nothing to do with the hardware.
 