
Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

It's ridiculous: it takes about 6 seconds to open the Nvidia Control Panel, and that's on a 5950X with a Gen 4 NVMe SSD (WD SN850). It's in desperate need of scrapping.

Yup, it really is dated. From looking around online it seems to date from around 2004/2005; they've just added options to it over the years. That's a daft amount of time to keep it essentially the same.
 
It might also be a problem that Ampere, like Vega, is a very wide architecture with many compute units, and it's tough to keep them all occupied with work.

However, that wouldn't explain the GPU utilisation problem in Valhalla because that happens on both my 1080 and my 3080.
Funny you should mention that, I get the same on my 3090. The more taxing the game becomes, the less GPU utilisation occurs; it can start off floating around 60-70% and drop all the way down to 30-40% when there are a lot of buildings, materials etc.
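A rough way to check whether those utilisation drops line up with the CPU side is to log GPU utilisation from nvidia-smi alongside per-core CPU load while playing. The sketch below is just that, a minimal logger assuming an Nvidia GPU with nvidia-smi on the PATH and the psutil package installed; the 80%/90% thresholds and the sampling interval are arbitrary choices, not anything from the video.

```python
# Rough bottleneck logger: samples GPU utilisation (via nvidia-smi) and
# per-core CPU load while a game runs. If GPU% drops while one core sits
# near 100%, the frame rate is likely CPU/driver-limited rather than GPU-limited.
# Assumes nvidia-smi is on PATH and psutil is installed (pip install psutil).
import subprocess
import time

import psutil


def gpu_utilisation() -> int:
    """Return current GPU utilisation in percent, as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])


def main(interval_s: float = 1.0) -> None:
    while True:
        gpu = gpu_utilisation()
        # cpu_percent blocks for interval_s and returns one value per logical core.
        cores = psutil.cpu_percent(interval=interval_s, percpu=True)
        busiest = max(cores)
        flag = "possible CPU bottleneck" if gpu < 80 and busiest > 90 else ""
        print(f"GPU {gpu:3d}%  busiest core {busiest:5.1f}%  {flag}")


if __name__ == "__main__":
    main()
```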
 
Yup, it really is dated. From looking around online it seems to date from around 2004/2005; they've just added options to it over the years. That's a daft amount of time to keep it essentially the same.
Would love to see an updated control panel, but I kinda see why they haven't innovated in that department... the "if it works, don't fix it" mentality.
 
Contradicts what W1zzard is saying:

https://www.techpowerup.com/review/gpu-test-system-update-march-2021/

After our first Big Navi reviews I realized that something was odd about the power consumption testing method I've been using for years without issue. It seemed the Radeon RX 6800 XT was just SO much more energy efficient than NVIDIA's RTX 3080. It definitely is more efficient because of the 7 nm process and AMD's monumental improvements in the architecture, but the lead just didn't look right. After further investigation, I realized that the RX 6800 XT was getting CPU bottlenecked in Metro: Last Light at even the higher resolutions, whereas the NVIDIA card ran without a bottleneck. This of course meant NVIDIA's card consumed more power in this test because it could run faster.
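W1zzard's point is essentially that raw board power is misleading when one card is CPU-capped: the bottlenecked card draws less power simply because it is doing less work per frame. A performance-per-watt figure normalises that out. The numbers below are invented purely to show the arithmetic; they are not measurements of any real card.

```python
# Illustrative only: every number here is made up to show the arithmetic,
# not a real measurement of the RX 6800 XT or RTX 3080.
def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Frames rendered per second, per watt of board power."""
    return avg_fps / avg_power_w


# Hypothetical scenario: card A hits a CPU bottleneck at 140 fps and its power
# draw falls to 230 W; card B is GPU-limited at 170 fps while drawing 320 W.
card_a = perf_per_watt(140, 230)   # about 0.61 fps/W
card_b = perf_per_watt(170, 320)   # about 0.53 fps/W

# Card B "consumes more power" only because it is allowed to run faster;
# comparing fps/W, or testing in a scene with no CPU limit, is the fairer test.
print(f"card A: {card_a:.2f} fps/W, card B: {card_b:.2f} fps/W")
```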
 
I remember Digital Foundry joking in one of their videos that AMD's DX11 driver was so bad it was like something from the '90s compared to Nvidia's modern, advanced driver. I wonder what they'll say about Nvidia's seemingly inferior DX12 driver. Let me guess: nothing.
 
I remember Digital Foundry joking in one of their videos that AMD's DX11 driver was so bad it was like something from the '90s compared to Nvidia's modern, advanced driver. I wonder what they'll say about Nvidia's seemingly inferior DX12 driver. Let me guess: nothing.

That raises an interesting question, doesn't it: is Nvidia's CPU overhead still just as heavy in DX11, and it only looks good by comparison with AMD's poor DX11 performance?

It's something you can really only compare relative to the competition, rather than quantify in isolation.
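One way to put a number on that relative comparison is the approach the video's testing implies: benchmark each GPU with a fast CPU and again with a slow CPU, then compare how much performance each card loses. The sketch below only does that arithmetic; the figures in the example dictionary are placeholders, not real benchmark results.

```python
# Sketch of a relative-overhead comparison: for each GPU, how much average fps
# is lost when moving from a fast CPU to a slow CPU in a CPU-limited test.
# The numbers below are placeholders; substitute your own benchmark results.
from typing import Dict, Tuple

# (avg fps with fast CPU, avg fps with slow CPU) per GPU - hypothetical values.
results: Dict[str, Tuple[float, float]] = {
    "GeForce example": (150.0, 90.0),
    "Radeon example": (150.0, 120.0),
}


def scaling_loss(fast_fps: float, slow_fps: float) -> float:
    """Fraction of performance lost when the CPU is downgraded."""
    return 1.0 - slow_fps / fast_fps


for gpu, (fast, slow) in results.items():
    loss = scaling_loss(fast, slow)
    print(f"{gpu}: loses {loss:.0%} of its fps on the slow CPU")

# A consistently larger loss on one vendor's cards, across several games and
# CPUs, is the signal people read as higher driver CPU overhead.
```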
 
Surely using the cards for their intended purpose, i.e. 4K ultra settings, makes the whole video pointless?

Who uses a 3090 at 1080p?
Future games that are more demanding on the CPU (given the new consoles) may expose this bottleneck further. Also, there are sometimes CPU-bound areas in games even on the highest-end CPUs.
 
There's absolutely no doubt that the 3000 series experiences heavy CPU bottlenecking on anything below a latest-gen i7/Ryzen 7; I've been banging that drum when people ask for system quotes since these GPUs launched. The thing is, in my mind, if it were specifically a driver issue, why doesn't the 2080Ti suffer the same behaviour? That suggests something architecture/bandwidth specific to me.

There's also an argument that if you're spending a quarter of what you spent on your graphics card on your CPU, is it really the GPU's or the driver's fault?

Whilst I've not yet been able to explain the precise reason for the bottlenecking, I'm not sold on it being a driver issue per se. So far I know that DLSS isn't the issue, and overall performance is heavily related to single-core performance, as is apparent from the only minute differences in performance between a 5800X/5900X and a 5950X (extra cores don't seem to help, which is the opposite of what I'd expect if it were an overhead issue).

I might be wrong but that's my view.
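One crude way to test the single-core-versus-extra-cores hunch is to run the same in-game benchmark with the game pinned to progressively fewer cores and see whether average fps actually changes. The sketch below uses psutil to set the affinity of an already-running process; the "game.exe" name and the core counts are placeholders, and you would read the fps from the game's own benchmark or an overlay.

```python
# Sketch: pin a running game to N logical cores to see whether extra cores
# actually help, or whether one fast core is what matters. Requires psutil
# (pip install psutil) and usually an elevated prompt on Windows.
# "game.exe" and the core counts are placeholders, not a specific title.
import sys

import psutil


def pin_to_cores(process_name: str, num_cores: int) -> None:
    """Restrict every process matching process_name to the first N logical cores."""
    cores = list(range(num_cores))
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == process_name.lower():
            proc.cpu_affinity(cores)
            print(f"Pinned PID {proc.pid} to cores {cores}")


if __name__ == "__main__":
    # Example usage: pin the game, run its built-in benchmark, note the average
    # fps, then repeat with a different core count and compare.
    pin_to_cores("game.exe", int(sys.argv[1]) if len(sys.argv) > 1 else 4)
```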
 
There's absolutely no doubt that the 3000 series experiences heavy CPU bottlenecking on anything below a latest-gen i7/Ryzen 7; I've been banging that drum when people ask for system quotes since these GPUs launched. The thing is, in my mind, if it were specifically a driver issue, why doesn't the 2080Ti suffer the same behaviour? That suggests something architecture/bandwidth specific to me.

The 2080Ti did exhibit the same behaviour in the video

"8:54 - is ampere to blame" timestamp
 
Don't have time to watch the video at the moment. Did they try with a clean driver, without GeForce Experience and the telemetry rubbish?
 
The 2080Ti did exhibit the same behaviour in the video

"8:54 - is ampere to blame" timestamp
Interesting, can't say we ever saw that, but we rarely paired a 2080Ti with a first-gen Ryzen 5. Again though, the performance of the i3 suggests I'm right about single-core vs multithreaded performance, and that screams to me that it might not be drivers.
 
It's ridiculous: it takes about 6 seconds to open the Nvidia Control Panel, and that's on a 5950X with a Gen 4 NVMe SSD (WD SN850). It's in desperate need of scrapping.

Leave it on the Desktop Colour Settings page and it opens instantly. How often do you need to open it anyhow?

Instead of all these "Sky is Falling Threads" from "Experts" I just play my games on my 3090...
 