
For anyone worried about older gear, i.e. x16 PCIe 2.0, PCIe 3.0 or PCIe 4.0 with Navi/3000-series cards

Bit premature for this; wait until GPUs can directly access the NVMe drives for texture streaming, then we'll have relevant benchmarks.
 
Yes, but remember that older CPUs will introduce a bottleneck as well; although the PCIe 3.0 results seem to show a negligible difference from 4.0, the bottleneck introduced by an older CPU will have a more pronounced impact on performance.
 
PCIe bandwidth seems to matter more at 1080p than at 4K, and at 1080p with the 3080 you're starting to see more CPU bottlenecks.

Hardware Unboxed did similar testing and found that even the 5700 XT can benefit from PCIe 4.0 over PCIe 3.0 at 1080p in some games.
 
Which is weird because you'd expect exactly the opposite, i.e. at 4K there are more textures and more information going over PCIe 4.0 to and from the GPU... strange...

Either way, I think that in general people worrying about older tech and running 4K have nothing to worry about; I think my Xeon has got plenty of life in it for the next few years lol
 

I think it needs more thorough testing. Horizon Zero Dawn was showing bigger differences at 1080p vs 4K over at Hardware Unboxed, but they didn't test at different graphical settings to see how much of that is frame-rate related vs something else.
 
I’m kinda limited to 3.0 x8 with my SSD setup and was worried the 3080 would bottleneck, but looking at this I’ll lose a few % at most, is that right?
 
[Chart: average FPS at 3840x2160 (average-fps_3840-2160.png)]


Make of that what you will, but yeah, possibly. It's really not making that much difference at the minute, and every generation people start crying out that it's going to affect things and it never pans out. I'm not sure at what point a GPU will be using more than, say, 3.5GB per second of throughput over PCIe 3.0 x16? Maybe I'm just oversimplifying, but I think people with older machines, as said earlier, with older CPUs like mine may not get THE BEST, but I'm certainly not worried; reckon I can get another 5 years out of this rig with a new GPU, no problem lol!
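For reference, a quick back-of-the-envelope sketch of what the headline PCIe link rates work out to, using the standard per-lane transfer rates and encodings and ignoring protocol overhead (the 3.5 GB/s figure is just the example load mentioned above):

```python
# Back-of-the-envelope PCIe link bandwidth, before protocol overhead.
# Gen 1/2 use 8b/10b encoding, gen 3/4 use 128b/130b.
GT_PER_S = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}                   # giga-transfers/s per lane
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130}  # payload bits per raw bit

def link_gbs(gen: int, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return GT_PER_S[gen] * ENCODING[gen] * lanes / 8           # /8 converts bits to bytes

print(f"PCIe 3.0 x16: {link_gbs(3, 16):.2f} GB/s")                        # ~15.75 GB/s
print(f"PCIe 4.0 x16: {link_gbs(4, 16):.2f} GB/s")                        # ~31.51 GB/s
print(f"3.5 GB/s of streaming ~ {3.5 / link_gbs(3, 16):.0%} of 3.0 x16")  # ~22%
```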
 

Unfortunately they don’t have 3.0 x8 on the graph, but my understanding is that it’s similar to 2.0 x16?

In which case it’s a tad down, but nowt to lose sleep over?
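On paper that equivalence holds up; a minimal sketch, assuming the standard per-lane rates and ignoring protocol overhead, puts PCIe 2.0 x16 and PCIe 3.0 x8 at roughly the same ~8 GB/s:

```python
# Rough check that PCIe 3.0 x8 and PCIe 2.0 x16 land in the same ballpark
# (theoretical rates, ignoring protocol overhead).
pcie2_per_lane = 5.0 * (8 / 10) / 8       # 0.50 GB/s per lane (8b/10b encoding)
pcie3_per_lane = 8.0 * (128 / 130) / 8    # ~0.985 GB/s per lane (128b/130b encoding)

print(f"PCIe 2.0 x16: {pcie2_per_lane * 16:.2f} GB/s")   # ~8.0 GB/s
print(f"PCIe 3.0 x8:  {pcie3_per_lane * 8:.2f} GB/s")    # ~7.9 GB/s
```

So the 2.0 x16 results on the chart should be a reasonable stand-in for a 3.0 x8 slot.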
 
Bit premature for this; wait until GPUs can directly access the NVMe drives for texture streaming, then we'll have relevant benchmarks.

Pretty much this IMO. When the GPU can directly access the NVMe, that removes the CPU bottleneck and will require a fair chunk of PCIe bandwidth, although even then, how close are they to saturating an x16 slot? You're only going to get limited gains from the lower latency of PCIe 4.0 vs 3.0 if they're not bandwidth limited.

It's almost comical seeing YouTubers decreeing that PCIe generation doesn't matter, which is true TODAY... but none of them really try to measure bandwidth to see what RTX IO might require, and therefore whether PCIe 4.0 is a definite future advantage.
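To put a rough number on that, here is a sketch built on hypothetical figures: the 7 GB/s drive speed is an assumption (a fast PCIe 4.0 NVMe), and it assumes decompression happens on the GPU so only the compressed stream crosses the bus:

```python
# Hypothetical numbers to gauge whether GPU texture streaming could saturate a slot.
# Assumptions (not measurements): a 7 GB/s PCIe 4.0 NVMe drive and GPU-side
# decompression, so only the compressed stream travels over PCIe.
nvme_read_gbs = 7.0      # assumed top-end sequential read of the SSD
pcie3_x16 = 15.75        # theoretical GB/s, one direction
pcie4_x16 = 31.5

print(f"Streaming at {nvme_read_gbs} GB/s uses "
      f"~{nvme_read_gbs / pcie3_x16:.0%} of PCIe 3.0 x16 and "
      f"~{nvme_read_gbs / pcie4_x16:.0%} of PCIe 4.0 x16")
```

Even in that scenario there is headroom left on a 3.0 x16 link, though far less than on 4.0 x16.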
 
PCIe bandwidth seems to matter more at 1080p than at 4K, and at 1080p with the 3080 you're starting to see more CPU bottlenecks.
Hardware Unboxed did similar testing and found that even the 5700 XT can benefit from PCIe 4.0 over PCIe 3.0 at 1080p in some games.

Which is weird because you'd expect exactly the opposite, i.e. at 4K there are more textures and more information going over PCIe 4.0 to and from the GPU... strange...
TechPowerUp talk about it in their conclusion. You get much higher FPS at 1080p, so more data is moving per second, and the textures are the same at 1080p and 4K as long as the quality settings are the same.
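A tiny sketch of that reasoning: the per-frame transfer size below is purely illustrative (not a measurement), but it shows how the frame rate multiplies whatever does cross the bus each frame:

```python
# Why higher FPS at 1080p can stress the bus more than 4K: whatever the engine
# uploads per frame gets multiplied by the frame rate. The per-frame figure
# below is purely illustrative, not a measurement.
per_frame_mb = 30    # hypothetical data uploaded over PCIe each frame

for label, fps in [("4K @ ~70 fps", 70), ("1080p @ ~160 fps", 160)]:
    print(f"{label}: {per_frame_mb * fps / 1000:.1f} GB/s over the bus")
```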
 
Hmmm, you could also try overclocking the PCIe bus, something I haven't even bothered doing but which is usually a feature in the BIOS. You might get an extra 1-2% performance!

Honestly, going from PCIe 3.0 to 4.0 is even less useful than faster RAM.
 