
NVIDIA ‘Ampere’ 8nm Graphics Cards

500W 2080 Ti vs 500W 3080.

Performance is pretty even in games until you switch DLSS on, and then the 3080 is 10-20% faster.

Just goes to show most of the performance gain from the 3000 series is just brute-force extra power.


He's doing a 1kW 3090 review later :eek::D
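
As an aside on those power figures: the limit a card will actually accept is capped by its vBIOS, and NVML will report both the current limit and the range the board allows. A minimal sketch, assuming the pynvml package (pip install nvidia-ml-py) and an NVIDIA driver:

Code:
# Sketch: read each GPU's power limit and the range its vBIOS allows.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(h)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        cur = pynvml.nvmlDeviceGetPowerManagementLimit(h)                # mW
        lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(h)  # mW
        print(f"GPU {i} ({name}): limit {cur / 1000:.0f} W, "
              f"board allows {lo / 1000:.0f}-{hi / 1000:.0f} W")
finally:
    pynvml.nvmlShutdown()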
 
Guys, if you want to see something that truly makes you question your faith in people's ability to act rationally, then check out this video of people queueing, and SLEEPING, outside a store for 3 DAMN DAYS in order to get a 3090... and not everyone even got one! :eek:


The. Mind. Boggles. :(


Yeah, that certainly did not help... but again this was Nvidia's own fault for trying to bully TSMC. Their arrogance and hubris (or rather, Jensen's) has now grown to the point where it has negatively impacted their business. They need to reduce his power and presence, because it's a prime example of one man and his personality wielding a level of power and decision-making within a company that should not be granted to any one person.
As I said, I know people who will do this. I'm one of them; if I could, I'd do it. It's not like there's anything else going on in the COVID world.
 
SLI always works at x8/x8 (at least with PCIe 3.0)... that being said, you would have to look up that motherboard's manual to check whether the x8 slot is SLI-capable.

But as things stand right now, you've got to worry about two things:

1. Nvidia has noted that the 3090 will not have implicit SLI support. Now, we don't know what that means: if it's just missing profiles, then there's the trusty Nvidia Inspector to fall back on... but if it's fully disabled, then no more SLI in DX11 or earlier games. (I have yet to play Witcher 3, that's how huge my backlog is.)

2. Right now only the 4-slot bridge is available. I don't have 4-slot spacing on my motherboard, so I am chilling out :D... but it's definitely going to be faster than the 2080 Ti setup; provided it works, it's a no-brainer.

Edit: it seems some motherboards can do x16/x16 as well with some additional onboard chip... so, again, the manual it is :) (There's also a quick way to check the negotiated link width from software, sketched below.)
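
For the slot question, one way to see what each card has actually negotiated (rather than what the manual promises) is to query the PCIe link via NVML. A minimal sketch, again assuming the pynvml package; note that cards often drop to a narrower link at idle, so check under load:

Code:
# Sketch: print each GPU's negotiated PCIe generation and link width.
# Assumes nvidia-ml-py (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(h)
        width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(h)
        max_w = pynvml.nvmlDeviceGetMaxPcieLinkWidth(h)
        print(f"GPU {i}: PCIe gen {gen} x{width} (card max x{max_w})")
finally:
    pynvml.nvmlShutdown()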

I do believe that, at worst, I will be able to force SLI by using Nvidia Inspector.

My motherboard can do x16/x16 and I’m currently using 2 2080 Tis at x16/x16.

However, I can only run 2 3090s at x16/x8, and the question is whether performance will be gimped.
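
For a sense of what x16/x8 costs in raw bandwidth: the 3090 is a PCIe 4.0 card, and a 4.0 x8 link has roughly the same one-way bandwidth as a 3.0 x16 link. A back-of-envelope sketch using the spec transfer rates and 128b/130b encoding (theoretical numbers, not measured throughput):

Code:
# Theoretical one-way PCIe bandwidth from spec transfer rates.
GT_PER_S = {3: 8, 4: 16}  # per-lane transfer rate for gen 3 / gen 4
ENCODING = 128 / 130      # usable fraction after 128b/130b line coding

def gb_per_s(gen, lanes):
    return GT_PER_S[gen] * ENCODING * lanes / 8

for gen, lanes in [(3, 8), (3, 16), (4, 8), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: {gb_per_s(gen, lanes):.2f} GB/s")
# PCIe 4.0 x8 ~= PCIe 3.0 x16 (~15.75 GB/s), so an x8 slot hurts far
# less on a PCIe 4.0 platform than on a 3.0 one.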
 
I do believe that, at worst, I will be able to force SLI by using Nvidia Inspector.

My motherboard can do x16/x16 and I’m currently using 2 2080 Tis at x16/x16.

However, I can only run 2 3090s at x16/x8, and the question is whether performance will be gimped.
What will your specific use cases be, only gaming? If so, then the overall gains from 2x 3090s are going to be, proportionally speaking, insanely low unless you are actually gaming at 8K, where the maximum possible load is on the GPUs.
 
What will your specific use cases be, only gaming? If so, then the overall gains from 2x 3090s are going to be, proportionally speaking, insanely low unless you are actually gaming at 8K, where the maximum possible load is on the GPUs.

If he can get SLI working in games, he will get a solid 120 fps in every game at 4K ultra. Maybe that's what he is after.
 
Yeah, that certainly did not help... but again this was Nvidia's own fault for trying to bully TSMC. Their arrogance and hubris (or rather, Jensen's) has now grown to the point where it has negatively impacted their business. They need to reduce his power and presence, because it's a prime example of one man and his personality wielding a level of power and decision-making within a company that should not be granted to any one person.
Yes, while it's tempting to display Schadenfreude and think Jensen and the Nvidia marketeers, spin-doctors, and product pricers deserve it, there are obviously thousands of other employees who have to suffer the big ego.
Can't see many engineers at ARM being happy to be bought by Nvidia.
 
What will your specific use cases be, only gaming? If so, then the overall gains from 2x 3090s are going to be, proportionally speaking, insanely low unless you are actually gaming at 8K, where the maximum possible load is on the GPUs.

My goal is to play any game out there maxed out at 4K with FPS never dipping lower than 60.

A single 3090 can’t achieve this goal. RDR2 is a good example.
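
Worth spelling out how strict that target is: "never below 60" is a worst-frame budget, not an average. A quick bit of arithmetic:

Code:
# A "never below N fps" floor means EVERY frame must complete within
# 1000/N milliseconds; a 60 fps average with occasional spikes fails it.
def frame_budget_ms(fps_floor):
    return 1000.0 / fps_floor

for floor in (30, 60, 120):
    print(f"{floor} fps floor -> {frame_budget_ms(floor):.2f} ms, every frame")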
 
My goal is to play any game out there maxed out at 4K with FPS never dipping lower than 60.

A single 3090 can’t achieve this goal. RDR2 is a good example.

A single watercooled 3090 with a 500W BIOS can, though, in RDR2 ;)

I had assumed you were trying to get every game to 120 fps at 4K.
 
A single watercooled 3090 with a 500W BIOS can, though, in RDR2 ;)

I had assumed you were trying to get every game to 120 fps at 4K.

Show me footage of RDR2 maxed out at 4K in the Annesburg part of the map never dipping under 60 FPS and I’ll believe you.

Other examples where a single 3090 is not going to cut it:

SOTTR
Metro Exodus
Tarkov
GTA V
 
Nothing is going to run Tarkov well because it's an unoptimised turd. Hardly worth using as any form of benchmark for performance.
 
the question is whether performance will be gimped.

It shouldn't matter when the NVLink is being actively used; in any case you would end up with better performance after the upgrade.
I believe the best path would be to take the 4-slot option at launch, and later, when the Quadros are released in Jan/Feb, you can swap the 4-slot for the 3-slot Quadro NVLink bridge and revert to x16/x16.

Here's some data [via Gamers Nexus]:

[Image: pcie-benchmark-firestrike-ultra.png — Gamers Nexus PCIe scaling results, Fire Strike Ultra]
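
If the bridge question ever needs settling in software, NVML can report whether NVLink links are actually up. A minimal sketch, assuming the pynvml package; boards without NVLink raise an error on the link query, which is treated here as "no links":

Code:
# Sketch: count active NVLink links per GPU via NVML.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        h = pynvml.nvmlDeviceGetHandleByIndex(i)
        active = 0
        for link in range(pynvml.NVML_NVLINK_MAX_LINKS):
            try:
                state = pynvml.nvmlDeviceGetNvLinkState(h, link)
            except pynvml.NVMLError:
                break  # link index not present on this board
            if state == pynvml.NVML_FEATURE_ENABLED:
                active += 1
        print(f"GPU {i}: {active} NVLink link(s) active")
finally:
    pynvml.nvmlShutdown()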
 
500W 2080 Ti vs 500W 3080.

Performance is pretty even in games until you switch DLSS on, and then the 3080 is 10-20% faster.

Just goes to show most of the performance gain from the 3000 series is just brute-force extra power.


He's doing a 1kW 3090 review later :eek::D

https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3080-review?page=6

Code:
Control: 1440p, DX12, High, High RT, TAA (fps)

   2080     35
   2080Ti   46
   3080     70

Metro Exodus: 1440p, DX12, Ultra, Ultra RT, TAA (fps)

   2080     51
   2080Ti   66
   3080     90

Quake 2: 1440p, RTX, Vulkan, Max Settings (fps)

   2080     31
   2080Ti   41
   3080     62

This is before they optimise for the 30 series. I'm sure I read recently that Wolfenstein: Youngblood is getting an async upgrade just for the 30 series.
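
For what it's worth, the relative gains in those quoted averages work out as follows (straight division of the numbers above, nothing more):

Code:
# Relative speedups computed from the Digital Foundry averages quoted above.
results = {
    "Control":      {"2080": 35, "2080Ti": 46, "3080": 70},
    "Metro Exodus": {"2080": 51, "2080Ti": 66, "3080": 90},
    "Quake 2 RTX":  {"2080": 31, "2080Ti": 41, "3080": 62},
}
for game, fps in results.items():
    vs_ti = fps["3080"] / fps["2080Ti"] - 1
    vs_80 = fps["3080"] / fps["2080"] - 1
    print(f"{game}: 3080 +{vs_ti:.0%} vs 2080 Ti, +{vs_80:.0%} vs 2080")
# Roughly +36-52% over the 2080 Ti and +76-100% over the 2080 in these
# RT-heavy titles.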
 
It shouldn't matter when the NVLink is being actively used; in any case you would end up with better performance after the upgrade.
I believe the best path would be to take the 4-slot option at launch, and later, when the Quadros are released in Jan/Feb, you can swap the 4-slot for the 3-slot Quadro NVLink bridge and revert to x16/x16.

Here's some data [via Gamers Nexus]:

[Image: pcie-benchmark-firestrike-ultra.png — Gamers Nexus PCIe scaling results, Fire Strike Ultra]

I think that is a good suggestion.

Is there a date for the release of the Quadro 3-slot NVLink bridge?

Also, if I run the 3090s in the first and third slot, they will be very close to each other. Do you think thermal throttling could be an issue?
 
Is there a date for the release of the Quadro 3-slot NVLink bridge?

Also, if I run the 3090s in the first and third slot, they will be very close to each other. Do you think thermal throttling could be an issue?

Yes, thermals are going to be a big issue if you play with VSync disabled... I generally force half refresh rate in the Nvidia Control Panel while playing games.
Regarding the Quadro bridge, that's just speculation, but it shouldn't matter a lot; you could just as well skip it, unless you are talking about games like AotS that don't use NVLink... I only suggested it in case you want to claim an additional 0.5 FPS improvement for competitive benchmarking purposes.
Overall, I believe the x16/x8 setup looks better, holistically.

Edit: Do keep us posted on how that goes. I will have to wait until the 3-slot bridge becomes available anyhow, either through AIBs or after the Quadros are released.
 
Yes, thermals are going to be a big issue if you play with VSync disabled... I generally force half refresh rate in the Nvidia Control Panel while playing games.
Regarding the Quadro bridge, that's just speculation, but it shouldn't matter a lot; you could just as well skip it, unless you are talking about games like AotS that don't use NVLink... I only suggested it in case you want to claim an additional 0.5 FPS improvement for competitive benchmarking purposes.
Overall, I believe the x16/x8 setup looks better, holistically.

Ok, thank you for your feedback.

I always play single-player/non-competitive games with VSync on, so hopefully thermals shouldn't be a problem.

I will initially go for the x16/x8 configuration. If I'm not happy, I'll go for the Quadro NVLink bridge next year. Perhaps companies like EVGA or MSI will come up with a 3-slot bridge before 2021.
 
https://www.eurogamer.net/articles/digitalfoundry-2020-nvidia-geforce-rtx-3080-review?page=6

Code:
Control: 1440p, DX12, High, High RT, TAA (fps)

   2080     35
   2080Ti   46
   3080     70

Metro Exodus: 1440p, DX12, Ultra, Ultra RT, TAA (fps)

   2080     51
   2080Ti   66
   3080     90

Quake 2: 1440p, RTX, Vulkan, Max Settings (fps)

   2080     31
   2080Ti   41
   3080     62

This is before they optimise for the 30 series. I'm sure I read recently that Wolfenstein: Youngblood is getting an async upgrade just for the 30 series.

I am looking at the 4K numbers, and in the review I linked to, the 3080 gives you an 11 fps, or 13%, increase over the 2080 Ti when they are both drawing the same wattage.

And Control is the best gain for the 3080 out of all the games he tested.
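
As a quick sanity check of those two figures, an 11 fps gain that is also a 13% gain implies a power-matched 2080 Ti baseline of roughly 85 fps in that test:

Code:
# Implied baseline from "11 fps = 13% faster", taken from the post above.
gain_fps, gain_frac = 11, 0.13
baseline = gain_fps / gain_frac
print(f"2080 Ti ~{baseline:.0f} fps, 3080 ~{baseline + gain_fps:.0f} fps")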
 